
Supported LLM: Azure OpenAI? #52

Open
jagadishdoki opened this issue Dec 15, 2023 · 3 comments

Comments

@jagadishdoki

You have indicated that the ChatGPT-Next-Web project was used as a starter template for this project. Can you please confirm whether LlamaIndex Chat supports Azure OpenAI?

If yes, please provide instructions for switching to Azure OpenAI.
If no, will this be treated as a feature enhancement? Is there a quick way to make the switch to Azure OpenAI?

Content of the `.env.development.local` file:

```
# Your OpenAI API key. (required)
OPENAI_API_KEY=sk-xxxx
```

@marcusschiesser
Collaborator

Chat LlamaIndex can use any LLM that is supported by LlamaIndexTS; you just have to plug it in here:

```ts
const llm = new OpenAI({
  model: config.model,
  temperature: config.temperature,
  topP: config.topP,
  maxTokens: config.maxTokens,
});
```

@destinychanger

> Chat LlamaIndex can use any LLM that is supported by LlamaIndexTS; you just have to plug it in here:
>
> ```ts
> const llm = new OpenAI({
>   model: config.model,
>   temperature: config.temperature,
>   topP: config.topP,
>   maxTokens: config.maxTokens,
> });
> ```

Hi,
Below is the config I have added, as asked:

```ts
azure: {
  apiKey: "xxxxxxxxx",
  endpoint: "https://azureopenaillamaindex.openai.azure.com/openai/deployments/gpt-35-turbo/chat/completions?api-version=2023-07-01-preview",
  apiVersion: "2023-07-01-preview",
  deploymentName: "gpt-35-turbo",
}
```

And I am getting the below error:

```
[LlamaIndex] BadRequestError: 400 Unsupported data type
    at APIError.generate (C:\LAAMA\chat-llamaindex\node_modules\openai\error.js:43:20)
    at AzureOpenAI.makeStatusError (C:\LAAMA\chat-llamaindex\node_modules\openai\core.js:252:33)
    at AzureOpenAI.makeRequest (C:\LAAMA\chat-llamaindex\node_modules\openai\core.js:293:30)
    at process.processTicksAndRejections (node:internal/process/task_queues:95:5) {
  status: 400,
  headers: {
    'apim-request-id': 'df10be42-1577-481f-b65e-d1abb4f43bc9',
    'content-length': '22',
    'content-type': 'text/plain; charset=utf-8',
    date: 'Tue, 19 Dec 2023 07:14:23 GMT',
    'ms-azureml-model-error-reason': 'model_error',
    'ms-azureml-model-error-statuscode': '400',
    'strict-transport-security': 'max-age=31536000; includeSubDomains; preload',
    'x-content-type-options': 'nosniff',
    'x-ms-client-request-id': 'df10be42-1577-481f-b65e-d1abb4f43bc9',
    'x-ms-region': 'East US',
    'x-ratelimit-remaining-requests': '119',
    'x-ratelimit-remaining-tokens': '119984',
    'x-request-id': '23112d7f-5f0f-4d28-8673-a45c99c69cd3'
  },
  error: undefined,
  code: undefined,
  param: undefined,
  type: undefined
}
```

And on the UI it shows: llama error

Can you please help me figure out what the issue might be?

@marcusschiesser
Collaborator

Sorry, we currently don't have an Azure example. I would start from this example, https://github.com/run-llama/LlamaIndexTS/blob/main/examples/openai.ts, and modify the parameters.
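
For anyone hitting the same 400 above: one likely cause is that `endpoint` was set to the full chat-completions URL (including `/openai/deployments/...` and the `api-version` query string). The Azure client builds that path itself from the deployment name and API version, so `endpoint` should normally be just the resource's base URL. A minimal sketch of what the config might look like — the exact shape of the `azure` option depends on your LlamaIndexTS version, so treat the field names here as assumptions to verify against the library's type definitions:

```ts
import { OpenAI } from "llamaindex";

// Sketch only: field names assume LlamaIndexTS's Azure-style config block.
const llm = new OpenAI({
  model: "gpt-35-turbo",
  azure: {
    apiKey: process.env.AZURE_OPENAI_KEY,
    // Base resource URL only — no /openai/deployments/... path, no query string.
    // "<your-resource>" is a placeholder for your Azure OpenAI resource name.
    endpoint: "https://<your-resource>.openai.azure.com",
    apiVersion: "2023-07-01-preview",
    deploymentName: "gpt-35-turbo",
  },
});
```

If the library still rejects the request, comparing the generated request URL against the Azure OpenAI REST reference for your `api-version` is a quick way to spot a path mismatch.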
