Unable to run chatbot-rag-app using Azure OpenAI model #213
Comments
The error that you get is this:
This would suggest that the configuration values that you provided are incorrect. I have just re-tested Azure to confirm that it works. These are my settings:
Do you want to try with these? I'm not sure if the version and URL are current or out of date, but you can play with the settings after you get it to work.
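The maintainer's actual settings aren't preserved in this thread. As a hedged sketch only, the Azure path of chatbot-rag-app has typically been configured with environment variables of roughly this shape; the variable names and api-version value here are assumptions, so verify them against the README in your checkout of the repo:

```ini
; Hypothetical .env fragment for the Azure OpenAI backend.
; Names and values are placeholders -- check the chatbot-rag-app README.
LLM_TYPE=azure
OPENAI_API_KEY=your-azure-openai-key
OPENAI_VERSION=2023-05-15                       ; Azure OpenAI api-version
OPENAI_BASE_URL=https://your-resource.openai.azure.com
OPENAI_ENGINE=your-deployment-name              ; deployment name, not model name
```

A common failure mode with Azure OpenAI is supplying the underlying model name (e.g. gpt-35-turbo) where the *deployment* name is expected, so it is worth double-checking that value first.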
Closing this for now. Feel free to re-open if you're still unable to resolve it :)
OK, looking to reopen this: I've had some time to re-attempt this, and I'm still unable to get the Azure OpenAI mode to work. Here are my .env LLM env vars: LLM_TYPE=azure. Here are the model resource details from the azure-cli:
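The azure-cli output isn't preserved above. For anyone reproducing this, the deployments on an Azure OpenAI account can be listed with something like the following; the resource group and account names are placeholders:

```shell
# Sketch: confirm the Azure OpenAI deployment name with the azure-cli.
# RESOURCE_GROUP and ACCOUNT_NAME are hypothetical -- substitute your own.
RESOURCE_GROUP="my-openai-rg"
ACCOUNT_NAME="my-openai-account"

# "az cognitiveservices account deployment list" prints each deployment;
# the deployment *name* (not the underlying model name) is the value the
# app must be pointed at.
CMD="az cognitiveservices account deployment list -g ${RESOURCE_GROUP} -n ${ACCOUNT_NAME} -o table"
echo "${CMD}"
# Requires `az login` first; uncomment to actually run it:
# ${CMD}
```

Comparing that output against the engine/deployment value in .env is a quick way to catch a mismatch.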
Hey
I attempted to run the chatbot-rag-app using an Azure OpenAI model and found that when I entered a question in the prompt box in the GUI, the service never produced a response.
I was able to successfully run docker run --rm --env-file .env chatbot-rag-app flask create-index to index the sample documents into my Elasticsearch cluster, so all of the ES configuration values are good.
I assumed I was passing a bad var value in the .env file, so I:
Attached are a redacted copy of my .env file (see above for the additional variations I tried) and the docker logs from a container instance where this behavior occurred.
redacted_dot_env_file.txt
azureopenai_error_logs.txt
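One way to isolate a failure like this is to call the Azure OpenAI REST endpoint directly, bypassing the app entirely: if a bare request succeeds, the problem is in the app's configuration mapping rather than the credentials. The standard-library sketch below builds the documented Azure chat-completions URL; the resource name, deployment name, api-version, and key are placeholders that must come from your own .env:

```python
# Standalone sanity check of an Azure OpenAI deployment, independent of
# chatbot-rag-app. All concrete values below are placeholders.
import json
import urllib.request


def azure_chat_url(base_url: str, deployment: str, api_version: str) -> str:
    """Build the Azure OpenAI chat-completions URL for a deployment."""
    return (f"{base_url.rstrip('/')}/openai/deployments/{deployment}"
            f"/chat/completions?api-version={api_version}")


def probe(base_url: str, deployment: str, api_version: str, api_key: str) -> dict:
    """POST a one-message chat request; an HTTPError here (401, 404, ...)
    points directly at a bad key, bad deployment name, or bad api-version."""
    req = urllib.request.Request(
        azure_chat_url(base_url, deployment, api_version),
        data=json.dumps(
            {"messages": [{"role": "user", "content": "ping"}]}
        ).encode(),
        headers={"api-key": api_key, "Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req, timeout=30) as resp:
        return json.load(resp)


if __name__ == "__main__":
    # Placeholder values -- substitute the ones from your .env file.
    print(azure_chat_url("https://my-resource.openai.azure.com",
                         "my-gpt-deployment", "2023-05-15"))
```

A 404 from this probe usually means the deployment name or api-version is wrong, while a 401 points at the key, which narrows the search before touching the app's own settings.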