Unable to load open AI models in my docker container [BUG] #2358

Closed
Aaryaveerkrishna23 opened this issue May 8, 2024 · 1 comment · Fixed by #2416

Labels
setup Setup related issues
Comments

@Aaryaveerkrishna23
Even after setting my HTTP and HTTPS proxies in the .env file for my Docker container, I am unable to load OpenAI models into Flowise's UI in the OpenAI ChatModel section.
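
For reference, a minimal docker-compose sketch of how proxy settings might be passed into the container; the service name, proxy URL, and port mapping below are placeholders, not the reporter's actual configuration:

```yaml
# Minimal sketch: pass proxy settings into the Flowise container via environment variables.
# The proxy URL and port mapping are placeholders.
services:
  flowise:
    image: flowiseai/flowise
    ports:
      - "3000:3000"
    environment:
      - HTTP_PROXY=http://proxy.example.com:3128
      - HTTPS_PROXY=http://proxy.example.com:3128
      - NO_PROXY=localhost,127.0.0.1
```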

@HenryHengZJ
Contributor

Flowise fetches the models list from here.

You can set an environment variable to specify where the models file is fetched from - https://docs.flowiseai.com/configuration/environment-variables#models
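
For reference, the docs linked above describe a MODEL_LIST_CONFIG_JSON variable (at the time of writing) that points Flowise at a local models.json instead of fetching it over the network. A minimal sketch, assuming the file is mounted into the container; the paths below are placeholders:

```yaml
# Sketch: point Flowise at a locally mounted models.json so no outbound fetch is needed.
# MODEL_LIST_CONFIG_JSON is documented at the link above; the container path is a placeholder.
services:
  flowise:
    image: flowiseai/flowise
    environment:
      - MODEL_LIST_CONFIG_JSON=/root/.flowise/models.json
    volumes:
      - ./models.json:/root/.flowise/models.json
```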

@HenryHengZJ HenryHengZJ added the setup Setup related issues label May 13, 2024
@HenryHengZJ HenryHengZJ linked a pull request May 15, 2024 that will close this issue