Is your feature request related to a problem? Please describe.
When I try to create a model file for a model that is stored not on the Ollama server but on the LiteLLM server in my Open WebUI configuration, the Open WebUI interface tries to download it from ollama.com and fails with:

pull model manifest: Get "https://ollama.com/token?nonce=PVU8K_tursrhgMstjAboFw&scope=repository%3Alibrary%2Fmixtral-8x7b%3Apull&service=ollama.com&ts=1714223332": EOF

I can confirm the LiteLLM model is accessible through Open WebUI; in my case the chain is litellm > vllm > mixtral 8x7b.
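For context, this is roughly the kind of model file I am trying to create. It is only a minimal sketch: the system prompt and parameter values are illustrative, and "mixtral-8x7b" is the name the model is exposed under by my LiteLLM server, not a model pulled into Ollama.

```
# Base model: served by LiteLLM (proxying vLLM), not present in the local Ollama store
FROM mixtral-8x7b

# Illustrative customisations, not the point of this request
SYSTEM "You are a helpful assistant."
PARAMETER temperature 0.7
```

Because "mixtral-8x7b" is not a local Ollama model, the FROM line makes Open WebUI ask Ollama to pull library/mixtral-8x7b from ollama.com, which is what produces the EOF error above.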
Describe the solution you'd like
It should be possible to create a model file for a model accessible through the LiteLLM server.
Duplicate #665