LiteLLM Example Configs #1038
Replies: 9 comments 14 replies
-
ahhhh yes ty!
-
this is amazing @justinh-rahb - if you have any feedback for how we can improve our own docs for this, let me know - https://docs.litellm.ai/docs/
-
No, in v0.1.115 (the latest version) open-webui still cannot get any Claude 3 model working through LiteLLM. Unless... you upgrade LiteLLM to the latest version, v1.34.12:
Replace litellm==1.30.7 with litellm==1.34.12 in the ./backend/requirements.txt file.
Then use the locally built Docker image ghcr.io/open-webui/open-webui:latest. The Claude 3 models added via LiteLLM will then work.
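The version bump described above can be applied as a small scripted edit. This is only a sketch: the function name is hypothetical, and the version pins are taken from the comment above.

```python
def bump_litellm_pin(requirements_text: str,
                     old: str = "litellm==1.30.7",
                     new: str = "litellm==1.34.12") -> str:
    """Return requirements.txt content with the LiteLLM pin upgraded.

    Sketch of the edit described above; write the result back to
    ./backend/requirements.txt before rebuilding the local image.
    """
    if old not in requirements_text:
        raise ValueError(f"expected pin {old!r} not found")
    return requirements_text.replace(old, new)

# Example on a two-line requirements file:
print(bump_litellm_pin("fastapi==0.109.2\nlitellm==1.30.7\n"))
```

After patching the file, rebuild the image so the upgraded dependency is baked in.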
-
After deleting the local open-webui images and restarting with ghcr.io/open-webui/open-webui:main, I can confirm that the newly added Claude 3 models run correctly.
Your response and reminder are greatly appreciated. Thank you, @justinh-rahb
-
Thanks a lot for the information you shared. I'm having difficulty configuring Open WebUI's access to LiteLLM.
-
Hi, thanks for the support! But I'm not clear on how to set it up in Open WebUI. Is there a step-by-step tutorial somewhere? :)
-
How do I get the LiteLLM API key? What is the LiteLLM API base URL? Are we supposed to install LiteLLM before adding values here? How do I install it and connect it with Open WebUI? What value should I give for RPM?
-
Dear all,
If I understand correctly, Open WebUI comes with its own LiteLLM built in.
Perhaps they could eventually allow referencing an external LiteLLM, which would be better practice.
François
-
Just wanted to dump some configs for common endpoints for future reference.
Note
Other than for official OpenAI endpoints, LiteLLM usually requires that a provider be specified before the model string in the Add a model field (i.e. in provider/model-string form). Model Name can be whatever you want it to appear as in your list. API Base URL typically only needs to be set for OpenAI-compatible APIs and Azure.
Warning
Gemini Endpoint:
ONLY Makersuite/AI Studio API keys are supported; VertexAI/GCP endpoints and authentication methods are NOT currently supported.
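The provider-prefix convention from the note above can be sketched as follows. The helper name is hypothetical; the provider prefixes shown (anthropic, groq, gemini) follow LiteLLM's provider/model form.

```python
def litellm_model_string(provider: str, model: str) -> str:
    """Build the 'provider/model' string entered in the Add a model field."""
    return f"{provider}/{model}"

# A few model strings from the sections below, in prefixed form:
print(litellm_model_string("anthropic", "claude-3-opus-20240229"))  # anthropic/claude-3-opus-20240229
print(litellm_model_string("groq", "mixtral-8x7b-32768"))           # groq/mixtral-8x7b-32768
print(litellm_model_string("gemini", "gemini-pro"))                 # gemini/gemini-pro
```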
Anthropic
Model strings:
claude-2
claude-2.1
claude-instant-1.2
claude-3-sonnet-20240229
claude-3-opus-20240229
claude-3-haiku-20240307
Note
Anthropic's API requires that the max_tokens parameter be sent in the payload; the maximum accepted value is 4096.
Claude 2.1
Claude 3 "Sonnet"
Claude 3 "Opus"
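As a sketch of the max_tokens requirement noted above, a request payload for one of these models might look like the following. The prompt and model choice are placeholders for illustration.

```python
# Illustrative payload only; the message content is a placeholder.
payload = {
    "model": "anthropic/claude-3-sonnet-20240229",  # provider-prefixed model string
    "messages": [{"role": "user", "content": "Hello!"}],
    "max_tokens": 4096,  # required by Anthropic's API; 4096 is the maximum accepted
}
print(payload["max_tokens"])
```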
Groq
Model strings:
mixtral-8x7b-32768
llama2-70b-4096
Mixtral 8x7B
Llama2 70B
Google Gemini
Model strings:
gemini-pro
Gemini Pro
Mistral
Model strings:
open-mistral-7b (aka mistral-tiny-2312)
open-mixtral-8x7b (aka mistral-small-2312)
mistral-small-latest (aka mistral-small-2402)
mistral-medium-latest (aka mistral-medium-2312)
mistral-large-latest (aka mistral-large-2402)
Open Mixtral 8x7B (formerly mistral-small)
Mistral Medium
Mistral Large
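The alias pairs listed above can be captured in a small lookup table, which is handy when deciding which string to enter. A sketch only; the dated aliases come from the list above.

```python
# Maps each current Mistral model string to its dated alias, per the list above.
MISTRAL_ALIASES = {
    "open-mistral-7b": "mistral-tiny-2312",
    "open-mixtral-8x7b": "mistral-small-2312",
    "mistral-small-latest": "mistral-small-2402",
    "mistral-medium-latest": "mistral-medium-2312",
    "mistral-large-latest": "mistral-large-2402",
}
print(MISTRAL_ALIASES["mistral-large-latest"])  # mistral-large-2402
```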
Azure OpenAI
Model strings:
gpt35turbo
gpt4
GPT 3.5 Turbo
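For Azure, the model string is your deployment name, and (per the note at the top) the API Base URL must also be set. A sketch of the two values involved; the resource URL below is a placeholder, not a real endpoint.

```python
# Placeholder values for illustration only.
azure_model = {
    "model": "azure/gpt35turbo",  # deployment name with LiteLLM's azure provider prefix
    "api_base": "https://example-resource.openai.azure.com",  # placeholder Azure resource URL
}
print(azure_model["model"])
```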
Caution
OpenAI and Ollama Endpoints
Unless the Connections > Ollama or Connections > OpenAI methods are not working for your use case, the following methods are NOT recommended:
OpenAI
Model strings:
gpt-3.5-turbo
gpt-4
gpt-4-turbo-preview
gpt-4-vision-preview
GPT 4 Turbo
"OpenAI-compatible" endpoints
Ollama
Llama2