feat: adding support for other LLM providers with litellm #14

Open
wants to merge 5 commits into master

Conversation

@krrishdholakia commented Oct 4, 2023

Hi @ErikBjare,

Noticed you're calling GPT-4 + local models via llama cpp. Wanted to see if we could help with LiteLLM (https://github.com/BerriAI/litellm).

Replaced the openai chatcompletions call with litellm.completion.

This should enable gptme to support Anthropic, Bedrock, AI21, TogetherAI, Azure, etc.
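Roughly, the swap looks like the sketch below (illustrative only, not the exact diff in this PR; the model names are placeholders):

```python
# Sketch of the swap (not the exact diff in this PR).
# Before: calling OpenAI's chat completions endpoint directly (openai < 1.0 style).
import openai

response = openai.ChatCompletion.create(
    model="gpt-4",
    messages=[{"role": "user", "content": "Hello"}],
)

# After: routing through litellm, which keeps the OpenAI-style
# request/response shape but can dispatch to other providers
# (Anthropic, Bedrock, AI21, TogetherAI, Azure, ...).
from litellm import completion

response = completion(
    model="claude-2",  # placeholder; any provider/model litellm supports
    messages=[{"role": "user", "content": "Hello"}],
)

print(response.choices[0].message.content)
```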

Curious - I know you've made a PR into LiteLLM before. What was litellm missing to be useful to you? Any feedback here would be helpful.

@ErikBjare (Owner)

Hi @krrishdholakia!

> I know you've made a PR into LiteLLM before

The PR I made was just a thing I saw missing, I'm not actually using litellm myself! (congratulations on the popularity!)

> What was litellm missing to be useful to you?

To be completely honest: I just don't see why?

I try to avoid extra dependencies whenever possible. When looking at your pyproject.toml, there are a few things there I don't like/want. It seems to me that litellm just wraps openai, and I don't care about anything except OpenAI-compatible APIs.

I guess it's just not for me right now. Maybe in the future.

I wish you the best and continued success! 🚀

P.S. when checking out your profile I found https://github.com/BerriAI/liteLLM-proxy which looks great! Definitely going to try that out.

@ErikBjare ErikBjare closed this Oct 10, 2023
@krrishdholakia (Author)

Hey @ErikBjare

Thanks for the feedback.

> When looking at your pyproject.toml, there are a few things there I don't like/want.

What did you not like/want? Feedback here is great!

> I found https://github.com/BerriAI/liteLLM-proxy which looks great! Definitely going to try that out.

Interesting - what problem does it solve for you?

@ErikBjare ErikBjare mentioned this pull request Jan 19, 2024
@ErikBjare ErikBjare reopened this Jan 20, 2024
@ErikBjare ErikBjare changed the title adding support for new models feat: adding support for other LLM providers with litellm Jan 20, 2024
@ErikBjare (Owner) commented Jan 20, 2024

Stumbled into a bunch of issues, which led me to create BerriAI/litellm#1537

In the process, I learned that Azure OpenAI is directly supported by openai-python, so I might not need litellm after all.

So I decided to open a separate PR with just the openai v1.0 changes and go from there: #65
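For reference, a minimal sketch of what Azure OpenAI looks like through the openai v1 client (the endpoint, API version, and deployment name below are placeholders, not values from this repo):

```python
# Minimal sketch: Azure OpenAI via the openai v1 client, no litellm needed.
# Endpoint, API version, and deployment name are placeholders.
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint="https://my-resource.openai.azure.com",
    api_version="2023-12-01-preview",
    api_key="...",
)

response = client.chat.completions.create(
    model="my-gpt4-deployment",  # the Azure deployment name, not the model id
    messages=[{"role": "user", "content": "Hello"}],
)
print(response.choices[0].message.content)
```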
