Releases: BerriAI/litellm

v1.39.4

30 May 15:48

What's Changed

  • fix - UI submit chat on enter by @ishaan-jaff in #3916
  • Revert "Revert "fix: Log errors in Traceloop Integration (reverts previous revert)"" by @nirga in #3909

Full Changelog: v1.39.3...v1.39.4

Docker Run LiteLLM Proxy

docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.39.4
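
Once the container is up, the proxy serves an OpenAI-compatible API on port 4000. A minimal smoke test (the model name and the sk-1234 key are placeholders for whatever you have configured):

# send one chat completion through the proxy
curl http://localhost:4000/chat/completions \
  -H 'Content-Type: application/json' \
  -H 'Authorization: Bearer sk-1234' \
  -d '{"model": "gpt-3.5-turbo", "messages": [{"role": "user", "content": "Hello"}]}'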

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
|------|--------|---------------------------|----------------------------|------------|------------|---------------|---------------|------------------------|------------------------|
| /chat/completions | Passed ✅ | 120.0 | 135.98662418243552 | 6.404889633803229 | 0.0 | 1913 | 0 | 97.80563699996492 | 1663.1231360000243 |
| Aggregated | Passed ✅ | 120.0 | 135.98662418243552 | 6.404889633803229 | 0.0 | 1913 | 0 | 97.80563699996492 | 1663.1231360000243 |

v1.39.3

30 May 04:26

What's Changed

New Contributors

Full Changelog: v1.39.2...v1.39.3

Docker Run LiteLLM Proxy

docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.39.3

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
|------|--------|---------------------------|----------------------------|------------|------------|---------------|---------------|------------------------|------------------------|
| /chat/completions | Passed ✅ | 110.0 | 133.96143579083153 | 6.347194412767075 | 0.0 | 1898 | 0 | 91.88108999995848 | 1459.6432470000025 |
| Aggregated | Passed ✅ | 110.0 | 133.96143579083153 | 6.347194412767075 | 0.0 | 1898 | 0 | 91.88108999995848 | 1459.6432470000025 |

v1.39.2

29 May 06:53

What's Changed


Full Changelog: v1.38.12...v1.39.2

Docker Run LiteLLM Proxy

docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.39.2

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
|------|--------|---------------------------|----------------------------|------------|------------|---------------|---------------|------------------------|------------------------|
| /chat/completions | Passed ✅ | 72 | 83.46968387564114 | 6.529958043991633 | 0.0 | 1954 | 0 | 61.38368400002037 | 678.4462749999989 |
| Aggregated | Passed ✅ | 72 | 83.46968387564114 | 6.529958043991633 | 0.0 | 1954 | 0 | 61.38368400002037 | 678.4462749999989 |

v1.38.12

28 May 15:54

What's Changed

Full Changelog: v1.38.11...v1.38.12

Docker Run LiteLLM Proxy

docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.38.12

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
|------|--------|---------------------------|----------------------------|------------|------------|---------------|---------------|------------------------|------------------------|
| /chat/completions | Passed ✅ | 76 | 91.16258395147193 | 6.473952425752436 | 0.0 | 1937 | 0 | 62.406538999994154 | 1772.6057410000067 |
| Aggregated | Passed ✅ | 76 | 91.16258395147193 | 6.473952425752436 | 0.0 | 1937 | 0 | 62.406538999994154 | 1772.6057410000067 |

v1.38.11

28 May 03:25

💵 LiteLLM v1.38.11: Proxy 100+ LLMs and set budgets for your customers https://docs.litellm.ai/docs/proxy/users#set-rate-limits

✨ NEW /customer/update and /customer/delete endpoints (see the curl sketch below) https://docs.litellm.ai/docs/proxy/users#set-rate-limits

📝 [Feat] Email alerting is now free tier (example setup below): https://docs.litellm.ai/docs/proxy/email

🚀 [Feat] Show supported OpenAI params on the LiteLLM UI model hub

✨ [Feat] Show "Created At" and "Created By" on the Models page
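
A hedged sketch of the customer budget flow using the new endpoints (the sk-1234 key, customer ID, and budget amounts are placeholders; field names are best-effort per the proxy docs linked above):

# create a customer with a spend budget
curl -X POST 'http://localhost:4000/customer/new' \
  -H 'Authorization: Bearer sk-1234' \
  -H 'Content-Type: application/json' \
  -d '{"user_id": "customer-1", "max_budget": 50.0}'

# raise that budget later with the new update endpoint
curl -X POST 'http://localhost:4000/customer/update' \
  -H 'Authorization: Bearer sk-1234' \
  -H 'Content-Type: application/json' \
  -d '{"user_id": "customer-1", "max_budget": 100.0}'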

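For the email alerting setup, this is the rough shape of the configuration per the docs linked above (a sketch only; exact variable names may differ by version, and all values here are placeholders):

# SMTP credentials the proxy uses to send alert emails
export SMTP_HOST="smtp.example.com"
export SMTP_PORT="587"
export SMTP_USERNAME="alerts@example.com"
export SMTP_PASSWORD="my-smtp-password"
export SMTP_SENDER_EMAIL="alerts@example.com"
# then enable the email alert type in the proxy's config.yaml:
#   general_settings:
#     alerting: ["email"]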

What's Changed

New Contributors

Full Changelog: v1.38.10...v1.38.11

Docker Run LiteLLM Proxy

docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.38.11

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
|------|--------|---------------------------|----------------------------|------------|------------|---------------|---------------|------------------------|------------------------|
| /chat/completions | Passed ✅ | 94 | 113.13091035154665 | 6.485092627447978 | 0.0 | 1940 | 0 | 80.4994959999874 | 735.4111310000064 |
| Aggregated | Passed ✅ | 94 | 113.13091035154665 | 6.485092627447978 | 0.0 | 1940 | 0 | 80.4994959999874 | 735.4111310000064 |

v1.38.10

26 May 22:48

What's Changed

Full Changelog: v1.38.8...v1.38.10

Docker Run LiteLLM Proxy

docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.38.10

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
|------|--------|---------------------------|----------------------------|------------|------------|---------------|---------------|------------------------|------------------------|
| /chat/completions | Passed ✅ | 130.0 | 152.41971991092666 | 6.452763997233594 | 0.0 | 1931 | 0 | 108.63601500000186 | 1150.9651800000142 |
| Aggregated | Passed ✅ | 130.0 | 152.41971991092666 | 6.452763997233594 | 0.0 | 1931 | 0 | 108.63601500000186 | 1150.9651800000142 |

v1.38.8-stable

26 May 07:03

Full Changelog: v1.38.8...v1.38.8-stable

Docker Run LiteLLM Proxy

docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.38.8-stable

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
|------|--------|---------------------------|----------------------------|------------|------------|---------------|---------------|------------------------|------------------------|
| /chat/completions | Passed ✅ | 73 | 85.31436445193742 | 6.640342227407584 | 0.0 | 1987 | 0 | 61.23339800001304 | 1299.6820050000224 |
| Aggregated | Passed ✅ | 73 | 85.31436445193742 | 6.640342227407584 | 0.0 | 1987 | 0 | 61.23339800001304 | 1299.6820050000224 |

v1.38.8

26 May 05:49

What's Changed

Full Changelog: v1.38.7...v1.38.8

Docker Run LiteLLM Proxy

docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.38.8

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
|------|--------|---------------------------|----------------------------|------------|------------|---------------|---------------|------------------------|------------------------|
| /chat/completions | Passed ✅ | 74 | 86.12069489644486 | 6.487708071155493 | 0.0 | 1941 | 0 | 62.97004400005335 | 733.9951239999891 |
| Aggregated | Passed ✅ | 74 | 86.12069489644486 | 6.487708071155493 | 0.0 | 1941 | 0 | 62.97004400005335 | 733.9951239999891 |

v1.38.7-stable

26 May 03:29

What's Changed

Full Changelog: v1.38.5...v1.38.7-stable

Docker Run LiteLLM Proxy

docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.38.7-stable

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
|------|--------|---------------------------|----------------------------|------------|------------|---------------|---------------|------------------------|------------------------|
| /chat/completions | Passed ✅ | 96 | 117.54187770999512 | 6.456232729004693 | 0.0 | 1931 | 0 | 80.74312700000519 | 802.6662359999932 |
| Aggregated | Passed ✅ | 96 | 117.54187770999512 | 6.456232729004693 | 0.0 | 1931 | 0 | 80.74312700000519 | 802.6662359999932 |

v1.38.7

26 May 01:44

😇 LiteLLM v1.38.7 - New Activity tab: track LLM API requests & total tokens 👉 Start here: https://github.com/BerriAI/litellm

🔥 [Fix] Set budget_duration on /team/new and /team/update (curl sketch below)

🔥 [Feat] Support for resetting team budgets at budget_reset_at https://docs.litellm.ai/docs/proxy/users

⚒️ [Feature] Attach the litellm exception type to the error string, e.g. ContentPolicyViolation, AuthenticationError

📧 [Docs] Setting up email notifications https://docs.litellm.ai/docs/proxy/email

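A hedged curl sketch of those team budget fields (per the proxy docs linked above; the key, team alias, and amounts are placeholders). When budget_duration elapses, spend is reset at budget_reset_at:

# create a team whose $100 budget resets every 30 days
curl -X POST 'http://localhost:4000/team/new' \
  -H 'Authorization: Bearer sk-1234' \
  -H 'Content-Type: application/json' \
  -d '{"team_alias": "my-team", "max_budget": 100.0, "budget_duration": "30d"}'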

What's Changed

Full Changelog: v1.38.5...v1.38.7

Docker Run LiteLLM Proxy

docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.38.7

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
|------|--------|---------------------------|----------------------------|------------|------------|---------------|---------------|------------------------|------------------------|
| /chat/completions | Passed ✅ | 110.0 | 127.76486134384693 | 6.465849619551454 | 0.0 | 1934 | 0 | 97.91651000000456 | 1353.8686059999918 |
| Aggregated | Passed ✅ | 110.0 | 127.76486134384693 | 6.465849619551454 | 0.0 | 1934 | 0 | 97.91651000000456 | 1353.8686059999918 |