LiteLLM

Latest version: v1.52.14


1.47.1

What's Changed
* [Feat] Add fireworks AI embedding by ishaan-jaff in https://github.com/BerriAI/litellm/pull/5812
* Correct casing by superpoussin22 in https://github.com/BerriAI/litellm/pull/5817
* Fixed DeepSeek input and output tokens by Columpio in https://github.com/BerriAI/litellm/pull/5718
* [Feat] Allow setting `supports_vision` for Custom OpenAI endpoints + Added testing by ishaan-jaff in https://github.com/BerriAI/litellm/pull/5821
* [Feat] Add testing for prometheus failure metrics by ishaan-jaff in https://github.com/BerriAI/litellm/pull/5823
* [Feat] Prometheus - show status code and class type on prometheus by ishaan-jaff in https://github.com/BerriAI/litellm/pull/5806
* [Feat] Allow setting custom arize endpoint by ishaan-jaff in https://github.com/BerriAI/litellm/pull/5709
* [SSO-UI] Set new sso users as internal_view role users by ishaan-jaff in https://github.com/BerriAI/litellm/pull/5824
* Fix premium user check on key creation by ishaan-jaff in https://github.com/BerriAI/litellm/pull/5826
* [fix-sso] Allow internal user viewer to view usage routes by ishaan-jaff in https://github.com/BerriAI/litellm/pull/5825
* [Fix] virtual key auth checks on vertex ai pass through endpoints by ishaan-jaff in https://github.com/BerriAI/litellm/pull/5827
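PR #5821 above lets you mark a custom OpenAI-compatible endpoint as vision-capable. A hedged sketch of what that could look like in a proxy `config.yaml` — the model alias, base URL, and key below are placeholders, and the exact field location should be verified against the LiteLLM docs for your version:

```yaml
model_list:
  - model_name: my-custom-vision-model        # placeholder alias
    litellm_params:
      model: openai/my-backend-model          # custom OpenAI-compatible route
      api_base: https://example.com/v1        # placeholder endpoint
      api_key: os.environ/CUSTOM_API_KEY
    model_info:
      supports_vision: True                   # flag added in #5821
```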

New Contributors
* superpoussin22 made their first contribution in https://github.com/BerriAI/litellm/pull/5817
* Columpio made their first contribution in https://github.com/BerriAI/litellm/pull/5718

**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.47.0...v1.47.1



Docker Run LiteLLM Proxy


```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.47.1
```



Don't want to maintain your internal proxy? Get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat
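Once the container is up, the proxy accepts OpenAI-style chat-completions requests on port 4000. A minimal sketch of building such a request with only the standard library — the model name and virtual key are placeholders, not values from these release notes:

```python
import json
import urllib.request

def build_chat_request(model: str, prompt: str, api_key: str) -> urllib.request.Request:
    """Build (but do not send) an OpenAI-style /chat/completions POST for the local proxy."""
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode()
    return urllib.request.Request(
        "http://localhost:4000/chat/completions",
        data=body,
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_chat_request("gpt-3.5-turbo", "hello", "sk-1234")
# urllib.request.urlopen(req)  # uncomment to send against a running proxy
```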

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 140.0 | 176.44 | 6.40 | 0.0 | 1914 | 0 | 117.88 | 2631.67 |
| Aggregated | Passed ✅ | 140.0 | 176.44 | 6.40 | 0.0 | 1914 | 0 | 117.88 | 2631.67 |

1.47.0

🚨 DB Schema Update
This release updates the `LiteLLM_VerificationTokenTable` in your DB, adding a `blocked` column to enable blocking/unblocking virtual keys - https://github.com/BerriAI/litellm/issues/5328

https://github.com/BerriAI/litellm/blob/4069942dd8c0e73dcd015eeb271ef6753148dfa7/schema.prisma#L142
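The new `blocked` column backs key blocking on the proxy's management API. A minimal sketch of what a block request could look like, assuming the proxy exposes a `/key/block` endpoint (verify against your LiteLLM version) and using placeholder URL and admin key:

```python
import json
import urllib.request

PROXY_URL = "http://localhost:4000"   # local LiteLLM proxy (assumption)
ADMIN_KEY = "sk-1234"                 # placeholder master key

def build_block_request(virtual_key: str) -> urllib.request.Request:
    """Build (but do not send) a POST asking the proxy to block a virtual key."""
    payload = json.dumps({"key": virtual_key}).encode()
    return urllib.request.Request(
        f"{PROXY_URL}/key/block",
        data=payload,
        headers={
            "Authorization": f"Bearer {ADMIN_KEY}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_block_request("sk-virtual-key-to-block")
# urllib.request.urlopen(req)  # uncomment to send against a running proxy
```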

What's Changed
* ui fix correct team not loading by ishaan-jaff in https://github.com/BerriAI/litellm/pull/5804
* LiteLLM Minor Fixes & Improvements (09/19/2024) by krrishdholakia in https://github.com/BerriAI/litellm/pull/5793
* [Fix] Tag Based Routing not working with wildcard routing by ishaan-jaff in https://github.com/BerriAI/litellm/pull/5805
* [Fix] log update_db statement in .debug() mode by ishaan-jaff in https://github.com/BerriAI/litellm/pull/5810
* [Feat-Proxy] Allow using custom sso handler by ishaan-jaff in https://github.com/BerriAI/litellm/pull/5809
* refactor: cleanup root of repo by krrishdholakia in https://github.com/BerriAI/litellm/pull/5813
* LiteLLM Minor Fixes & Improvements (09/20/2024) by krrishdholakia in https://github.com/BerriAI/litellm/pull/5807


**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.46.8...v1.47.0



Docker Run LiteLLM Proxy


```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.47.0
```




Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 88.0 | 100.04 | 6.37 | 0.0 | 1907 | 0 | 70.29 | 929.65 |
| Aggregated | Passed ✅ | 88.0 | 100.04 | 6.37 | 0.0 | 1907 | 0 | 70.29 | 929.65 |

1.46.8

🚨 Known issue on Proxy Admin UI - all teams do not load. Fixed here: https://github.com/BerriAI/litellm/pull/5804

What's Changed
* [Feat] Add proxy level prometheus metrics by ishaan-jaff in https://github.com/BerriAI/litellm/pull/5789
* [Proxy - User Management]: If user assigned to a team don't show Default Team by ishaan-jaff in https://github.com/BerriAI/litellm/pull/5791
* [Feat] Add Error Handling for /key/list endpoint by ishaan-jaff in https://github.com/BerriAI/litellm/pull/5787
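The proxy-level Prometheus metrics in #5789 are emitted through the proxy's callback system. A sketch of enabling them in `config.yaml` — the field names reflect the LiteLLM docs as I understand them and should be verified for your version (note #5769 above makes Prometheus metrics an enterprise feature):

```yaml
litellm_settings:
  callbacks: ["prometheus"]   # expose metrics on the proxy's /metrics endpoint
```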


**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.46.7...v1.46.8



Docker Run LiteLLM Proxy


```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.46.8
```




Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 110.0 | 127.02 | 6.39 | 0.0 | 1912 | 0 | 87.00 | 2803.94 |
| Aggregated | Passed ✅ | 110.0 | 127.02 | 6.39 | 0.0 | 1912 | 0 | 87.00 | 2803.94 |

1.46.7

What's Changed
* feat(prometheus_api.py): support querying prometheus metrics for all-up + key-level spend on UI by krrishdholakia in https://github.com/BerriAI/litellm/pull/5782
* [Fix-Bedrock] use Bedrock converse for `"meta.llama3-8b-instruct-v1:0", "meta.llama3-70b-instruct-v1:0"` by ishaan-jaff in https://github.com/BerriAI/litellm/pull/5729
* [Feat] add Groq gemma2 9b pricing by ishaan-jaff in https://github.com/BerriAI/litellm/pull/5788
* LiteLLM Minor Fixes & Improvements (09/18/2024) by krrishdholakia in https://github.com/BerriAI/litellm/pull/5772
* [Feat] Add Azure gpt-35-turbo-0301 pricing by ishaan-jaff in https://github.com/BerriAI/litellm/pull/5790
* test: replace gpt-3.5-turbo-0613 (deprecated model) by krrishdholakia in https://github.com/BerriAI/litellm/pull/5794
* [Chore-Docs] fix curl on /get team info swagger by ishaan-jaff in https://github.com/BerriAI/litellm/pull/5792


**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.46.6...v1.46.7



Docker Run LiteLLM Proxy


```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.46.7
```




Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 150.0 | 168.91 | 6.33 | 0.0 | 1893 | 0 | 116.58 | 1552.00 |
| Aggregated | Passed ✅ | 150.0 | 168.91 | 6.33 | 0.0 | 1893 | 0 | 116.58 | 1552.00 |

1.46.6

What's Changed
* [Feat - GCS Bucket Logger] Use StandardLoggingPayload by ishaan-jaff in https://github.com/BerriAI/litellm/pull/5771
* [Prometheus] track requested model by ishaan-jaff in https://github.com/BerriAI/litellm/pull/5774
* [Feat-Proxy] Add Azure Assistants API - Create Assistant, Delete Assistant Support by ishaan-jaff in https://github.com/BerriAI/litellm/pull/5777
* [Chore LiteLLM Proxy] enforce prometheus metrics as enterprise feature by ishaan-jaff in https://github.com/BerriAI/litellm/pull/5769
* [Chore-Proxy] enforce jwt auth as enterprise feature by ishaan-jaff in https://github.com/BerriAI/litellm/pull/5770


**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.46.5...v1.46.6



Docker Run LiteLLM Proxy


```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.46.6
```




Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 150.0 | 170.55 | 6.40 | 0.0 | 1915 | 0 | 115.49 | 1217.03 |
| Aggregated | Passed ✅ | 150.0 | 170.55 | 6.40 | 0.0 | 1915 | 0 | 115.49 | 1217.03 |

1.46.5

What's Changed
* LiteLLM Minor Fixes & Improvements (09/17/2024) by krrishdholakia in https://github.com/BerriAI/litellm/pull/5742
* Additional Fixes (09/17/2024) by krrishdholakia in https://github.com/BerriAI/litellm/pull/5759


**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.46.4...v1.46.5



Docker Run LiteLLM Proxy


```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.46.5
```




Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 120.0 | 144.02 | 6.41 | 0.0 | 1919 | 0 | 89.37 | 4350.86 |
| Aggregated | Passed ✅ | 120.0 | 144.02 | 6.41 | 0.0 | 1919 | 0 | 89.37 | 4350.86 |
