LiteLLM

Latest version: v1.52.14


1.38.8

What's Changed
* feat(slack_alerting.py): enable provider-region based alerting by krrishdholakia in https://github.com/BerriAI/litellm/pull/3844


**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.38.7...v1.38.8



Docker Run LiteLLM Proxy


```shell
docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.38.8
```
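
The proxy exposes an OpenAI-compatible API on port 4000, so any OpenAI client can be pointed at it. Below is a minimal sketch of calling the container started above with the `openai` Python SDK; the model name and the `sk-1234` key are placeholders for whatever is configured on your proxy.

```python
# Minimal sketch: call the LiteLLM proxy started by the docker command above.
# Assumptions: the proxy listens on localhost:4000, "sk-1234" is a valid proxy
# key, and "gpt-3.5-turbo" is a model configured on the proxy.
import openai

client = openai.OpenAI(
    base_url="http://localhost:4000",  # the LiteLLM proxy, not api.openai.com
    api_key="sk-1234",                 # placeholder proxy key
)

response = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Hello from the LiteLLM proxy!"}],
)
print(response.choices[0].message.content)
```

The same `/chat/completions` route is what the load test results below exercise.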



Don't want to maintain your internal proxy? get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 74 | 86.12069489644486 | 6.487708071155493 | 0.0 | 1941 | 0 | 62.97004400005335 | 733.9951239999891 |
| Aggregated | Passed ✅ | 74 | 86.12069489644486 | 6.487708071155493 | 0.0 | 1941 | 0 | 62.97004400005335 | 733.9951239999891 |

v1.38.7-stable
What's Changed
* [Feat] - Admin UI - New Activity Tab by ishaan-jaff in https://github.com/BerriAI/litellm/pull/3836
* [Feat] Ui Enforce premium features on ui by ishaan-jaff in https://github.com/BerriAI/litellm/pull/3840
* fix(proxy_server.py): fix model check for `/v1/models` + `/model/info` endpoint when team has restricted access by krrishdholakia in https://github.com/BerriAI/litellm/pull/3839
* [Fix] Set budget_duration on `/team/new` and `/team/update` by ishaan-jaff in https://github.com/BerriAI/litellm/pull/3842
* [Feat] Reset Team Budgets on `budget_reset_at` by ishaan-jaff in https://github.com/BerriAI/litellm/pull/3843
* [Feature]: Attach litellm exception in error string by ishaan-jaff in https://github.com/BerriAI/litellm/pull/3824
* docs- email notifs by ishaan-jaff in https://github.com/BerriAI/litellm/pull/3845


**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.38.5...v1.38.7-stable



Docker Run LiteLLM Proxy


```shell
docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.38.7-stable
```



Don't want to maintain your internal proxy? get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 96 | 117.54187770999512 | 6.456232729004693 | 0.0 | 1931 | 0 | 80.74312700000519 | 802.6662359999932 |
| Aggregated | Passed ✅ | 96 | 117.54187770999512 | 6.456232729004693 | 0.0 | 1931 | 0 | 80.74312700000519 | 802.6662359999932 |

1.38.7

🔥 [Fix] Set `budget_duration` on `/team/new` and `/team/update`

🔥 [Feat] Support for resetting team budgets on `budget_reset_at`: https://docs.litellm.ai/docs/proxy/users
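
Combined with the `budget_duration` fix above, team budgets can now roll over automatically. The sketch below is not from the release notes; it creates a team through the proxy admin API with a 30-day budget window, and the proxy URL, master key, team alias, and budget values are placeholder assumptions.

```python
# Hypothetical example: create a team whose budget resets every 30 days.
# The proxy URL, master key, team alias, and amounts are placeholders.
import requests

resp = requests.post(
    "http://localhost:4000/team/new",
    headers={"Authorization": "Bearer sk-1234"},  # proxy master key (placeholder)
    json={
        "team_alias": "research-team",  # hypothetical team name
        "max_budget": 50.0,             # spend cap in USD for the window
        "budget_duration": "30d",       # window length; budget_reset_at is derived from this
    },
)
print(resp.json())
```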

⚒️ [Feature] Attach the LiteLLM exception type in the error string (e.g. ContentPolicyViolation, AuthenticationError)
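
As a rough illustration of what this looks like from client code, the sketch below deliberately triggers an authentication failure and inspects the error string; the model, prompt, and key are placeholders, and the exact error-string format may differ.

```python
# Sketch: catch a LiteLLM exception and inspect the error string.
# The bad API key is intentional; model and prompt are placeholders.
import litellm

try:
    litellm.completion(
        model="gpt-3.5-turbo",
        messages=[{"role": "user", "content": "hello"}],
        api_key="invalid-key",  # force an AuthenticationError
    )
except litellm.exceptions.AuthenticationError as err:
    # With this release, the LiteLLM exception type is attached to the error
    # string, so logs alone are enough to tell an AuthenticationError from,
    # say, a ContentPolicyViolation.
    print(str(err))
```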

📧 [Docs] Setting up email notifications: https://docs.litellm.ai/docs/proxy/email


![pika-1716692441692-1x](https://github.com/BerriAI/litellm/assets/29436595/ad275b74-fd94-4724-a881-8783d2155918)



What's Changed
* [Feat] - Admin UI - New Activity Tab by ishaan-jaff in https://github.com/BerriAI/litellm/pull/3836
* [Feat] Ui Enforce premium features on ui by ishaan-jaff in https://github.com/BerriAI/litellm/pull/3840
* fix(proxy_server.py): fix model check for `/v1/models` + `/model/info` endpoint when team has restricted access by krrishdholakia in https://github.com/BerriAI/litellm/pull/3839
* [Fix] Set budget_duration on `/team/new` and `/team/update` by ishaan-jaff in https://github.com/BerriAI/litellm/pull/3842
* [Feat] Reset Team Budgets on `budget_reset_at` by ishaan-jaff in https://github.com/BerriAI/litellm/pull/3843
* [Feature]: Attach litellm exception in error string by ishaan-jaff in https://github.com/BerriAI/litellm/pull/3824
* docs- email notifs by ishaan-jaff in https://github.com/BerriAI/litellm/pull/3845

**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.38.5...v1.38.7



Docker Run LiteLLM Proxy


```shell
docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.38.7
```



Don't want to maintain your internal proxy? get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 110.0 | 127.76486134384693 | 6.465849619551454 | 0.0 | 1934 | 0 | 97.91651000000456 | 1353.8686059999918 |
| Aggregated | Passed ✅ | 110.0 | 127.76486134384693 | 6.465849619551454 | 0.0 | 1934 | 0 | 97.91651000000456 | 1353.8686059999918 |

1.38.5

What's Changed
* Add Opus model by tjandy98 in https://github.com/BerriAI/litellm/pull/3832
* [Feat] Admin UI - View Spend Per Provider by ishaan-jaff in https://github.com/BerriAI/litellm/pull/3835

New Contributors
* tjandy98 made their first contribution in https://github.com/BerriAI/litellm/pull/3832

**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.38.4...v1.38.5



Docker Run LiteLLM Proxy


```shell
docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.38.5
```



Don't want to maintain your internal proxy? get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 71 | 92.62181647768831 | 6.587926333577433 | 0.0 | 1972 | 0 | 59.4076170000335 | 1686.8908579999697 |
| Aggregated | Passed ✅ | 71 | 92.62181647768831 | 6.587926333577433 | 0.0 | 1972 | 0 | 59.4076170000335 | 1686.8908579999697 |

v1.38.4-stable
**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.38.3...v1.38.4-stable



Docker Run LiteLLM Proxy


```shell
docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.38.4-stable
```



Don't want to maintain your internal proxy? get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Failed ❌ | 4 | 6.111940249999265 | 1.4964333982183433 | 1.4964333982183433 | 448 | 448 | 2.8791880000085257 | 178.4691419999831 |
| /health/liveliness | Failed ❌ | 4 | 6.141000357172872 | 15.692509162566466 | 15.692509162566466 | 4698 | 4698 | 2.501442999999881 | 1469.4073329999924 |
| /health/readiness | Failed ❌ | 4 | 5.944122888232317 | 15.839480299891482 | 15.839480299891482 | 4742 | 4742 | 2.3691970000072615 | 1220.683407000024 |
| Aggregated | Failed ❌ | 4 | 6.045266954489857 | 33.028422860676294 | 33.028422860676294 | 9888 | 9888 | 2.3691970000072615 | 1469.4073329999924 |

1.38.4

**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.38.3...v1.38.4



Docker Run LiteLLM Proxy


```shell
docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.38.4
```



Don't want to maintain your internal proxy? get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Failed ❌ | 8 | 10.442527492812866 | 1.6264559424753455 | 1.6264559424753455 | 487 | 487 | 6.631490000017948 | 205.98868900000866 |
| /health/liveliness | Failed ❌ | 8 | 10.048330618204664 | 15.666745022899066 | 15.666745022899066 | 4691 | 4691 | 6.471762000018089 | 410.1658490000091 |
| /health/readiness | Failed ❌ | 8 | 10.116838345250969 | 15.506437250334761 | 15.506437250334761 | 4643 | 4643 | 6.338718999984394 | 276.63078300000166 |
| Aggregated | Failed ❌ | 8 | 10.100265783117646 | 32.79963821570917 | 32.79963821570917 | 9821 | 9821 | 6.338718999984394 | 410.1658490000091 |

1.38.3

What's Changed
* Add return_exceptions to batch_completion (retry) by ffreemt in https://github.com/BerriAI/litellm/pull/3462 (see the sketch after this list)
* Fix issue with delta being None when Deferred / Async Content Filter is enabled on Azure OpenAI by afbarbaro in https://github.com/BerriAI/litellm/pull/3812
* docs - using vllm with litellm proxy server by ishaan-jaff in https://github.com/BerriAI/litellm/pull/3822
* Log errors in Traceloop Integration by nirga in https://github.com/BerriAI/litellm/pull/3780
* [Feat] Enterprise - Send Email Alerts when user, key crosses budget by ishaan-jaff in https://github.com/BerriAI/litellm/pull/3826
* fix(slack_alerting.py): support region based outage alerting by krrishdholakia in https://github.com/BerriAI/litellm/pull/3828
* [Feat] - send Email alerts when making new key by ishaan-jaff in https://github.com/BerriAI/litellm/pull/3829
* Revert "Log errors in Traceloop Integration" by ishaan-jaff in https://github.com/BerriAI/litellm/pull/3831
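
For the `return_exceptions` change in #3462 (first item in the list above), here is a minimal sketch of the intended usage: with the flag set, a failed request is returned in its slot as an exception object instead of aborting the whole batch. The model name and prompts are placeholders.

```python
# Sketch of batch_completion with return_exceptions=True (added in #3462).
# Model and prompts are placeholders.
import litellm

responses = litellm.batch_completion(
    model="gpt-3.5-turbo",
    messages=[
        [{"role": "user", "content": "What is 2 + 2?"}],
        [{"role": "user", "content": "Name a prime number."}],
    ],
    return_exceptions=True,  # failed calls come back as exception objects, not raised
)
for r in responses:
    if isinstance(r, Exception):
        print("request failed:", r)
    else:
        print(r.choices[0].message.content)
```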


**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.38.2...v1.38.3



Docker Run LiteLLM Proxy


```shell
docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.38.3
```



Don't want to maintain your internal proxy? get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Failed ❌ | 22 | 24.895043136564478 | 1.5163419416402713 | 1.5163419416402713 | 454 | 454 | 17.791474000006247 | 181.73625299999685 |
| /health/liveliness | Failed ❌ | 21 | 24.081051182476717 | 15.667753410208178 | 15.667753410208178 | 4691 | 4691 | 17.23909200001117 | 1213.6252360000128 |
| /health/readiness | Failed ❌ | 21 | 24.068655364948725 | 15.49407547856656 | 15.49407547856656 | 4639 | 4639 | 17.314572000003636 | 1084.6369509999931 |
| Aggregated | Failed ❌ | 21 | 24.11294490177805 | 32.678170830415006 | 32.678170830415006 | 9784 | 9784 | 17.23909200001117 | 1213.6252360000128 |

1.38.2

What's Changed
* [Feat]- Proxy Add OpenAI Content Moderation Pre call hook by ishaan-jaff in https://github.com/BerriAI/litellm/pull/3802
* Athina callback now handles streaming mode by vivek-athina in https://github.com/BerriAI/litellm/pull/3071
* [Fix] async_post_call_streaming_hook not triggered on proxy server by ishaan-jaff in https://github.com/BerriAI/litellm/pull/3797
* [Feat] Add Lakera AI Prompt Injection Detection by ishaan-jaff in https://github.com/BerriAI/litellm/pull/3803
* [Feat] Add Cost Tracking for `vertex_ai/imagegeneration006` by ishaan-jaff in https://github.com/BerriAI/litellm/pull/3809
* [Feat] LiteLLM Proxy - Require Enterprise License Key for Premium Guardrail Features by ishaan-jaff in https://github.com/BerriAI/litellm/pull/3806
* Added script for auto updating the pricing with the pricings of openrouter.ai by vietpham1911 in https://github.com/BerriAI/litellm/pull/3630
* build(model_prices_and_context_window.json): update Anyscale model list by danielbichuetti in https://github.com/BerriAI/litellm/pull/3807
* fix(factory.py): Ollama vision fix. by miesgre in https://github.com/BerriAI/litellm/pull/3792
* feat(databricks.py): adds databricks support - completion, embeddings, async, streaming by krrishdholakia in https://github.com/BerriAI/litellm/pull/3808
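
For the new Databricks support in the last item above, here is a minimal sketch of a completion call; the model name, token, and workspace URL are placeholders, so check the LiteLLM Databricks docs for the exact values.

```python
# Sketch: completion against a Databricks serving endpoint via LiteLLM.
# The "databricks/" prefix routes to the new provider; the token and
# workspace URL below are placeholders.
import litellm

response = litellm.completion(
    model="databricks/databricks-dbrx-instruct",
    messages=[{"role": "user", "content": "Hello!"}],
    api_key="dapi-...",  # Databricks personal access token (placeholder)
    api_base="https://<your-workspace>.cloud.databricks.com/serving-endpoints",
)
print(response.choices[0].message.content)
```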

New Contributors
* vietpham1911 made their first contribution in https://github.com/BerriAI/litellm/pull/3630
* danielbichuetti made their first contribution in https://github.com/BerriAI/litellm/pull/3807
* miesgre made their first contribution in https://github.com/BerriAI/litellm/pull/3792

**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.38.1...v1.38.2



Docker Run LiteLLM Proxy


```shell
docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.38.2
```



Don't want to maintain your internal proxy? get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Failed ❌ | 22 | 22.735412932242443 | 1.4295949356664561 | 1.4295949356664561 | 428 | 428 | 17.413049000026604 | 110.43518400003904 |
| /health/liveliness | Failed ❌ | 21 | 23.353646324823114 | 15.568556063414373 | 15.568556063414373 | 4661 | 4661 | 16.87973199989301 | 1207.890811000027 |
| /health/readiness | Failed ❌ | 21 | 23.329244846860398 | 15.638699740164363 | 15.638699740164363 | 4682 | 4682 | 16.783239999995203 | 453.8819480000029 |
| Aggregated | Failed ❌ | 21 | 23.314873260464807 | 32.63685073924519 | 32.63685073924519 | 9771 | 9771 | 16.783239999995203 | 1207.890811000027 |
