LiteLLM

Latest version: v1.52.14


1.48.2.dev12

What's Changed
* [docs] updated langfuse integration guide by jannikmaierhoefer in https://github.com/BerriAI/litellm/pull/5921
* Upgrade dependencies in dockerfile by Jacobh2 in https://github.com/BerriAI/litellm/pull/5862
* [Fix Azure AI Studio] drop_params_from_unprocessable_entity_error by ishaan-jaff in https://github.com/BerriAI/litellm/pull/5936

New Contributors
* jannikmaierhoefer made their first contribution in https://github.com/BerriAI/litellm/pull/5921

**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.48.2.dev8...v1.48.2.dev12



Docker Run LiteLLM Proxy

```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.48.2.dev12
```

Don't want to maintain your internal proxy? Get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat
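Once the container is up, the proxy listens on port 4000 and serves an OpenAI-compatible `/chat/completions` route (the same endpoint exercised in the load tests below). A minimal sketch of the request body, assuming a model named `gpt-3.5-turbo` has been configured on your proxy (substitute whatever model alias you actually registered):

```python
import json

# Request body for the proxy's OpenAI-compatible /chat/completions route.
# "gpt-3.5-turbo" is a placeholder model alias -- use one configured on
# your own LiteLLM proxy.
payload = {
    "model": "gpt-3.5-turbo",
    "messages": [{"role": "user", "content": "Hello, proxy!"}],
}

body = json.dumps(payload)
print(body)

# Send it with, for example:
#   curl http://localhost:4000/chat/completions \
#     -H "Content-Type: application/json" \
#     -d "$BODY"
```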

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 98 | 120.99984995412262 | 6.481007707804045 | 0.0 | 1940 | 0 | 68.52679800005035 | 2842.6305009999737 |
| Aggregated | Passed ✅ | 98 | 120.99984995412262 | 6.481007707804045 | 0.0 | 1940 | 0 | 68.52679800005035 | 2842.6305009999737 |

1.48.2.dev10

What's Changed
* Add Llama 3.2 90b model on Vertex AI. by Manouchehri in https://github.com/BerriAI/litellm/pull/5908
* Update litellm helm envconfigmap by Pit-Storm in https://github.com/BerriAI/litellm/pull/5872
* LiteLLM Minor Fixes & Improvements (09/24/2024) by krrishdholakia in https://github.com/BerriAI/litellm/pull/5880


**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.48.2...v1.48.2.dev10



Docker Run LiteLLM Proxy

```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.48.2.dev10
```

Don't want to maintain your internal proxy? Get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat






Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 140.0 | 162.42545019065167 | 6.373516606295465 | 0.0 | 1904 | 0 | 121.55692900000759 | 1051.541860000043 |
| Aggregated | Passed ✅ | 140.0 | 162.42545019065167 | 6.373516606295465 | 0.0 | 1904 | 0 | 121.55692900000759 | 1051.541860000043 |

1.48.2.dev8

What's Changed
* Add Llama 3.2 90b model on Vertex AI. by Manouchehri in https://github.com/BerriAI/litellm/pull/5908
* Update litellm helm envconfigmap by Pit-Storm in https://github.com/BerriAI/litellm/pull/5872
* LiteLLM Minor Fixes & Improvements (09/24/2024) by krrishdholakia in https://github.com/BerriAI/litellm/pull/5880
* LiteLLM Minor Fixes & Improvements (09/25/2024) by krrishdholakia in https://github.com/BerriAI/litellm/pull/5893
* [feat-Prometheus] Track api key alias and api key hash for remaining tokens metric by ishaan-jaff in https://github.com/BerriAI/litellm/pull/5924
* [Fix proxy perf] Use correct cache key when reading from redis cache by ishaan-jaff in https://github.com/BerriAI/litellm/pull/5928
* [Fix] Perf use only async functions for get cache by ishaan-jaff in https://github.com/BerriAI/litellm/pull/5930


**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.48.2...v1.48.2.dev8

1.48.2.dev6

What's Changed
* Add Llama 3.2 90b model on Vertex AI. by Manouchehri in https://github.com/BerriAI/litellm/pull/5908
* Update litellm helm envconfigmap by Pit-Storm in https://github.com/BerriAI/litellm/pull/5872
* LiteLLM Minor Fixes & Improvements (09/24/2024) by krrishdholakia in https://github.com/BerriAI/litellm/pull/5880


**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.48.2...v1.48.2.dev6



Docker Run LiteLLM Proxy

```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.48.2.dev6
```

Don't want to maintain your internal proxy? Get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 110.0 | 132.71091504557205 | 6.450768453748066 | 0.0 | 1931 | 0 | 84.64533000000074 | 3263.52574699996 |
| Aggregated | Passed ✅ | 110.0 | 132.71091504557205 | 6.450768453748066 | 0.0 | 1931 | 0 | 84.64533000000074 | 3263.52574699996 |

1.48.2.dev4

What's Changed
* Add Llama 3.2 90b model on Vertex AI. by Manouchehri in https://github.com/BerriAI/litellm/pull/5908
* Update litellm helm envconfigmap by Pit-Storm in https://github.com/BerriAI/litellm/pull/5872
* LiteLLM Minor Fixes & Improvements (09/24/2024) by krrishdholakia in https://github.com/BerriAI/litellm/pull/5880


**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.48.2...v1.48.2.dev4



Docker Run LiteLLM Proxy

```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.48.2.dev4
```

Don't want to maintain your internal proxy? Get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 140.0 | 172.45997077640794 | 6.2325066164844936 | 0.006683653208026267 | 1865 | 2 | 39.482324000005065 | 1918.3931500000426 |
| Aggregated | Passed ✅ | 140.0 | 172.45997077640794 | 6.2325066164844936 | 0.006683653208026267 | 1865 | 2 | 39.482324000005065 | 1918.3931500000426 |
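The throughput and failure columns in these tables are internally consistent: dividing the request count by requests/s gives the duration of the test run, and dividing the failure count by that duration reproduces the reported failures/s. A quick sanity check on the v1.48.2.dev4 row above:

```python
# Figures taken from the v1.48.2.dev4 load-test table.
requests_per_s = 6.2325066164844936
request_count = 1865
failure_count = 2

# Duration implied by the throughput (~299.2 s test run).
duration_s = request_count / requests_per_s

# Failure rate implied by that duration; should match the
# table's reported 0.006683653208026267 failures/s.
failures_per_s = failure_count / duration_s

print(round(duration_s, 1), failures_per_s)
```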

1.48.1

What's Changed
* Update the dockerignore file by Jacobh2 in https://github.com/BerriAI/litellm/pull/5863
* [Admin UI - Proxy] Add Deepseek as a provider by ishaan-jaff in https://github.com/BerriAI/litellm/pull/5857
* LiteLLM Minor Fixes & Improvements (09/23/2024) (5842) by krrishdholakia in https://github.com/BerriAI/litellm/pull/5858
* [Fix] OTEL - Don't log messages when callback settings disable message logging by ishaan-jaff in https://github.com/BerriAI/litellm/pull/5875
* [Perf Fix] Don't always read from Redis by Default by ishaan-jaff in https://github.com/BerriAI/litellm/pull/5877

New Contributors
* Jacobh2 made their first contribution in https://github.com/BerriAI/litellm/pull/5863

**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.48.0...v1.48.1



Docker Run LiteLLM Proxy

```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.48.1
```

Don't want to maintain your internal proxy? Get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat






Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 97 | 112.91920489921188 | 6.362288677390206 | 0.0 | 1905 | 0 | 73.26003699995454 | 2652.709988999959 |
| Aggregated | Passed ✅ | 97 | 112.91920489921188 | 6.362288677390206 | 0.0 | 1905 | 0 | 73.26003699995454 | 2652.709988999959 |
