LiteLLM

Latest version: v1.65.3



Page 17 of 113

1.54.0

Not secure
What's Changed
* (feat) Track `custom_llm_provider` in LiteLLMSpendLogs by ishaan-jaff in https://github.com/BerriAI/litellm/pull/7081
* Add MLflow to the side bar by B-Step62 in https://github.com/BerriAI/litellm/pull/7031
* (bug fix) SpendLogs update DB catch all possible DB errors for retrying by ishaan-jaff in https://github.com/BerriAI/litellm/pull/7082
* (Feat) Add StructuredOutputs support for Fireworks.AI by ishaan-jaff in https://github.com/BerriAI/litellm/pull/7085
* added deepinfra/Meta-Llama-3.1-405B-Instruct to the Model json by AliSayyah in https://github.com/BerriAI/litellm/pull/7084
* (feat) Add created_at and updated_at for LiteLLM_UserTable by ishaan-jaff in https://github.com/BerriAI/litellm/pull/7089
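
The Fireworks.AI structured-outputs entry above maps to litellm's OpenAI-style `response_format` parameter. A minimal sketch of the payload follows; the schema fields and model id are hypothetical examples, and the actual call is commented out since it needs a `FIREWORKS_AI_API_KEY`:

```python
# Sketch: structured outputs with a Fireworks model via litellm.
# The schema below is a hypothetical example; field names are ours.
schema = {
    "type": "object",
    "properties": {
        "city": {"type": "string"},
        "population": {"type": "integer"},
    },
    "required": ["city", "population"],
}

# OpenAI-style response_format payload forwarded to the provider.
response_format = {"type": "json_object", "schema": schema}

# With credentials configured, the call would look roughly like:
# import litellm
# resp = litellm.completion(
#     model="fireworks_ai/accounts/fireworks/models/llama-v3p1-405b-instruct",
#     messages=[{"role": "user", "content": "Largest city in France, as JSON"}],
#     response_format=response_format,
# )

print(sorted(response_format["schema"]["required"]))
```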

New Contributors
* AliSayyah made their first contribution in https://github.com/BerriAI/litellm/pull/7084

**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.53.9...v1.54.0



Docker Run LiteLLM Proxy


```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.54.0
```
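
For a longer-lived setup, the same run can be expressed as a docker-compose service (a sketch mirroring the flags above; service name is ours):

```yaml
# Equivalent docker-compose sketch for the `docker run` command above.
services:
  litellm:
    image: ghcr.io/berriai/litellm:main-v1.54.0
    environment:
      STORE_MODEL_IN_DB: "True"
    ports:
      - "4000:4000"
```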



Don't want to maintain your internal proxy? Get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed βœ… | 200.0 | 220.2003271503722 | 6.29832230581454 | 0.0 | 1882 | 0 | 179.34225999999853 | 1827.969679000006 |
| Aggregated | Passed βœ… | 200.0 | 220.2003271503722 | 6.29832230581454 | 0.0 | 1882 | 0 | 179.34225999999853 | 1827.969679000006 |
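
As a quick sanity check on the aggregated row, dividing the request count by the requests/s rate recovers the approximate test duration (assuming a roughly constant request rate):

```python
# Derive the approximate load-test duration from the 1.54.0 aggregated row.
request_count = 1882
requests_per_s = 6.29832230581454

duration_s = request_count / requests_per_s  # ~299 s, i.e. a ~5 minute run
print(round(duration_s, 1))
```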

1.53.9

Not secure
What's Changed
* LiteLLM Minor Fixes & Improvements (12/06/2024) by krrishdholakia in https://github.com/BerriAI/litellm/pull/7067
* feat(langfuse/): support langfuse prompt management by krrishdholakia in https://github.com/BerriAI/litellm/pull/7073


**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.53.8...v1.53.9



Docker Run LiteLLM Proxy


```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.53.9
```



Don't want to maintain your internal proxy? Get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed βœ… | 230.0 | 262.5035242475303 | 6.0929671578673235 | 0.0 | 1822 | 0 | 209.27508400001216 | 2657.453161000035 |
| Aggregated | Passed βœ… | 230.0 | 262.5035242475303 | 6.0929671578673235 | 0.0 | 1822 | 0 | 209.27508400001216 | 2657.453161000035 |

1.53.8

Not secure
What's Changed
* (UI) Fix viewing home page keys on a new DB by ishaan-jaff in https://github.com/BerriAI/litellm/pull/7042
* build(model_prices_and_context_window.json): add bedrock region model… by krrishdholakia in https://github.com/BerriAI/litellm/pull/7044
* Update SearchBar by yujonglee in https://github.com/BerriAI/litellm/pull/6982
* (fix) litellm router.aspeech by ishaan-jaff in https://github.com/BerriAI/litellm/pull/6962
* (UI) perf improvement - cache internal user tab results by ishaan-jaff in https://github.com/BerriAI/litellm/pull/7054
* (fix) adding public routes when using custom header by ishaan-jaff in https://github.com/BerriAI/litellm/pull/7045
* LiteLLM Minor Fixes & Improvements (12/05/2024) by krrishdholakia in https://github.com/BerriAI/litellm/pull/7051
* feat: Add gemini-exp-1206 model configuration with 2M input tokens by paulmaunders in https://github.com/BerriAI/litellm/pull/7064
* Correct Vertex Embedding Model Data/Prices by emerzon in https://github.com/BerriAI/litellm/pull/7069
* litellm not honoring OPENAI_ORGANIZATION env var by ishaan-jaff in https://github.com/BerriAI/litellm/pull/7066
* Provider Budget Routing - Get Budget, Spend Details by ishaan-jaff in https://github.com/BerriAI/litellm/pull/7063
* Feat - add groq/llama3.3 models by ishaan-jaff in https://github.com/BerriAI/litellm/pull/7076
* (feat) Allow enabling logging message / response for specific virtual keys by ishaan-jaff in https://github.com/BerriAI/litellm/pull/7071
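
The new groq/llama3.3 entry means the model is addressable through litellm's usual `<provider>/<model>` naming scheme. A sketch follows; the exact model id is illustrative, and the real call (commented out) needs a `GROQ_API_KEEY` — rather, a `GROQ_API_KEY` — set in the environment:

```python
# Sketch: addressing a Groq-hosted Llama 3.3 model through litellm's
# "<provider>/<model>" naming convention. The exact model id may differ.
model = "groq/llama-3.3-70b-versatile"

provider, _, model_name = model.partition("/")

# With a GROQ_API_KEY set, the call would look roughly like:
# import litellm
# resp = litellm.completion(
#     model=model,
#     messages=[{"role": "user", "content": "Hello"}],
# )

print(provider)
```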

New Contributors
* paulmaunders made their first contribution in https://github.com/BerriAI/litellm/pull/7064

**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.53.7...v1.53.8



Docker Run LiteLLM Proxy


```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.53.8
```



Don't want to maintain your internal proxy? Get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed βœ… | 220.0 | 252.68788116416482 | 6.227999496173144 | 0.0 | 1864 | 0 | 198.31458400000201 | 2829.406032999941 |
| Aggregated | Passed βœ… | 220.0 | 252.68788116416482 | 6.227999496173144 | 0.0 | 1864 | 0 | 198.31458400000201 | 2829.406032999941 |

1.53.7

Not secure
What's Changed
* LiteLLM Minor Fixes & Improvements (12/05/2024) by krrishdholakia in https://github.com/BerriAI/litellm/pull/7037


**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.53.6...v1.53.7



Docker Run LiteLLM Proxy


```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.53.7
```



Don't want to maintain your internal proxy? Get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed βœ… | 250.0 | 285.5731596761653 | 6.103319742596985 | 0.0 | 1825 | 0 | 229.3374330000688 | 1651.5534569999772 |
| Aggregated | Passed βœ… | 250.0 | 285.5731596761653 | 6.103319742596985 | 0.0 | 1825 | 0 | 229.3374330000688 | 1651.5534569999772 |

1.53.7.dev4

What's Changed
* (UI) Fix viewing home page keys on a new DB by ishaan-jaff in https://github.com/BerriAI/litellm/pull/7042


**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.53.7...v1.53.7.dev4



Docker Run LiteLLM Proxy


```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.53.7.dev4
```



Don't want to maintain your internal proxy? Get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Failed ❌ | 260.0 | 301.42724034818906 | 5.997929238673837 | 0.0 | 1795 | 0 | 228.56110200001467 | 3614.213866 |
| Aggregated | Failed ❌ | 260.0 | 301.42724034818906 | 5.997929238673837 | 0.0 | 1795 | 0 | 228.56110200001467 | 3614.213866 |

1.53.7.dev2

**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.53.7...v1.53.7.dev2



Docker Run LiteLLM Proxy


```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.53.7.dev2
```



Don't want to maintain your internal proxy? Get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Failed ❌ | 270.0 | 309.7098386657411 | 6.030439383998993 | 0.0 | 1804 | 0 | 234.0091040000516 | 3472.105875000011 |
| Aggregated | Failed ❌ | 270.0 | 309.7098386657411 | 6.030439383998993 | 0.0 | 1804 | 0 | 234.0091040000516 | 3472.105875000011 |

v1.53.7-stable
What's Changed
* LiteLLM Minor Fixes & Improvements (12/04/2024) by krrishdholakia in https://github.com/BerriAI/litellm/pull/7037


**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.53.6...v1.53.7-stable



Docker Run LiteLLM Proxy


```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.53.7-stable
```



Don't want to maintain your internal proxy? Get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed βœ… | 250.0 | 287.825675572594 | 6.147983179332712 | 0.0 | 1839 | 0 | 225.9885929999541 | 1840.4691450000428 |
| Aggregated | Passed βœ… | 250.0 | 287.825675572594 | 6.147983179332712 | 0.0 | 1839 | 0 | 225.9885929999541 | 1840.4691450000428 |



Β© 2025 Safety CLI Cybersecurity Inc. All Rights Reserved.