LiteLLM

Latest version: v1.65.1


1.57.2

Not secure
What's Changed
* Prompt Management - support router + optional params by krrishdholakia in https://github.com/BerriAI/litellm/pull/7594
* `aiohttp_openai/` fixes - allow using `aiohttp_openai/gpt-4o` by ishaan-jaff in https://github.com/BerriAI/litellm/pull/7598
* (Fix) security of base image by ishaan-jaff in https://github.com/BerriAI/litellm/pull/7620
* Litellm dev 01 07 2025 p1 by krrishdholakia in https://github.com/BerriAI/litellm/pull/7618
* (Feat) soft budget alerts on keys by ishaan-jaff in https://github.com/BerriAI/litellm/pull/7623
* LiteLLM Minor Fixes & Improvement (01/01/2025) - p2 by krrishdholakia in https://github.com/BerriAI/litellm/pull/7615


**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.57.1...v1.57.2



Docker Run LiteLLM Proxy


```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.57.2
```

Don't want to maintain your internal proxy? Get in touch πŸŽ‰
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat
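Once the container is up, the proxy serves an OpenAI-compatible API on port 4000. A minimal sketch of a `/chat/completions` request body follows; the model name `gpt-4o` is a placeholder and must match a model configured on your proxy, and the actual POST is only described in comments since it needs a running instance:

```python
import json

# Build the request body for the proxy's OpenAI-compatible /chat/completions
# endpoint. "gpt-4o" is a placeholder; use a model configured on your proxy.
payload = {
    "model": "gpt-4o",
    "messages": [{"role": "user", "content": "Hello from the LiteLLM proxy"}],
}
body = json.dumps(payload)

# With the container from above running, this body would be POSTed to
# http://localhost:4000/chat/completions with Content-Type: application/json,
# e.g. via curl or an OpenAI SDK client pointed at that base URL.
print(body)
```

Because the proxy mirrors the OpenAI API shape, existing OpenAI SDK clients can generally be pointed at it by changing only the base URL.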

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed βœ… | 190.0 | 212.24 | 6.34 | 0.0 | 1898 | 0 | 174.49 | 3470.60 |
| Aggregated | Passed βœ… | 190.0 | 212.24 | 6.34 | 0.0 | 1898 | 0 | 174.49 | 3470.60 |

1.57.1

Not secure
What's Changed
* (perf) - fixes for aiohttp handler to hit 1K RPS by ishaan-jaff in https://github.com/BerriAI/litellm/pull/7590
* (latency/perf fixes - proxy) - use `async_service_success_hook` by ishaan-jaff in https://github.com/BerriAI/litellm/pull/7591
* (Feat) - allow including dd-trace in litellm base image by ishaan-jaff in https://github.com/BerriAI/litellm/pull/7587
* (proxy perf improvement) - remove redundant `.copy()` operation by ishaan-jaff in https://github.com/BerriAI/litellm/pull/7564
* Refresh VoyageAI models, prices and context by fzowl in https://github.com/BerriAI/litellm/pull/7472
* LiteLLM Minor Fixes & Improvements (01/06/2025) - p3 by krrishdholakia in https://github.com/BerriAI/litellm/pull/7596
* LiteLLM Minor Fixes & Improvements (01/06/2025) - p2 by krrishdholakia in https://github.com/BerriAI/litellm/pull/7597


**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.57.0...v1.57.1



Docker Run LiteLLM Proxy


```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.57.1
```

Don't want to maintain your internal proxy? Get in touch πŸŽ‰
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat






Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed βœ… | 250.0 | 286.97 | 6.04 | 0.0 | 1806 | 0 | 226.67 | 3887.53 |
| Aggregated | Passed βœ… | 250.0 | 286.97 | 6.04 | 0.0 | 1806 | 0 | 226.67 | 3887.53 |

1.57.1.dev1

What's Changed
* Prompt Management - support router + optional params by krrishdholakia in https://github.com/BerriAI/litellm/pull/7594
* `aiohttp_openai/` fixes - allow using `aiohttp_openai/gpt-4o` by ishaan-jaff in https://github.com/BerriAI/litellm/pull/7598


**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.57.1...v1.57.1.dev1



Docker Run LiteLLM Proxy


```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.57.1.dev1
```

Don't want to maintain your internal proxy? Get in touch πŸŽ‰
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed βœ… | 230.0 | 257.33 | 6.19 | 0.0033 | 1852 | 1 | 91.99 | 1634.91 |
| Aggregated | Passed βœ… | 230.0 | 257.33 | 6.19 | 0.0033 | 1852 | 1 | 91.99 | 1634.91 |

1.57.0

Not secure
What's Changed
* (Fix) make sure `init` custom loggers is non blocking by ishaan-jaff in https://github.com/BerriAI/litellm/pull/7554
* (Feat) Hashicorp Secret Manager - Allow storing virtual keys in secret manager by ishaan-jaff in https://github.com/BerriAI/litellm/pull/7549
* Create and view organizations + assign org admins on the Proxy UI by krrishdholakia in https://github.com/BerriAI/litellm/pull/7557
* (perf) fix [PROXY] don't use `f` string in `add_litellm_data_to_request()` by ishaan-jaff in https://github.com/BerriAI/litellm/pull/7558
* fix(groq/chat/transformation.py): fix groq response_format transforma… by krrishdholakia in https://github.com/BerriAI/litellm/pull/7565
* Support deleting keys by key_alias by krrishdholakia in https://github.com/BerriAI/litellm/pull/7552
* (proxy perf improvement) - use `asyncio.create_task` for `service_logger_obj.async_service_success_hook` in pre_call by ishaan-jaff in https://github.com/BerriAI/litellm/pull/7563
* add `fireworks_ai/accounts/fireworks/models/deepseek-v3` by Fredy in https://github.com/BerriAI/litellm/pull/7567
* FriendliAI: Documentation Updates by minpeter in https://github.com/BerriAI/litellm/pull/7517
* Prevent istio injection for db migrations cron job by lowjiansheng in https://github.com/BerriAI/litellm/pull/7513

New Contributors
* Fredy made their first contribution in https://github.com/BerriAI/litellm/pull/7567
* minpeter made their first contribution in https://github.com/BerriAI/litellm/pull/7517

**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.56.10...v1.57.0



Docker Run LiteLLM Proxy


```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.57.0
```

Don't want to maintain your internal proxy? Get in touch πŸŽ‰
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed βœ… | 200.0 | 212.84 | 6.20 | 0.0 | 1854 | 0 | 174.45 | 1346.32 |
| Aggregated | Passed βœ… | 200.0 | 212.84 | 6.20 | 0.0 | 1854 | 0 | 174.45 | 1346.32 |

1.57.0dev1

What's Changed
* (perf) - fixes for aiohttp handler to hit 1K RPS by ishaan-jaff in https://github.com/BerriAI/litellm/pull/7590
* (latency/perf fixes - proxy) - use `async_service_success_hook` by ishaan-jaff in https://github.com/BerriAI/litellm/pull/7591
* (Feat) - allow including dd-trace in litellm base image by ishaan-jaff in https://github.com/BerriAI/litellm/pull/7587
* (proxy perf improvement) - remove redundant `.copy()` operation by ishaan-jaff in https://github.com/BerriAI/litellm/pull/7564
* Refresh VoyageAI models, prices and context by fzowl in https://github.com/BerriAI/litellm/pull/7472
* LiteLLM Minor Fixes & Improvements (01/06/2025) - p3 by krrishdholakia in https://github.com/BerriAI/litellm/pull/7596
* LiteLLM Minor Fixes & Improvements (01/06/2025) - p2 by krrishdholakia in https://github.com/BerriAI/litellm/pull/7597


**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.57.0...v1.57.0-dev1

1.56.10

Not secure
What's Changed
* fix(aws_secret_manager_V2.py): Error reading secret from AWS Secrets … by krrishdholakia in https://github.com/BerriAI/litellm/pull/7541
* Support checking provider-specific `/models` endpoints for available models based on key by krrishdholakia in https://github.com/BerriAI/litellm/pull/7538
* feat(router.py): support request prioritization for text completion c… by krrishdholakia in https://github.com/BerriAI/litellm/pull/7540
* (Fix) - Docker build error with pyproject.toml by ishaan-jaff in https://github.com/BerriAI/litellm/pull/7550
* (Fix) - Slack Alerting , don't send duplicate spend report when used on multi instance settings by ishaan-jaff in https://github.com/BerriAI/litellm/pull/7546
* add `cohere/command-r7b-12-2024` by ishaan-jaff in https://github.com/BerriAI/litellm/pull/7553


**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.56.9...v1.56.10



Docker Run LiteLLM Proxy


```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.56.10
```

Don't want to maintain your internal proxy? Get in touch πŸŽ‰
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed βœ… | 230.0 | 268.33 | 6.22 | 0.0 | 1861 | 0 | 212.36 | 3556.74 |
| Aggregated | Passed βœ… | 230.0 | 268.33 | 6.22 | 0.0 | 1861 | 0 | 212.36 | 3556.74 |
