LiteLLM

Latest version: v1.65.1

Safety actively analyzes 723158 Python packages for vulnerabilities to keep your Python projects secure.


Page 18 of 112

1.53.5

Not secure
What's Changed
* LiteLLM Minor Fixes & Improvements (12/03/2024) by krrishdholakia in https://github.com/BerriAI/litellm/pull/7008
* Add prompt caching flag for Azure OpenAI gpt-4o-2024-08-06 by fengjiajie in https://github.com/BerriAI/litellm/pull/7020
* fix: Add credential templates in migration job when using existing DB by stevencrake-nscale in https://github.com/BerriAI/litellm/pull/6792

New Contributors
* fengjiajie made their first contribution in https://github.com/BerriAI/litellm/pull/7020
* stevencrake-nscale made their first contribution in https://github.com/BerriAI/litellm/pull/6792

**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.53.4...v1.53.5



Docker Run LiteLLM Proxy


```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.53.5
```
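
Once the container is running, the proxy exposes an OpenAI-compatible API on port 4000. As a minimal sketch, this is the shape of a `/chat/completions` request body (the model name `gpt-4o` is a placeholder; use whatever model alias is configured on your proxy):

```python
import json

# Minimal OpenAI-compatible request body for the proxy's /chat/completions route.
# "gpt-4o" is a placeholder model alias; substitute one configured on your proxy.
payload = {
    "model": "gpt-4o",
    "messages": [{"role": "user", "content": "Hello from the LiteLLM proxy"}],
}

body = json.dumps(payload)
print(body)
```

Sending this body via `curl -X POST http://localhost:4000/chat/completions -H 'Content-Type: application/json' -d "$body"` (plus an `Authorization` header if the proxy has virtual keys configured) should return an OpenAI-style completion response.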



Don't want to maintain your internal proxy? Get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Failed ❌ | 280.0 | 319.61681502986755 | 6.043486566751137 | 0.0 | 1808 | 0 | 233.45962199999803 | 4589.378371999999 |
| Aggregated | Failed ❌ | 280.0 | 319.61681502986755 | 6.043486566751137 | 0.0 | 1808 | 0 | 233.45962199999803 | 4589.378371999999 |

1.53.4

Not secure
What's Changed
* (QOL fix) - remove duplicate code from datadog logger by ishaan-jaff in https://github.com/BerriAI/litellm/pull/7013
* (UI) Sub 1s Internal User Tab load time by ishaan-jaff in https://github.com/BerriAI/litellm/pull/7007
* (fix) allow gracefully handling DB connection errors on proxy by ishaan-jaff in https://github.com/BerriAI/litellm/pull/7017
* (refactor) - migrate `router.deployment_callback_on_success` to use StandardLoggingPayload by ishaan-jaff in https://github.com/BerriAI/litellm/pull/7015
* (fix) 'utf-8' codec can't encode characters error on OpenAI by ishaan-jaff in https://github.com/BerriAI/litellm/pull/7018


**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.53.3...v1.53.4



Docker Run LiteLLM Proxy


```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.53.4
```



Don't want to maintain your internal proxy? Get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 250.0 | 291.34812296252045 | 6.153959693113714 | 0.0 | 1841 | 0 | 223.70142199997645 | 2984.8669300000097 |
| Aggregated | Passed ✅ | 250.0 | 291.34812296252045 | 6.153959693113714 | 0.0 | 1841 | 0 | 223.70142199997645 | 2984.8669300000097 |

1.53.3

Not secure
What's Changed
* Litellm dev 11 30 2024 by krrishdholakia in https://github.com/BerriAI/litellm/pull/6974
* LiteLLM Minor Fixes & Improvements (12/02/2024) by krrishdholakia in https://github.com/BerriAI/litellm/pull/6994
* Litellm test ci cd by krrishdholakia in https://github.com/BerriAI/litellm/pull/6997
* (fix) logging Auth errors on datadog by ishaan-jaff in https://github.com/BerriAI/litellm/pull/6995
* (fixes) datadog logging - handle 1MB max log size on DD by ishaan-jaff in https://github.com/BerriAI/litellm/pull/6996
* Litellm dbrx structured outputs support by krrishdholakia in https://github.com/BerriAI/litellm/pull/6993


**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.53.2...v1.53.3



Docker Run LiteLLM Proxy


```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.53.3
```



Don't want to maintain your internal proxy? Get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 260.0 | 295.3963784342538 | 6.049806369807933 | 0.0 | 1810 | 0 | 224.3657600000688 | 2447.638761999997 |
| Aggregated | Passed ✅ | 260.0 | 295.3963784342538 | 6.049806369807933 | 0.0 | 1810 | 0 | 224.3657600000688 | 2447.638761999997 |

1.53.3.dev2

What's Changed
* (QOL fix) - remove duplicate code from datadog logger by ishaan-jaff in https://github.com/BerriAI/litellm/pull/7013
* (UI) Sub 1s Internal User Tab load time by ishaan-jaff in https://github.com/BerriAI/litellm/pull/7007


**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.53.3...v1.53.3.dev2



Docker Run LiteLLM Proxy


```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.53.3.dev2
```



Don't want to maintain your internal proxy? Get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 230.0 | 254.71533959127268 | 6.282620253609997 | 0.0 | 1879 | 0 | 197.70822599997473 | 3258.8738069999863 |
| Aggregated | Passed ✅ | 230.0 | 254.71533959127268 | 6.282620253609997 | 0.0 | 1879 | 0 | 197.70822599997473 | 3258.8738069999863 |

1.53.3.dev1

What's Changed
* Litellm dev 11 30 2024 by krrishdholakia in https://github.com/BerriAI/litellm/pull/6974
* LiteLLM Minor Fixes & Improvements (12/02/2024) by krrishdholakia in https://github.com/BerriAI/litellm/pull/6994
* Litellm test ci cd by krrishdholakia in https://github.com/BerriAI/litellm/pull/6997
* (fix) logging Auth errors on datadog by ishaan-jaff in https://github.com/BerriAI/litellm/pull/6995
* (fixes) datadog logging - handle 1MB max log size on DD by ishaan-jaff in https://github.com/BerriAI/litellm/pull/6996
* Litellm dbrx structured outputs support by krrishdholakia in https://github.com/BerriAI/litellm/pull/6993


**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.53.2...v1.53.3-dev1



Docker Run LiteLLM Proxy


```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.53.3-dev1
```



Don't want to maintain your internal proxy? Get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Failed ❌ | 270.0 | 322.8250497930263 | 5.940031623578464 | 0.0 | 1778 | 0 | 227.83484099994666 | 3640.05648899996 |
| Aggregated | Failed ❌ | 270.0 | 322.8250497930263 | 5.940031623578464 | 0.0 | 1778 | 0 | 227.83484099994666 | 3640.05648899996 |

1.53.2

Not secure
What's Changed
* fix(key_management_endpoints.py): support 'tags' param on `/key/update` by krrishdholakia in https://github.com/BerriAI/litellm/pull/6945
* LiteLLM Minor Fixes & Improvements (11/29/2024) by krrishdholakia in https://github.com/BerriAI/litellm/pull/6965
* Update team_endpoints.py by superpoussin22 in https://github.com/BerriAI/litellm/pull/6983


**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.53.1...v1.53.2



Docker Run LiteLLM Proxy


```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.53.2
```



Don't want to maintain your internal proxy? Get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Failed ❌ | 250.0 | 318.2618058818948 | 6.0033656688808605 | 0.003344493408847276 | 1795 | 1 | 225.67902299999787 | 55505.375238 |
| Aggregated | Failed ❌ | 250.0 | 318.2618058818948 | 6.0033656688808605 | 0.003344493408847276 | 1795 | 1 | 225.67902299999787 | 55505.375238 |


© 2025 Safety CLI Cybersecurity Inc. All Rights Reserved.