LiteLLM

Latest version: v1.65.1


1.53.1

Not secure
🚨 Relevant Changes:
- LiteLLM Proxy Virtual Keys: Unique Key Aliases will be enforced on /key/generate and /key/update requests
- datadog integration will use [StandardLoggingPayload](https://docs.litellm.ai/docs/proxy/logging#what-gets-logged) (from LiteLLM v1.53.0+) & also supports logging failures https://github.com/BerriAI/litellm/pull/6929

If you need to use the v1 payload (not recommended), you can set this in your config:

```yaml
litellm_settings:
  datadog_use_v1: True
```

Benefits of using StandardLoggingPayload for datadog
- It's a standard logging object so should be consistent over time across our logging integrations
- Added support for logging LLM failures
- Has additional info like `cache_hit`, `request_tags`, etc. Full payload is here: https://docs.litellm.ai/docs/proxy/logging#what-gets-logged
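For reference, enabling the datadog integration in the proxy config is a one-line change per callback. A minimal sketch, following the callback names in the LiteLLM logging docs:

```yaml
litellm_settings:
  # send both successful and failed LLM calls to datadog
  success_callback: ["datadog"]
  failure_callback: ["datadog"]
```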



What's Changed
* LiteLLM Minor Fixes & Improvements (11/24/2024) by krrishdholakia in https://github.com/BerriAI/litellm/pull/6890
* (feat) pass through llm endpoints - add `PATCH` support (vertex context caching requires for update ops) by ishaan-jaff in https://github.com/BerriAI/litellm/pull/6924
* sonnet supports pdf, haiku does not by paul-gauthier in https://github.com/BerriAI/litellm/pull/6928
* (feat) DataDog Logger - Add Failure logging + use Standard Logging payload by ishaan-jaff in https://github.com/BerriAI/litellm/pull/6929
* (feat) log proxy auth errors on datadog by ishaan-jaff in https://github.com/BerriAI/litellm/pull/6931
* (feat) Allow using include to include external YAML files in a config.yaml by ishaan-jaff in https://github.com/BerriAI/litellm/pull/6922
* (feat) dd logger - set tags according to the values set by those env vars by ishaan-jaff in https://github.com/BerriAI/litellm/pull/6933
* LiteLLM Minor Fixes & Improvements (11/26/2024) by krrishdholakia in https://github.com/BerriAI/litellm/pull/6913
* LiteLLM Minor Fixes & Improvements (11/27/2024) by krrishdholakia in https://github.com/BerriAI/litellm/pull/6943
* Update Argilla integration documentation by sdiazlor in https://github.com/BerriAI/litellm/pull/6923
* (bug fix) /key/update was not storing `budget_duration` in the DB by ishaan-jaff in https://github.com/BerriAI/litellm/pull/6941
* (fix) handle json decode errors for DD exception logging by ishaan-jaff in https://github.com/BerriAI/litellm/pull/6934
* (docs + fix) Add docs on Moderations endpoint, Text Completion by ishaan-jaff in https://github.com/BerriAI/litellm/pull/6947
* (feat) add enforcement for unique key aliases on /key/update and /key/generate by ishaan-jaff in https://github.com/BerriAI/litellm/pull/6944
* (fix) tag merging / aggregation logic by ishaan-jaff in https://github.com/BerriAI/litellm/pull/6932
* (feat) Allow disabling ErrorLogs written to the DB by ishaan-jaff in https://github.com/BerriAI/litellm/pull/6940
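With the unique key alias enforcement above, a second `/key/generate` call that reuses an alias should now be rejected. A hypothetical example against a local proxy (the alias, port, and master key are placeholders for your own setup):

```shell
# first request with this alias succeeds
curl -X POST 'http://localhost:4000/key/generate' \
  -H 'Authorization: Bearer sk-1234' \
  -H 'Content-Type: application/json' \
  -d '{"key_alias": "prod-app-1"}'

# reusing the same alias should now return an error
curl -X POST 'http://localhost:4000/key/generate' \
  -H 'Authorization: Bearer sk-1234' \
  -H 'Content-Type: application/json' \
  -d '{"key_alias": "prod-app-1"}'
```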

New Contributors
* sdiazlor made their first contribution in https://github.com/BerriAI/litellm/pull/6923

**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.52.16...v1.53.1



Docker Run LiteLLM Proxy


```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.53.1
```
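Once the container is up, the proxy can be smoke-tested with an OpenAI-style request. A sketch only: the model name and key are placeholders for whatever you configure on the proxy:

```shell
curl http://localhost:4000/chat/completions \
  -H 'Authorization: Bearer sk-1234' \
  -H 'Content-Type: application/json' \
  -d '{"model": "gpt-3.5-turbo", "messages": [{"role": "user", "content": "hello"}]}'
```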



Don't want to maintain your internal proxy? get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 200.0 | 215.7709455547284 | 6.292082946554957 | 0.0 | 1882 | 0 | 178.3981389999667 | 2851.1550680000255 |
| Aggregated | Passed ✅ | 200.0 | 215.7709455547284 | 6.292082946554957 | 0.0 | 1882 | 0 | 178.3981389999667 | 2851.1550680000255 |

v1.52.15.staging1
**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.52.15...v1.52.15.staging1



Docker Run LiteLLM Proxy


```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.52.15.staging1
```



Don't want to maintain your internal proxy? get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 250.0 | 278.6646185965574 | 6.211416620977041 | 0.0033412676820747935 | 1859 | 1 | 217.41687699994827 | 3149.612769999976 |
| Aggregated | Passed ✅ | 250.0 | 278.6646185965574 | 6.211416620977041 | 0.0033412676820747935 | 1859 | 1 | 217.41687699994827 | 3149.612769999976 |

v1.52.15-stable
**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.52.15...v1.52.15-stable



Docker Run LiteLLM Proxy


```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:litellm_stable_nov27-stable
```



Don't want to maintain your internal proxy? get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 200.0 | 221.52349298020383 | 6.244722043862887 | 0.0 | 1869 | 0 | 181.6640519999737 | 2200.3593760000513 |
| Aggregated | Passed ✅ | 200.0 | 221.52349298020383 | 6.244722043862887 | 0.0 | 1869 | 0 | 181.6640519999737 | 2200.3593760000513 |

1.53.1.dev1

Not secure
**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.53.1...v1.53.1.dev1



Docker Run LiteLLM Proxy


```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.53.1.dev1
```



Don't want to maintain your internal proxy? get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 250.0 | 285.15501328346113 | 6.138794444114975 | 0.0 | 1838 | 0 | 223.90917799998533 | 2684.1706850000264 |
| Aggregated | Passed ✅ | 250.0 | 285.15501328346113 | 6.138794444114975 | 0.0 | 1838 | 0 | 223.90917799998533 | 2684.1706850000264 |

1.52.16

Not secure
What's Changed
* feat - allow sending `tags` on vertex pass through requests by ishaan-jaff in https://github.com/BerriAI/litellm/pull/6876
* (feat) Add support for using google/generative-ai JS with LiteLLM Proxy by ishaan-jaff in https://github.com/BerriAI/litellm/pull/6899
* (UI fix) UI does not reload when you login / open a new tab by ishaan-jaff in https://github.com/BerriAI/litellm/pull/6909
* (fix) pass through endpoints - run logging async + use thread pool executor for sync logging callbacks by ishaan-jaff in https://github.com/BerriAI/litellm/pull/6907
* (redis fix) - fix `AbstractConnection.__init__() got an unexpected keyword argument 'ssl'` by ishaan-jaff in https://github.com/BerriAI/litellm/pull/6908
* (docs) Simplify `/vertex_ai/` pass through docs by ishaan-jaff in https://github.com/BerriAI/litellm/pull/6910


**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.52.15...v1.52.16



Docker Run LiteLLM Proxy


```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.52.16
```



Don't want to maintain your internal proxy? get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 230.0 | 261.12838541230485 | 6.194778256251999 | 0.0 | 1853 | 0 | 206.21302299997524 | 2167.8605710000056 |
| Aggregated | Passed ✅ | 230.0 | 261.12838541230485 | 6.194778256251999 | 0.0 | 1853 | 0 | 206.21302299997524 | 2167.8605710000056 |

1.52.16.dev4

What's Changed
* LiteLLM Minor Fixes & Improvements (11/24/2024) by krrishdholakia in https://github.com/BerriAI/litellm/pull/6890
* (feat) pass through llm endpoints - add `PATCH` support (vertex context caching requires for update ops) by ishaan-jaff in https://github.com/BerriAI/litellm/pull/6924
* sonnet supports pdf, haiku does not by paul-gauthier in https://github.com/BerriAI/litellm/pull/6928
* (feat) DataDog Logger - Add Failure logging + use Standard Logging payload by ishaan-jaff in https://github.com/BerriAI/litellm/pull/6929
* (feat) log proxy auth errors on datadog by ishaan-jaff in https://github.com/BerriAI/litellm/pull/6931
* (feat) Allow using include to include external YAML files in a config.yaml by ishaan-jaff in https://github.com/BerriAI/litellm/pull/6922


**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.52.16...v1.52.16.dev4



Docker Run LiteLLM Proxy


```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.52.16.dev4
```



Don't want to maintain your internal proxy? get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 200.0 | 223.2904107688168 | 6.215640236973433 | 0.0 | 1860 | 0 | 174.75808199998255 | 3944.991313999992 |
| Aggregated | Passed ✅ | 200.0 | 223.2904107688168 | 6.215640236973433 | 0.0 | 1860 | 0 | 174.75808199998255 | 3944.991313999992 |

1.52.16.dev1

What's Changed
* LiteLLM Minor Fixes & Improvements (11/24/2024) by krrishdholakia in https://github.com/BerriAI/litellm/pull/6890
* (feat) pass through llm endpoints - add `PATCH` support (vertex context caching requires for update ops) by ishaan-jaff in https://github.com/BerriAI/litellm/pull/6924
* sonnet supports pdf, haiku does not by paul-gauthier in https://github.com/BerriAI/litellm/pull/6928
* (feat) DataDog Logger - Add Failure logging + use Standard Logging payload by ishaan-jaff in https://github.com/BerriAI/litellm/pull/6929
* (feat) log proxy auth errors on datadog by ishaan-jaff in https://github.com/BerriAI/litellm/pull/6931
* (feat) Allow using include to include external YAML files in a config.yaml by ishaan-jaff in https://github.com/BerriAI/litellm/pull/6922
* (feat) dd logger - set tags according to the values set by those env vars by ishaan-jaff in https://github.com/BerriAI/litellm/pull/6933


**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.52.16...v1.52.16.dev1



Docker Run LiteLLM Proxy


```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.52.16.dev1
```



Don't want to maintain your internal proxy? get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat






Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 250.0 | 285.0974372649336 | 6.039486955708498 | 0.0 | 1808 | 0 | 224.19419400000606 | 3263.23956899995 |
| Aggregated | Passed ✅ | 250.0 | 285.0974372649336 | 6.039486955708498 | 0.0 | 1808 | 0 | 224.19419400000606 | 3263.23956899995 |

1.52.15

Not secure
What's Changed
* (feat) use `google-cloud/vertexai` js sdk with litellm by ishaan-jaff in https://github.com/BerriAI/litellm/pull/6873
* (chore) fix new .js tests running for vertex.js by ishaan-jaff in https://github.com/BerriAI/litellm/pull/6875
* Bump cross-spawn from 7.0.3 to 7.0.6 in /ui/litellm-dashboard by dependabot in https://github.com/BerriAI/litellm/pull/6865
* (Perf / latency improvement) improve pass through endpoint latency to ~50ms (before PR was 400ms) by ishaan-jaff in https://github.com/BerriAI/litellm/pull/6874
* LiteLLM Minor Fixes & Improvements (11/23/2024) by krrishdholakia in https://github.com/BerriAI/litellm/pull/6870
* Litellm dev 11 23 2024 by krrishdholakia in https://github.com/BerriAI/litellm/pull/6881
* docs - have 1 section for routing +load balancing by ishaan-jaff in https://github.com/BerriAI/litellm/pull/6884
* (QOL improvement) Provider budget routing - allow using 1s, 1d, 1mo, 2mo etc by ishaan-jaff in https://github.com/BerriAI/litellm/pull/6885
* (feat) - provider budget improvements - ensure provider budgets work with multiple proxy instances + improve latency to ~90ms by ishaan-jaff in https://github.com/BerriAI/litellm/pull/6886
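The provider budget improvements above are configured under `router_settings`. A minimal sketch, with field names taken from the LiteLLM provider-budget docs and placeholder values:

```yaml
router_settings:
  provider_budget_config:
    openai:
      budget_limit: 100   # max USD spend within the window
      time_period: "1d"   # now also accepts 1s, 1mo, 2mo, etc.
```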


**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.52.14...v1.52.15



Docker Run LiteLLM Proxy


```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.52.15
```



Don't want to maintain your internal proxy? get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Failed ❌ | 280.0 | 454.59782761891177 | 5.830264051408934 | 0.010023376592680114 | 1745 | 3 | 139.27931299997454 | 5766.263976999994 |
| Aggregated | Failed ❌ | 280.0 | 454.59782761891177 | 5.830264051408934 | 0.010023376592680114 | 1745 | 3 | 139.27931299997454 | 5766.263976999994 |

