LiteLLM

Latest version: v1.65.3


1.59.3.dev1

What's Changed
* Deepseek r1 support + watsonx qa improvements by krrishdholakia in https://github.com/BerriAI/litellm/pull/7907
* (Testing) - Add e2e testing for langfuse logging with tags by ishaan-jaff in https://github.com/BerriAI/litellm/pull/7922
* build(deps): bump undici from 6.21.0 to 6.21.1 in /docs/my-website by dependabot in https://github.com/BerriAI/litellm/pull/7902
* (test) add e2e test for proxy with fallbacks + custom fallback message by krrishdholakia in https://github.com/BerriAI/litellm/pull/7933


**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.59.3...v1.59.3.dev1



Docker Run LiteLLM Proxy


```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.59.3.dev1
```
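Once the container above is running, the proxy exposes an OpenAI-compatible API on port 4000. A minimal sketch of calling it with only the standard library; the virtual key `sk-1234` and the model name `gpt-4o` are illustrative placeholders, not values from this changelog:

```python
# Minimal sketch of calling the proxy's OpenAI-compatible
# /chat/completions endpoint. Assumes the container above is running on
# localhost:4000; the key and model name are illustrative placeholders.
import json
import urllib.request


def build_chat_request(model, messages):
    """Build the JSON body for POST /chat/completions."""
    return {"model": model, "messages": messages}


def chat_completion(base_url, api_key, model, messages):
    payload = build_chat_request(model, messages)
    req = urllib.request.Request(
        f"{base_url}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())


# Example (requires a running proxy):
# chat_completion("http://localhost:4000", "sk-1234", "gpt-4o",
#                 [{"role": "user", "content": "hello"}])
```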



Don't want to maintain your internal proxy? Get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed βœ… | 230.0 | 259.29 | 6.07 | 0.0 | 1817 | 0 | 211.11 | 2538.13 |
| Aggregated | Passed βœ… | 230.0 | 259.29 | 6.07 | 0.0 | 1817 | 0 | 211.11 | 2538.13 |
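As a sanity check on load-test figures like those above, the request count divided by the requests/s rate recovers the approximate duration of the run:

```python
# Back-of-envelope check on the load-test table above: total request
# count divided by the requests/s figure gives the approximate run
# duration in seconds.
def approx_duration_s(request_count, requests_per_s):
    return request_count / requests_per_s


# 1817 requests at ~6.07 req/s is roughly a 5-minute run.
print(round(approx_duration_s(1817, 6.074)))
```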

1.59.2

What's Changed
* Litellm dev 01 20 2025 p3 by krrishdholakia in https://github.com/BerriAI/litellm/pull/7890
* (e2e testing + minor refactor) - Virtual Key Max budget check by ishaan-jaff in https://github.com/BerriAI/litellm/pull/7888
* fix(proxy_server.py): fix get model info when litellm_model_id is set + move model analytics to free by krrishdholakia in https://github.com/BerriAI/litellm/pull/7886
* fix: add default credential for azure (7095) by krrishdholakia in https://github.com/BerriAI/litellm/pull/7891
* (Bug fix) - Allow setting `null` for `max_budget`, `rpm_limit`, `tpm_limit` when updating values on a team by ishaan-jaff in https://github.com/BerriAI/litellm/pull/7912
* (fix langfuse tags) - read tags from `StandardLoggingPayload` by ishaan-jaff in https://github.com/BerriAI/litellm/pull/7903
* (Feat) Add x-litellm-overhead-duration-ms and "x-litellm-response-duration-ms" in response from LiteLLM by ishaan-jaff in https://github.com/BerriAI/litellm/pull/7899
* (Code quality) - Ban recursive functions in codebase by ishaan-jaff in https://github.com/BerriAI/litellm/pull/7910
* Litellm dev 01 21 2025 p1 by krrishdholakia in https://github.com/BerriAI/litellm/pull/7898
* (Feat - prometheus) - emit `litellm_overhead_latency_metric` by ishaan-jaff in https://github.com/BerriAI/litellm/pull/7913
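
The new `x-litellm-overhead-duration-ms` and `x-litellm-response-duration-ms` response headers noted above can be read off any HTTP response. A hedged sketch; `read_litellm_latency` is an illustrative helper of ours, not part of LiteLLM:

```python
# Sketch of consuming the x-litellm-*-duration-ms response headers
# listed in the changelog above. The helper is illustrative, not part
# of LiteLLM; it accepts any dict-like header mapping.
def read_litellm_latency(headers):
    def _ms(name):
        value = headers.get(name)
        return float(value) if value is not None else None

    return {
        "overhead_ms": _ms("x-litellm-overhead-duration-ms"),
        "response_ms": _ms("x-litellm-response-duration-ms"),
    }
```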


**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.59.1...v1.59.2



Docker Run LiteLLM Proxy


```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.59.2
```



Don't want to maintain your internal proxy? Get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed βœ… | 250.0 | 277.38 | 6.12 | 0.0 | 1832 | 0 | 225.22 | 1457.68 |
| Aggregated | Passed βœ… | 250.0 | 277.38 | 6.12 | 0.0 | 1832 | 0 | 225.22 | 1457.68 |

1.59.1

What's Changed
* fix(admins.tsx): fix logic for getting base url and create common get… by krrishdholakia in https://github.com/BerriAI/litellm/pull/7854
* Fix: Problem with langfuse_tags when using litellm proxy with langfus… by yuu341 in https://github.com/BerriAI/litellm/pull/7825
* (UI - View Logs Table) - Show country of origin for logs by ishaan-jaff in https://github.com/BerriAI/litellm/pull/7856
* (UI Logs) - add pagination + filtering by key name/team name by ishaan-jaff in https://github.com/BerriAI/litellm/pull/7860
* Revert "Remove UI build output" by ishaan-jaff in https://github.com/BerriAI/litellm/pull/7861
* (Security) Add grype security scan to ci/cd pipeline by ishaan-jaff in https://github.com/BerriAI/litellm/pull/7864
* LiteLLM Minor Fixes & Improvements (01/18/2025) - p1 by krrishdholakia in https://github.com/BerriAI/litellm/pull/7857
* feat(health_check.py): set upperbound for api when making health check call by krrishdholakia in https://github.com/BerriAI/litellm/pull/7865
* add new bedrock stability models & versions to model_prices_and_context_window.json by marty-sullivan in https://github.com/BerriAI/litellm/pull/7869
* Auth checks on invalid fallback models by krrishdholakia in https://github.com/BerriAI/litellm/pull/7871
* JWT Auth - `enforce_rbac` support + UI team view, spend calc fix by krrishdholakia in https://github.com/BerriAI/litellm/pull/7863
* Fix typo Update alerting.md by MonkeyKing44 in https://github.com/BerriAI/litellm/pull/7880
* typo fix README.md by VitalikBerashvili in https://github.com/BerriAI/litellm/pull/7879
* feat: add new together_ai models by theGitNoob in https://github.com/BerriAI/litellm/pull/7882
* fix(fireworks_ai/): fix global disable flag with transform messages h… by krrishdholakia in https://github.com/BerriAI/litellm/pull/7847
* (Feat) `datadog_llm_observability` callback - emit `request_tags` on logs by ishaan-jaff in https://github.com/BerriAI/litellm/pull/7883
* Litellm dev 01 20 2025 p1 by krrishdholakia in https://github.com/BerriAI/litellm/pull/7884

New Contributors
* yuu341 made their first contribution in https://github.com/BerriAI/litellm/pull/7825
* marty-sullivan made their first contribution in https://github.com/BerriAI/litellm/pull/7869
* MonkeyKing44 made their first contribution in https://github.com/BerriAI/litellm/pull/7880
* VitalikBerashvili made their first contribution in https://github.com/BerriAI/litellm/pull/7879
* theGitNoob made their first contribution in https://github.com/BerriAI/litellm/pull/7882

**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.59.0...v1.59.1



Docker Run LiteLLM Proxy


```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.59.1
```



Don't want to maintain your internal proxy? Get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed βœ… | 250.0 | 295.76 | 6.03 | 0.0 | 1805 | 0 | 224.12 | 3576.67 |
| Aggregated | Passed βœ… | 250.0 | 295.76 | 6.03 | 0.0 | 1805 | 0 | 224.12 | 3576.67 |

1.59.0

What's Changed
* Add key & team level budget metric for prometheus by yujonglee in https://github.com/BerriAI/litellm/pull/7831
* fix(key_management_endpoints.py): fix default allowed team member roles by krrishdholakia in https://github.com/BerriAI/litellm/pull/7843
* (UI - View SpendLogs Table) by ishaan-jaff in https://github.com/BerriAI/litellm/pull/7842
* [fix dd llm obs] - use env vars for setting dd tags, service name by ishaan-jaff in https://github.com/BerriAI/litellm/pull/7835
* [Hashicorp - secret manager] - use vault namespace for tls auth by ishaan-jaff in https://github.com/BerriAI/litellm/pull/7834
* QA: ensure all bedrock regional models have same `supported_` as base + Anthropic nested pydantic object support by krrishdholakia in https://github.com/BerriAI/litellm/pull/7844
* 10x Bedrock perf improvement - refactor: make bedrock image transformation requests async by krrishdholakia in https://github.com/BerriAI/litellm/pull/7840
* `/key/delete` - allow team admin to delete team keys by krrishdholakia in https://github.com/BerriAI/litellm/pull/7846
* Improve Proxy Resiliency: Cooldown single-deployment model groups if 100% calls failed in high traffic by krrishdholakia in https://github.com/BerriAI/litellm/pull/7823
* LiteLLM Minor Fixes & Improvements (2024/16/01) by krrishdholakia in https://github.com/BerriAI/litellm/pull/7826
* Remove UI build output by yujonglee in https://github.com/BerriAI/litellm/pull/7849
* Fix invalid base URL error by yujonglee in https://github.com/BerriAI/litellm/pull/7852
* Refactor logs UI by yujonglee in https://github.com/BerriAI/litellm/pull/7851


**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.58.4...v1.59.0



Docker Run LiteLLM Proxy


```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.59.0
```



Don't want to maintain your internal proxy? Get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed βœ… | 250.0 | 285.13 | 6.11 | 0.0 | 1827 | 0 | 224.69 | 2869.61 |
| Aggregated | Passed βœ… | 250.0 | 285.13 | 6.11 | 0.0 | 1827 | 0 | 224.69 | 2869.61 |

1.58.4

What's Changed
* build(pyproject.toml): bump uvicorn dependency requirement + Azure o1 model check fix + Vertex Anthropic headers fix by krrishdholakia in https://github.com/BerriAI/litellm/pull/7773
* Add `gemini/` frequency_penalty + presence_penalty support by krrishdholakia in https://github.com/BerriAI/litellm/pull/7776
* feat(helm): add securityContext and pull policy values to migration job by Hexoplon in https://github.com/BerriAI/litellm/pull/7652
* fix confusing save button label by yujonglee in https://github.com/BerriAI/litellm/pull/7778
* [integrations/lunary] Improve Lunary documentation by hughcrt in https://github.com/BerriAI/litellm/pull/7770
* Fix wrong URL for internal user invitation by yujonglee in https://github.com/BerriAI/litellm/pull/7762
* Update instructor tutorial by Winston-503 in https://github.com/BerriAI/litellm/pull/7784
* (helm) - allow specifying envVars on values.yaml + add helm lint test by ishaan-jaff in https://github.com/BerriAI/litellm/pull/7789
* Fix anthropic pass-through end user tracking + add gemini-2.0-flash-thinking-exp by krrishdholakia in https://github.com/BerriAI/litellm/pull/7772
* Add back in non root image fixes (7781) by krrishdholakia in https://github.com/BerriAI/litellm/pull/7795
* test: initial test to enforce all functions in user_api_key_auth.py h… by krrishdholakia in https://github.com/BerriAI/litellm/pull/7797
* test: initial commit enforcing testing on all anthropic pass through … by krrishdholakia in https://github.com/BerriAI/litellm/pull/7794
* build: bump certifi version - see if that fixes asyncio ssl issue on … by krrishdholakia in https://github.com/BerriAI/litellm/pull/7800
* (datadog llm observability) - fixes + improvements for using `datadog llm observability` logging integration by ishaan-jaff in https://github.com/BerriAI/litellm/pull/7824
* (fix) IBM Watsonx using ZenApiKey by ishaan-jaff in https://github.com/BerriAI/litellm/pull/7821
* (Fix + Testing) - Add `dd-trace-run` to litellm ci/cd pipeline + fix bug caused by `dd-trace` patching OpenAI sdk by ishaan-jaff in https://github.com/BerriAI/litellm/pull/7820
* (security fix) - remove hf model with exposed security token by ishaan-jaff in https://github.com/BerriAI/litellm/pull/7810

New Contributors
* Winston-503 made their first contribution in https://github.com/BerriAI/litellm/pull/7784

**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.58.2...v1.58.4



Docker Run LiteLLM Proxy


```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.58.4
```



Don't want to maintain your internal proxy? Get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed βœ… | 200.0 | 237.22 | 6.13 | 0.0 | 1835 | 0 | 175.96 | 4047.41 |
| Aggregated | Passed βœ… | 200.0 | 237.22 | 6.13 | 0.0 | 1835 | 0 | 175.96 | 4047.41 |

1.58.4.dev1

What's Changed
* build(pyproject.toml): bump uvicorn dependency requirement + Azure o1 model check fix + Vertex Anthropic headers fix by krrishdholakia in https://github.com/BerriAI/litellm/pull/7773
* Add `gemini/` frequency_penalty + presence_penalty support by krrishdholakia in https://github.com/BerriAI/litellm/pull/7776
* feat(helm): add securityContext and pull policy values to migration job by Hexoplon in https://github.com/BerriAI/litellm/pull/7652
* fix confusing save button label by yujonglee in https://github.com/BerriAI/litellm/pull/7778
* [integrations/lunary] Improve Lunary documentation by hughcrt in https://github.com/BerriAI/litellm/pull/7770
* Fix wrong URL for internal user invitation by yujonglee in https://github.com/BerriAI/litellm/pull/7762
* Update instructor tutorial by Winston-503 in https://github.com/BerriAI/litellm/pull/7784
* (helm) - allow specifying envVars on values.yaml + add helm lint test by ishaan-jaff in https://github.com/BerriAI/litellm/pull/7789
* Fix anthropic pass-through end user tracking + add gemini-2.0-flash-thinking-exp by krrishdholakia in https://github.com/BerriAI/litellm/pull/7772
* Add back in non root image fixes (7781) by krrishdholakia in https://github.com/BerriAI/litellm/pull/7795
* test: initial test to enforce all functions in user_api_key_auth.py h… by krrishdholakia in https://github.com/BerriAI/litellm/pull/7797
* test: initial commit enforcing testing on all anthropic pass through … by krrishdholakia in https://github.com/BerriAI/litellm/pull/7794
* build: bump certifi version - see if that fixes asyncio ssl issue on … by krrishdholakia in https://github.com/BerriAI/litellm/pull/7800
* (datadog llm observability) - fixes + improvements for using `datadog llm observability` logging integration by ishaan-jaff in https://github.com/BerriAI/litellm/pull/7824
* (fix) IBM Watsonx using ZenApiKey by ishaan-jaff in https://github.com/BerriAI/litellm/pull/7821
* (Fix + Testing) - Add `dd-trace-run` to litellm ci/cd pipeline + fix bug caused by `dd-trace` patching OpenAI sdk by ishaan-jaff in https://github.com/BerriAI/litellm/pull/7820
* (security fix) - remove hf model with exposed security token by ishaan-jaff in https://github.com/BerriAI/litellm/pull/7810

New Contributors
* Winston-503 made their first contribution in https://github.com/BerriAI/litellm/pull/7784

**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.58.2...v1.58.4.dev1



Docker Run LiteLLM Proxy


```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.58.4.dev1
```



Don't want to maintain your internal proxy? Get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed βœ… | 260.0 | 287.70 | 6.15 | 0.0 | 1839 | 0 | 230.36 | 3404.08 |
| Aggregated | Passed βœ… | 260.0 | 287.70 | 6.15 | 0.0 | 1839 | 0 | 230.36 | 3404.08 |
