LiteLLM

Latest version: v1.52.14


1.34.5

Not secure
What's Changed
* fix(handle_jwt.py): enable team-based jwt-auth access by krrishdholakia in https://github.com/BerriAI/litellm/pull/2704
* (fix) Remove print statements from append_query_params by antoniomdk in https://github.com/BerriAI/litellm/pull/2697
* Fix Ollama embedding by onukura in https://github.com/BerriAI/litellm/pull/2675
* enable new `/team/disable` endpoint by krrishdholakia in https://github.com/BerriAI/litellm/pull/2705
* feat(llm_guard.py): enable key-specific llm guard check by krrishdholakia in https://github.com/BerriAI/litellm/pull/2706

New Contributors
* antoniomdk made their first contribution in https://github.com/BerriAI/litellm/pull/2697
* onukura made their first contribution in https://github.com/BerriAI/litellm/pull/2675

**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.34.4.dev2...v1.34.5
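
The Ollama embedding fix in #2675 above makes `litellm.embedding` usable against a local Ollama server again. Below is a minimal sketch of that call path; the model name, `api_base`, and sample input are illustrative assumptions, not values taken from the release.

```python
import litellm

# Assumes a local Ollama server is running and already serves an embedding
# model; "nomic-embed-text" and the api_base below are placeholder choices.
response = litellm.embedding(
    model="ollama/nomic-embed-text",
    api_base="http://localhost:11434",
    input=["LiteLLM routes many LLM and embedding APIs behind one interface."],
)

# EmbeddingResponse follows the OpenAI response shape: a list of items,
# each carrying an "embedding" vector.
vector = response.data[0]["embedding"]
print(f"embedding dimensions: {len(vector)}")
```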

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 80 | 91.10 | 1.56 | 0.0 | 468 | 0 | 75.91 | 1370.17 |
| /health/liveliness | Passed ✅ | 66 | 68.46 | 15.34 | 0.0 | 4593 | 0 | 63.58 | 1380.51 |
| /health/readiness | Passed ✅ | 66 | 69.08 | 15.42 | 0.0 | 4616 | 0 | 63.50 | 1384.00 |
| Aggregated | Passed ✅ | 66 | 69.85 | 32.32 | 0.0 | 9677 | 0 | 63.50 | 1384.00 |

1.34.4

Not secure

1.34.4.dev2

What's Changed
* [Feat] Proxy - /cache/flushall - delete all elements from cache by ishaan-jaff in https://github.com/BerriAI/litellm/pull/2702
* (fix) Proxy fix cache control logic - fix `no-store` logic by ishaan-jaff in https://github.com/BerriAI/litellm/pull/2696
* [Fix] remove litellm telemetry from proxy server by ishaan-jaff in https://github.com/BerriAI/litellm/pull/2703


**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.34.4.dev1...v1.34.4.dev2
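
A quick sketch of hitting the new `/cache/flushall` admin route from #2702 on a locally running proxy. The base URL, master key, and the assumption that the route is a body-less POST authenticated with the proxy master key are illustrative guesses, not documented API details.

```python
import requests

# Placeholder deployment values -- substitute your own proxy URL and admin key.
PROXY_BASE_URL = "http://localhost:4000"
MASTER_KEY = "sk-1234"

# Assumption: POST /cache/flushall takes no body and clears every cached
# response held by the proxy's cache backend.
resp = requests.post(
    f"{PROXY_BASE_URL}/cache/flushall",
    headers={"Authorization": f"Bearer {MASTER_KEY}"},
)
resp.raise_for_status()
print(resp.status_code, resp.text)
```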

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 92 | 99.05 | 1.64 | 0.0 | 492 | 0 | 86.46 | 496.77 |
| /health/liveliness | Passed ✅ | 78 | 85.94 | 15.04 | 0.0 | 4502 | 0 | 73.94 | 1486.88 |
| /health/readiness | Passed ✅ | 78 | 84.58 | 15.27 | 0.0033 | 4570 | 1 | 73.80 | 1357.84 |
| Aggregated | Passed ✅ | 79 | 85.96 | 31.95 | 0.0033 | 9564 | 1 | 73.80 | 1486.88 |

1.34.4.dev1

1.34.3

Not secure

1.34.2

What's Changed
* Updating the default Anthropic Official Claude 3 max_tokens to 4096 by Caixiaopig in https://github.com/BerriAI/litellm/pull/2855
* add test for rate limits - Router isn't coroutine safe by CLARKBENHAM in https://github.com/BerriAI/litellm/pull/2798
* [integrations/langfuse] Use packaging over deprecated pkg_resources by nicovank in https://github.com/BerriAI/litellm/pull/2844
* [Feat] Text-Completion-OpenAI - Re-use OpenAI Client by ishaan-jaff in https://github.com/BerriAI/litellm/pull/2877
* re-use Azure OpenAI client for azure text completions by ishaan-jaff in https://github.com/BerriAI/litellm/pull/2878

New Contributors
* Caixiaopig made their first contribution in https://github.com/BerriAI/litellm/pull/2855
* CLARKBENHAM made their first contribution in https://github.com/BerriAI/litellm/pull/2798
* nicovank made their first contribution in https://github.com/BerriAI/litellm/pull/2844

**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.34.29...1.34.2
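
Per #2855 above, Claude 3 models now default to a larger `max_tokens` (4096) when the caller does not set one. Here is a minimal sketch of what that looks like through `litellm.completion`; the model name is illustrative, and the snippet assumes `ANTHROPIC_API_KEY` is set in the environment.

```python
import litellm

messages = [{"role": "user", "content": "Summarize LiteLLM in one sentence."}]

# No max_tokens passed: with this release the Claude 3 default is reported
# to be 4096 output tokens.
default_response = litellm.completion(
    model="claude-3-haiku-20240307",
    messages=messages,
)

# An explicit max_tokens still takes precedence over the default.
short_response = litellm.completion(
    model="claude-3-haiku-20240307",
    messages=messages,
    max_tokens=256,
)

print(default_response.choices[0].message.content)
print(short_response.choices[0].message.content)
```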

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 41 | 44.79 | 1.62 | 0.0 | 484 | 0 | 35.75 | 621.72 |
| /health/liveliness | Passed ✅ | 25 | 26.81 | 15.36 | 0.0 | 4598 | 0 | 23.03 | 367.92 |
| /health/readiness | Passed ✅ | 25 | 26.46 | 15.71 | 0.0 | 4703 | 0 | 23.12 | 304.24 |
| Aggregated | Passed ✅ | 25 | 27.53 | 32.68 | 0.0 | 9785 | 0 | 23.03 | 621.72 |
