LiteLLM

Latest version: v1.52.14


1.41.21

What's Changed
* docs(pass_through.md): Creating custom chat endpoints on proxy by krrishdholakia in https://github.com/BerriAI/litellm/pull/4686
* [UI] Fix Cache Ratio Calc by ishaan-jaff in https://github.com/BerriAI/litellm/pull/4692
* [Fix] Bug - Clear user_id from cache when /user/update is called by ishaan-jaff in https://github.com/BerriAI/litellm/pull/4697
* fix - Raise `BadRequestError` when passing the wrong role by ishaan-jaff in https://github.com/BerriAI/litellm/pull/4693
* Support key-rpm limits on pass-through endpoints by krrishdholakia in https://github.com/BerriAI/litellm/pull/4701


**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.41.20...v1.41.21
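The key-level RPM limits from PR #4701 above are attached to a virtual key when it is generated. A hedged sketch of the request body, assuming the proxy's `/key/generate` route and an `rpm_limit` field (the model name is a placeholder):

```python
import json

# Hypothetical body for POST /key/generate on the LiteLLM proxy.
# rpm_limit caps requests per minute for the generated key; per the
# PR above, the cap is also enforced on pass-through endpoints.
key_request = {
    "models": ["gpt-3.5-turbo"],  # placeholder model list
    "rpm_limit": 60,
}
print(json.dumps(key_request))
```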



Docker Run LiteLLM Proxy


```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.41.21
```



Don't want to maintain your internal proxy? Get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat
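Once the container is up, the proxy exposes an OpenAI-compatible API on port 4000. A minimal request sketch; the model name and API key are placeholders, not values from this changelog:

```python
import json
import urllib.request

# OpenAI-style chat payload for the proxy's /chat/completions route.
payload = {
    "model": "gpt-3.5-turbo",  # any model configured on the proxy
    "messages": [{"role": "user", "content": "Hello"}],
}

request = urllib.request.Request(
    "http://localhost:4000/chat/completions",
    data=json.dumps(payload).encode(),
    headers={
        "Content-Type": "application/json",
        "Authorization": "Bearer sk-1234",  # placeholder virtual key
    },
)
# urllib.request.urlopen(request) would send this to a running proxy.
```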

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 100.0 | 125.96 | 6.42 | 0.0 | 1922 | 0 | 84.87 | 1397.40 |
| Aggregated | Passed ✅ | 100.0 | 125.96 | 6.42 | 0.0 | 1922 | 0 | 84.87 | 1397.40 |

1.41.20

What's Changed
* Fix: Langfuse prompt logging by andreaponti5 in https://github.com/BerriAI/litellm/pull/4673
* [Fix] Reduce Mem Usage - only set ttl for requests to 2 mins by ishaan-jaff in https://github.com/BerriAI/litellm/pull/4681
* show stack trace of 10 files taking up memory by ishaan-jaff in https://github.com/BerriAI/litellm/pull/4682
* [Fix] Mem Util - De Reference when removing from in-memory cache by ishaan-jaff in https://github.com/BerriAI/litellm/pull/4683
* [Feat] Allow safe memory mode by ishaan-jaff in https://github.com/BerriAI/litellm/pull/4684
* [Fix] Proxy Return type=expire_key on expired Key errors by ishaan-jaff in https://github.com/BerriAI/litellm/pull/4685


**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.41.19...v1.41.20



Docker Run LiteLLM Proxy


```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.41.20
```



Don't want to maintain your internal proxy? Get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 130.0 | 157.29 | 6.44 | 0.0 | 1927 | 0 | 115.12 | 1529.91 |
| Aggregated | Passed ✅ | 130.0 | 157.29 | 6.44 | 0.0 | 1927 | 0 | 115.12 | 1529.91 |

1.41.19

What's Changed
* Docs: Miscellaneous cleanup of `docs/my-website/docs/proxy/logging.md` by msabramo in https://github.com/BerriAI/litellm/pull/4651
* Proxy: Add `x-litellm-call-id` HTTP response header by msabramo in https://github.com/BerriAI/litellm/pull/4650
* Update Helicone Docs by colegottdank in https://github.com/BerriAI/litellm/pull/4612
* Helicone Headers & Cohere support by maamalama in https://github.com/BerriAI/litellm/pull/4607
* feat(vertex_httpx.py): Add seed parameter by Manouchehri in https://github.com/BerriAI/litellm/pull/4588
* Flag for PII masking on Logging only by krrishdholakia in https://github.com/BerriAI/litellm/pull/4669

New Contributors
* colegottdank made their first contribution in https://github.com/BerriAI/litellm/pull/4612
* maamalama made their first contribution in https://github.com/BerriAI/litellm/pull/4607

**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.41.18...v1.41.19



Docker Run LiteLLM Proxy


```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.41.19
```



Don't want to maintain your internal proxy? Get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 140.0 | 162.30 | 6.40 | 0.0 | 1915 | 0 | 115.81 | 1323.07 |
| Aggregated | Passed ✅ | 140.0 | 162.30 | 6.40 | 0.0 | 1915 | 0 | 115.81 | 1323.07 |

1.41.18

What's Changed
* [Fix] Model Hub - Show supports vision correctly by ishaan-jaff in https://github.com/BerriAI/litellm/pull/4661
* Add missing space in "Failed Tracking Cost" Slack alert msg by msabramo in https://github.com/BerriAI/litellm/pull/4662
* Remove unnecessary imports by msabramo in https://github.com/BerriAI/litellm/pull/4647
* Fix: Add prisma binary_cache_dir specification to pyproject.toml by freinold in https://github.com/BerriAI/litellm/pull/4640
* Bump braces from 3.0.2 to 3.0.3 in /ui by dependabot in https://github.com/BerriAI/litellm/pull/4665
* [Fix] UI Allow setting custom model names for OpenAI compatible endpoints by ishaan-jaff in https://github.com/BerriAI/litellm/pull/4667
* [UI-Fix] Setting router settings on ui by ishaan-jaff in https://github.com/BerriAI/litellm/pull/4668
* Bump azure-identity from 1.16.0 to 1.16.1 by dependabot in https://github.com/BerriAI/litellm/pull/4666

New Contributors
* freinold made their first contribution in https://github.com/BerriAI/litellm/pull/4640

**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.41.17...v1.41.18



Docker Run LiteLLM Proxy


```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.41.18
```



Don't want to maintain your internal proxy? Get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 130.0 | 156.48 | 6.35 | 0.0 | 1901 | 0 | 112.95 | 952.87 |
| Aggregated | Passed ✅ | 130.0 | 156.48 | 6.35 | 0.0 | 1901 | 0 | 112.95 | 952.87 |

1.41.17

What's Changed
* Anthropic `/v1/messages` endpoint support by krrishdholakia in https://github.com/BerriAI/litellm/pull/4635
* Create helm package and move index.yaml file location by lowjiansheng in https://github.com/BerriAI/litellm/pull/4625
* [Proxy - OTEL] Fix logging DB, Redis Cache Reads by ishaan-jaff in https://github.com/BerriAI/litellm/pull/4656
* Shorter success callbacks from `/health/readiness` by msabramo in https://github.com/BerriAI/litellm/pull/4652
* [Test-Proxy] Otel Traces by ishaan-jaff in https://github.com/BerriAI/litellm/pull/4658
* Litellm linting fixes by krrishdholakia in https://github.com/BerriAI/litellm/pull/4663


**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.41.15...v1.41.17



Docker Run LiteLLM Proxy


```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.41.17
```



Don't want to maintain your internal proxy? Get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 140.0 | 161.99 | 6.34 | 0.0 | 1897 | 0 | 118.68 | 1556.54 |
| Aggregated | Passed ✅ | 140.0 | 161.99 | 6.34 | 0.0 | 1897 | 0 | 118.68 | 1556.54 |

1.41.15

What's Changed
* Add empower-functions integration to litellm by liuyl in https://github.com/BerriAI/litellm/pull/3955
* [Fix] Authentication on /thread endpoints on Proxy by ishaan-jaff in https://github.com/BerriAI/litellm/pull/4627
* [Feat] Add support for `litellm.create_assistants` by ishaan-jaff in https://github.com/BerriAI/litellm/pull/4624
* ui - allow setting allowed ip addresses by ishaan-jaff in https://github.com/BerriAI/litellm/pull/4632
* build(deps): bump zipp from 3.18.2 to 3.19.1 by dependabot in https://github.com/BerriAI/litellm/pull/4628
* [Feat-Proxy] Add DELETE /assistants by ishaan-jaff in https://github.com/BerriAI/litellm/pull/4645
* [Feat] Add `litellm.delete_assistant` for OpenAI by ishaan-jaff in https://github.com/BerriAI/litellm/pull/4643
* [Feat] Add LIST, DELETE, GET `/files` by ishaan-jaff in https://github.com/BerriAI/litellm/pull/4648
* [Feat] Add GET /files endpoint by ishaan-jaff in https://github.com/BerriAI/litellm/pull/4646
* [fix] slack alerting reports - add validation for safe access into attributes by ishaan-jaff in https://github.com/BerriAI/litellm/pull/4642

New Contributors
* liuyl made their first contribution in https://github.com/BerriAI/litellm/pull/3955

**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.41.14...v1.41.15



Docker Run LiteLLM Proxy


```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.41.15
```



Don't want to maintain your internal proxy? Get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 120.0 | 136.09 | 6.37 | 0.0 | 1907 | 0 | 97.52 | 604.28 |
| Aggregated | Passed ✅ | 120.0 | 136.09 | 6.37 | 0.0 | 1907 | 0 | 97.52 | 604.28 |



© 2024 Safety CLI Cybersecurity Inc. All Rights Reserved.