LiteLLM

Latest version: v1.52.14


1.41.14

Not secure
What's Changed
* Enable `allowed_ips` for proxy by krrishdholakia in https://github.com/BerriAI/litellm/pull/4615


**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.41.13...v1.41.14
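The new `allowed_ips` restriction (PR #4615) is configured on the proxy side. A minimal sketch of what such a config might look like — the exact key name and its placement under `general_settings` are assumptions inferred from the PR title, not verified against the release docs:

```python
# Hypothetical proxy config enabling an IP allowlist (PR #4615).
# Key name and nesting are assumptions; check the litellm proxy docs.
config_yaml = """\
general_settings:
  allowed_ips:            # only these client IPs may reach the proxy
    - "10.0.0.1"
    - "192.168.1.25"
"""

print(config_yaml)
```

Saved as `config.yaml`, this would be passed to the proxy with the standard `litellm --config config.yaml` flag.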



Docker Run LiteLLM Proxy

```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.41.14
```

Don't want to maintain your internal proxy? Get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat
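Once the container above is running, the proxy exposes an OpenAI-compatible `/chat/completions` endpoint on port 4000. A minimal sketch of the request it expects — the model alias and the bearer key are placeholders that depend on your proxy config, not values from this changelog:

```python
import json
import urllib.request

# Placeholder values: model alias and master key come from your proxy setup.
PROXY_URL = "http://localhost:4000/chat/completions"

payload = {
    "model": "gpt-3.5-turbo",
    "messages": [{"role": "user", "content": "Hello from LiteLLM proxy"}],
}

req = urllib.request.Request(
    PROXY_URL,
    data=json.dumps(payload).encode(),
    headers={
        "Content-Type": "application/json",
        "Authorization": "Bearer sk-1234",  # placeholder proxy key
    },
)
# urllib.request.urlopen(req)  # uncomment against a running proxy
```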






Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 100.0 | 134.38986152953535 | 6.392866857345002 | 0.0 | 1913 | 0 | 83.7405220000278 | 2768.6006659999975 |
| Aggregated | Passed ✅ | 100.0 | 134.38986152953535 | 6.392866857345002 | 0.0 | 1913 | 0 | 83.7405220000278 | 2768.6006659999975 |

1.41.14.dev15

What's Changed
* [Feat-Proxy] Add DELETE /assistants by ishaan-jaff in https://github.com/BerriAI/litellm/pull/4645
* [Feat] Add `litellm.delete_assistant` for OpenAI by ishaan-jaff in https://github.com/BerriAI/litellm/pull/4643


**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.41.14.dev10...1.41.14.dev15
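PR #4643 adds `litellm.delete_assistant` alongside the new proxy `DELETE /assistants` route (PR #4645). A sketch of how the SDK call might be invoked — the keyword argument names are assumptions inferred from litellm's other assistants helpers, so verify them against the release docs:

```python
# Hypothetical usage of litellm.delete_assistant (PR #4643).
# Argument names are assumptions; the assistant id is a placeholder.
delete_kwargs = {
    "custom_llm_provider": "openai",  # assistants are OpenAI-backed
    "assistant_id": "asst_abc123",    # placeholder id
}

def delete_assistant_via_litellm(kwargs: dict):
    """Perform the deletion; needs `pip install litellm` and an API key."""
    import litellm
    return litellm.delete_assistant(**kwargs)

# delete_assistant_via_litellm(delete_kwargs)  # uncomment with credentials
```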



Docker Run LiteLLM Proxy

```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-1.41.14.dev15
```

Don't want to maintain your internal proxy? Get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 140.0 | 176.17954028961256 | 6.367851179084963 | 0.0 | 1906 | 0 | 118.41156200000569 | 2945.909352000001 |
| Aggregated | Passed ✅ | 140.0 | 176.17954028961256 | 6.367851179084963 | 0.0 | 1906 | 0 | 118.41156200000569 | 2945.909352000001 |

1.41.14.dev10

**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.41.14.dev8...v1.41.14.dev10



Docker Run LiteLLM Proxy

```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.41.14.dev10
```

Don't want to maintain your internal proxy? Get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 78 | 93.67940596155863 | 6.51911295062805 | 0.0 | 1951 | 0 | 67.06418299995676 | 1762.6460930000007 |
| Aggregated | Passed ✅ | 78 | 93.67940596155863 | 6.51911295062805 | 0.0 | 1951 | 0 | 67.06418299995676 | 1762.6460930000007 |

1.41.14.dev8

What's Changed
* Add empower-functions integration to litellm by liuyl in https://github.com/BerriAI/litellm/pull/3955
* [Fix] Authentication on /thread endpoints on Proxy by ishaan-jaff in https://github.com/BerriAI/litellm/pull/4627
* [Feat] Add support for `litellm.create_assistants` by ishaan-jaff in https://github.com/BerriAI/litellm/pull/4624
* ui - allow setting allowed ip addresses by ishaan-jaff in https://github.com/BerriAI/litellm/pull/4632
* build(deps): bump zipp from 3.18.2 to 3.19.1 by dependabot in https://github.com/BerriAI/litellm/pull/4628

New Contributors
* liuyl made their first contribution in https://github.com/BerriAI/litellm/pull/3955

**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.41.14...v1.41.14.dev8



Docker Run LiteLLM Proxy

```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.41.14.dev8
```

Don't want to maintain your internal proxy? Get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 130.0 | 155.87585467019107 | 6.323476190060047 | 0.0 | 1892 | 0 | 111.48930899997822 | 3038.6415779999825 |
| Aggregated | Passed ✅ | 130.0 | 155.87585467019107 | 6.323476190060047 | 0.0 | 1892 | 0 | 111.48930899997822 | 3038.6415779999825 |

1.41.13

Not secure
What's Changed
* [Fix - Proxy] Raise `type=ProxyErrorTypes.budget_exceeded,` on Exceeded budget errors by ishaan-jaff in https://github.com/BerriAI/litellm/pull/4606
* feat(httpx): Send litellm user-agent version upstream by Manouchehri in https://github.com/BerriAI/litellm/pull/4591
* fix(utils.py): change update to upsert by andresrguzman in https://github.com/BerriAI/litellm/pull/4610
* [Proxy-Fix]: Add /assistants, /threads as OpenAI routes by ishaan-jaff in https://github.com/BerriAI/litellm/pull/4611
* UI fixes - Send custom llm provider when adding a new model by ishaan-jaff in https://github.com/BerriAI/litellm/pull/4609

New Contributors
* andresrguzman made their first contribution in https://github.com/BerriAI/litellm/pull/4610

**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.41.12...v1.41.13



Docker Run LiteLLM Proxy

```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.41.13
```

Don't want to maintain your internal proxy? Get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 140.0 | 155.5983915135419 | 6.414233412452233 | 0.0 | 1920 | 0 | 116.54234000002361 | 1700.940350000053 |
| Aggregated | Passed ✅ | 140.0 | 155.5983915135419 | 6.414233412452233 | 0.0 | 1920 | 0 | 116.54234000002361 | 1700.940350000053 |

1.41.12

Not secure
What's Changed
* fix(vertex_httpx.py): support tool calling w/ streaming for vertex ai + gemini by krrishdholakia in https://github.com/BerriAI/litellm/pull/4579
* fix(router.py): fix setting httpx mounts by krrishdholakia in https://github.com/BerriAI/litellm/pull/4434
* Fix bugs with watsonx embedding/async endpoints by simonsanvil in https://github.com/BerriAI/litellm/pull/4586
* fix - setting rpm/tpm on proxy through admin ui by ishaan-jaff in https://github.com/BerriAI/litellm/pull/4599
* Update helm chart version by lowjiansheng in https://github.com/BerriAI/litellm/pull/4590
* [Enterprise-Feature: Proxy] Track user-ip address in requests & in LiteLLM_SpendLogs by ishaan-jaff in https://github.com/BerriAI/litellm/pull/4603


**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.41.11...v1.41.12
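PR #4591 has litellm stamp its outbound httpx requests with the library version so upstream providers can identify the client. A sketch of the idea — the exact header format is an assumption, and real code would read the version from `litellm.__version__` rather than a literal:

```python
# Sketch of version-stamped request headers (PR #4591).
# Header format is an assumption; litellm may format it differently.
LITELLM_VERSION = "1.41.12"  # placeholder for litellm.__version__

def user_agent_headers(version: str) -> dict:
    """Build headers identifying the litellm client version upstream."""
    return {"User-Agent": f"litellm/{version}"}

headers = user_agent_headers(LITELLM_VERSION)
```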



Docker Run LiteLLM Proxy

```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.41.12
```

Don't want to maintain your internal proxy? Get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 140.0 | 174.2114751289816 | 6.295675154974679 | 0.0 | 1884 | 0 | 119.434291999994 | 1664.4424330000334 |
| Aggregated | Passed ✅ | 140.0 | 174.2114751289816 | 6.295675154974679 | 0.0 | 1884 | 0 | 119.434291999994 | 1664.4424330000334 |

