LiteLLM

Latest version: v1.52.14


1.48.0

What's Changed
* [Testing-Proxy] Add E2E Admin UI testing by ishaan-jaff in https://github.com/BerriAI/litellm/pull/5845
* [UI Fix] List all teams on UI when user is Admin by ishaan-jaff in https://github.com/BerriAI/litellm/pull/5851
* [Feat] SSO - add `provider` in the OpenID field for custom sso by ishaan-jaff in https://github.com/BerriAI/litellm/pull/5849
* [Feat-Proxy] add service accounts backend by ishaan-jaff in https://github.com/BerriAI/litellm/pull/5852
* [Feat UI sso] store 'provider' in user metadata by ishaan-jaff in https://github.com/BerriAI/litellm/pull/5856
* [Feat] Admin UI - Add Service Accounts by ishaan-jaff in https://github.com/BerriAI/litellm/pull/5855
* [Docker-Security Fix] - handle debian issue on docker builds by ishaan-jaff in https://github.com/BerriAI/litellm/pull/5752


**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.47.2...v1.48.0



Docker Run LiteLLM Proxy


```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.48.0
```
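Once the container is up, the proxy serves an OpenAI-compatible API on port 4000. A minimal sketch of building a request against it from Python's standard library — the model name and `sk-placeholder` key below are illustrative placeholders, not values shipped with this release:

```python
import json
import urllib.request

def build_chat_request(base_url, api_key, model, prompt):
    """Construct an OpenAI-style /chat/completions request for the LiteLLM proxy."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{base_url}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            # A proxy virtual key or the master key, depending on your setup.
            "Authorization": f"Bearer {api_key}",
        },
        method="POST",
    )

# Point at the locally running container started above.
req = build_chat_request("http://localhost:4000", "sk-placeholder",
                         "gpt-3.5-turbo", "Hello!")
# With the proxy running, urllib.request.urlopen(req) sends the request.
```

Any OpenAI-compatible client SDK can be pointed at the same base URL instead; the raw request is shown only to make the wire format explicit.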



Don't want to maintain your internal proxy? Get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 150.0 | 175.2223238286623 | 6.325709388512502 | 0.0 | 1891 | 0 | 120.55964699999322 | 5241.7233159999905 |
| Aggregated | Passed ✅ | 150.0 | 175.2223238286623 | 6.325709388512502 | 0.0 | 1891 | 0 | 120.55964699999322 | 5241.7233159999905 |

1.48.0.dev1

What's Changed
* Update the dockerignore file by Jacobh2 in https://github.com/BerriAI/litellm/pull/5863
* [Admin UI - Proxy] Add Deepseek as a provider by ishaan-jaff in https://github.com/BerriAI/litellm/pull/5857

New Contributors
* Jacobh2 made their first contribution in https://github.com/BerriAI/litellm/pull/5863

**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.48.0...v1.48.0.dev1



Docker Run LiteLLM Proxy


```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.48.0.dev1
```



Don't want to maintain your internal proxy? Get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 160.0 | 183.12276772148553 | 6.304506140414986 | 0.0 | 1885 | 0 | 124.05826999997771 | 2527.3185839999996 |
| Aggregated | Passed ✅ | 160.0 | 183.12276772148553 | 6.304506140414986 | 0.0 | 1885 | 0 | 124.05826999997771 | 2527.3185839999996 |

1.47.2

What's Changed
* LiteLLM Minor Fixes & Improvements (09/21/2024) by krrishdholakia in https://github.com/BerriAI/litellm/pull/5819
* Cost tracking improvements by krrishdholakia in https://github.com/BerriAI/litellm/pull/5828


**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.47.1...v1.47.2



Docker Run LiteLLM Proxy


```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.47.2
```



Don't want to maintain your internal proxy? Get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 120.0 | 141.35987647178743 | 6.393635005988893 | 0.0 | 1914 | 0 | 86.15648299996792 | 2773.7144609999973 |
| Aggregated | Passed ✅ | 120.0 | 141.35987647178743 | 6.393635005988893 | 0.0 | 1914 | 0 | 86.15648299996792 | 2773.7144609999973 |

1.47.2.dev5

What's Changed
* [Feat] SSO - add `provider` in the OpenID field for custom sso by ishaan-jaff in https://github.com/BerriAI/litellm/pull/5849
* [Feat-Proxy] add service accounts backend by ishaan-jaff in https://github.com/BerriAI/litellm/pull/5852


**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.47.2.dev4...v1.47.2.dev5



Docker Run LiteLLM Proxy


```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.47.2.dev5
```



Don't want to maintain your internal proxy? Get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 93 | 110.42370158450228 | 6.42332049679032 | 0.0 | 1923 | 0 | 74.2026180000721 | 2848.044042999959 |
| Aggregated | Passed ✅ | 93 | 110.42370158450228 | 6.42332049679032 | 0.0 | 1923 | 0 | 74.2026180000721 | 2848.044042999959 |

1.47.2.dev4

What's Changed
* [Testing-Proxy] Add E2E Admin UI testing by ishaan-jaff in https://github.com/BerriAI/litellm/pull/5845
* [UI Fix] List all teams on UI when user is Admin by ishaan-jaff in https://github.com/BerriAI/litellm/pull/5851


**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.47.2...v1.47.2.dev4



Docker Run LiteLLM Proxy


```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.47.2.dev4
```



Don't want to maintain your internal proxy? Get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 93 | 107.70167930944973 | 6.44480680494056 | 0.0 | 1926 | 0 | 71.48945900007675 | 1084.4464099999982 |
| Aggregated | Passed ✅ | 93 | 107.70167930944973 | 6.44480680494056 | 0.0 | 1926 | 0 | 71.48945900007675 | 1084.4464099999982 |

1.47.2.dev1

What's Changed
* [Testing-Proxy] Add E2E Admin UI testing by ishaan-jaff in https://github.com/BerriAI/litellm/pull/5845


**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.47.2...v1.47.2.dev1



Docker Run LiteLLM Proxy


```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.47.2.dev1
```



Don't want to maintain your internal proxy? Get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 160.0 | 189.30608518545907 | 6.342030419668088 | 0.0 | 1898 | 0 | 122.74810899998556 | 2604.24186299997 |
| Aggregated | Passed ✅ | 160.0 | 189.30608518545907 | 6.342030419668088 | 0.0 | 1898 | 0 | 122.74810899998556 | 2604.24186299997 |
