LiteLLM


1.49.1.dev2

**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.49.0.dev8...v1.49.1.dev2



Docker Run LiteLLM Proxy


```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.49.1.dev2
```
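
With `STORE_MODEL_IN_DB=True`, models added through the proxy's UI or API are persisted to the database rather than kept only in the config file. Once the container is up, the proxy serves an OpenAI-compatible API on port 4000. Below is a minimal sketch of calling it with the official `openai` Python client; the model name and API key are placeholders that depend on how your proxy is configured:

```python
# Minimal sketch: call a locally running LiteLLM proxy through its
# OpenAI-compatible endpoint. Model name and key are placeholders.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:4000",  # port mapped by `docker run -p 4000:4000`
    api_key="sk-1234",                 # placeholder; use your proxy / virtual key
)

response = client.chat.completions.create(
    model="gpt-3.5-turbo",  # must match a model configured on the proxy
    messages=[{"role": "user", "content": "Hello from the LiteLLM proxy!"}],
)
print(response.choices[0].message.content)
```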



Don't want to maintain your internal proxy? Get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 240.0 | 269.9721247181889 | 6.118754433480718 | 0.0 | 1831 | 0 | 213.69175899997117 | 1557.681434000017 |
| Aggregated | Passed ✅ | 240.0 | 269.9721247181889 | 6.118754433480718 | 0.0 | 1831 | 0 | 213.69175899997117 | 1557.681434000017 |
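
As a sanity check on these numbers: Request Count ÷ Requests/s ≈ 1831 ÷ 6.12 ≈ 299 s, so each load test in this changelog appears to run for roughly five minutes.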

1.49.1.dev1

**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.49.1...v1.49.1.dev1



Docker Run LiteLLM Proxy


```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.49.1.dev1
```



Don't want to maintain your internal proxy? Get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 150.0 | 179.47601920564753 | 6.271742200144674 | 0.0 | 1877 | 0 | 123.88641099994402 | 1308.2563159999836 |
| Aggregated | Passed ✅ | 150.0 | 179.47601920564753 | 6.271742200144674 | 0.0 | 1877 | 0 | 123.88641099994402 | 1308.2563159999836 |

1.49.0

🚨 LiteLLM Proxy DB schema updated: a new table, `LiteLLM_OrganizationMembership`, has been created

What's Changed
* (fix) clean up root repo - move entrypoint.sh and build_admin_ui to /docker by ishaan-jaff in https://github.com/BerriAI/litellm/pull/6110
* (fix) Fix Groq pricing for llama3.1 by kiriloman in https://github.com/BerriAI/litellm/pull/6114
* Fix: Literal AI llm completion logging by willydouhard in https://github.com/BerriAI/litellm/pull/6096
* LiteLLM Minor Fixes & Improvements (10/08/2024) by krrishdholakia in https://github.com/BerriAI/litellm/pull/6119
* (feat proxy) [beta] add support for organization role based access controls by ishaan-jaff in https://github.com/BerriAI/litellm/pull/6112
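
The organization RBAC feature above pairs with the new `LiteLLM_OrganizationMembership` table. Below is a hypothetical sketch of the flow against the proxy's admin API; since the feature is marked beta, the endpoint paths and payload fields here are assumptions inferred from the PR description, not confirmed API, so check the LiteLLM proxy docs before relying on them:

```python
# Hypothetical sketch of the [beta] organization RBAC flow.
# Endpoint paths and payload fields are ASSUMPTIONS, not confirmed API.
import requests

PROXY_URL = "http://localhost:4000"
HEADERS = {"Authorization": "Bearer sk-1234"}  # placeholder master key

# Create an organization (assumed endpoint: /organization/new).
org = requests.post(
    f"{PROXY_URL}/organization/new",
    headers=HEADERS,
    json={"organization_alias": "engineering"},  # assumed field name
).json()

# Add a member with an organization role (assumed endpoint and shape);
# membership rows would land in LiteLLM_OrganizationMembership.
requests.post(
    f"{PROXY_URL}/organization/member_add",
    headers=HEADERS,
    json={
        "organization_id": org.get("organization_id"),
        "member": {"user_id": "user-123", "role": "org_admin"},
    },
)
```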


**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.48.19...v1.49.0



Docker Run LiteLLM Proxy


```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.49.0
```



Don't want to maintain your internal proxy? Get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 89 | 103.77355238552961 | 6.468497554569813 | 0.0 | 1935 | 0 | 71.25175699997044 | 2212.39985699998 |
| Aggregated | Passed ✅ | 89 | 103.77355238552961 | 6.468497554569813 | 0.0 | 1935 | 0 | 71.25175699997044 | 2212.39985699998 |

v1.48.19-stable

What's Changed
* [docs] fix links due to broken list in enterprise features by pradhyumna85 in https://github.com/BerriAI/litellm/pull/6103
* (docs) key based callbacks - add info on behavior by ishaan-jaff in https://github.com/BerriAI/litellm/pull/6107
* (docs) add remaining litellm settings on configs.md doc by ishaan-jaff in https://github.com/BerriAI/litellm/pull/6108
* (clean up) move docker files from root to `docker` folder by ishaan-jaff in https://github.com/BerriAI/litellm/pull/6109
* LiteLLM Minor Fixes & Improvements (10/07/2024) by krrishdholakia in https://github.com/BerriAI/litellm/pull/6101


**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.48.18...v1.48.19-stable



Docker Run LiteLLM Proxy


```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.48.19-stable
```



Don't want to maintain your internal proxy? Get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 160.0 | 190.24152288506397 | 6.308463142159989 | 0.0 | 1888 | 0 | 125.76214700004584 | 1885.6179009999892 |
| Aggregated | Passed ✅ | 160.0 | 190.24152288506397 | 6.308463142159989 | 0.0 | 1888 | 0 | 125.76214700004584 | 1885.6179009999892 |

1.49.0.dev8

What's Changed
* (bug fix proxy ui) Default Team still rendered Even when disabled by ishaan-jaff in https://github.com/BerriAI/litellm/pull/6134
* LiteLLM Minor Fixes & Improvements (10/09/2024) by krrishdholakia in https://github.com/BerriAI/litellm/pull/6139
* (feat) use regex pattern matching for wildcard routing by ishaan-jaff in https://github.com/BerriAI/litellm/pull/6150
* [Feat] Observability integration - Opik by Comet by jverre in https://github.com/BerriAI/litellm/pull/6062
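
Two of the items above are worth a closer look. For wildcard routing, the change swaps prefix matching for compiled regex patterns; here is a minimal sketch with `litellm.Router`, assuming the `provider/*` pattern syntax used in LiteLLM's routing docs:

```python
# Minimal sketch: wildcard routing with litellm.Router.
# The "openai/*" pattern syntax is assumed from the routing docs.
from litellm import Router

router = Router(model_list=[
    {
        "model_name": "openai/*",                # wildcard pattern, matched via regex
        "litellm_params": {"model": "openai/*"},
    },
])

# Any model under the openai/ prefix should resolve through the wildcard route.
response = router.completion(
    model="openai/gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Hi"}],
)
```

For the new Opik integration, logging runs through LiteLLM's callback mechanism. A minimal sketch, assuming the callback is registered under the string "opik" and that credentials are read from `OPIK_API_KEY`; consult the observability docs to confirm:

```python
# Minimal sketch: log completions to Opik by Comet via a callback.
# The "opik" callback name and OPIK_API_KEY env var are assumptions.
import os
import litellm

os.environ["OPIK_API_KEY"] = "..."    # placeholder credentials
os.environ["OPENAI_API_KEY"] = "..."  # placeholder

litellm.success_callback = ["opik"]   # assumed callback identifier

litellm.completion(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Hi"}],
)
```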

New Contributors
* jverre made their first contribution in https://github.com/BerriAI/litellm/pull/6062

**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.49.0...v1.49.0.dev8



Docker Run LiteLLM Proxy


```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.49.0.dev8
```



Don't want to maintain your internal proxy? Get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 170.0 | 201.69235478777898 | 6.124836962754249 | 0.003341427693810283 | 1833 | 1 | 137.23228399999243 | 2469.3121389999533 |
| Aggregated | Passed ✅ | 170.0 | 201.69235478777898 | 6.124836962754249 | 0.003341427693810283 | 1833 | 1 | 137.23228399999243 | 2469.3121389999533 |

1.49.0.dev1

What's Changed
* (bug fix proxy ui) Default Team still rendered Even when disabled by ishaan-jaff in https://github.com/BerriAI/litellm/pull/6134


**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.49.0...v1.49.0.dev1



Docker Run LiteLLM Proxy


```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.49.0.dev1
```



Don't want to maintain your internal proxy? Get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 130.0 | 158.63853836566005 | 6.304478702601712 | 0.0 | 1887 | 0 | 114.53757900005712 | 970.1067440000202 |
| Aggregated | Passed ✅ | 130.0 | 158.63853836566005 | 6.304478702601712 | 0.0 | 1887 | 0 | 114.53757900005712 | 970.1067440000202 |

v1.49.0-stable

🚨 LiteLLM Proxy DB schema updated: a new table, `LiteLLM_OrganizationMembership`, has been created

What's Changed
* (fix) clean up root repo - move entrypoint.sh and build_admin_ui to /docker by ishaan-jaff in https://github.com/BerriAI/litellm/pull/6110
* (fix) Fix Groq pricing for llama3.1 by kiriloman in https://github.com/BerriAI/litellm/pull/6114
* Fix: Literal AI llm completion logging by willydouhard in https://github.com/BerriAI/litellm/pull/6096
* LiteLLM Minor Fixes & Improvements (10/08/2024) by krrishdholakia in https://github.com/BerriAI/litellm/pull/6119
* (feat proxy) [beta] add support for organization role based access controls by ishaan-jaff in https://github.com/BerriAI/litellm/pull/6112


**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.48.19...v1.49.0-stable



Docker Run LiteLLM Proxy


```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.49.0-stable
```



Don't want to maintain your internal proxy? Get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 160.0 | 191.82647967376343 | 6.27807846892522 | 0.0 | 1879 | 0 | 127.06767200000968 | 2462.4399449999714 |
| Aggregated | Passed ✅ | 160.0 | 191.82647967376343 | 6.27807846892522 | 0.0 | 1879 | 0 | 127.06767200000968 | 2462.4399449999714 |

1.48.19

What's Changed
* [docs] fix links due to broken list in enterprise features by pradhyumna85 in https://github.com/BerriAI/litellm/pull/6103
* (docs) key based callbacks - add info on behavior by ishaan-jaff in https://github.com/BerriAI/litellm/pull/6107
* (docs) add remaining litellm settings on configs.md doc by ishaan-jaff in https://github.com/BerriAI/litellm/pull/6108
* (clean up) move docker files from root to `docker` folder by ishaan-jaff in https://github.com/BerriAI/litellm/pull/6109
* LiteLLM Minor Fixes & Improvements (10/07/2024) by krrishdholakia in https://github.com/BerriAI/litellm/pull/6101


**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.48.18...v1.48.19



Docker Run LiteLLM Proxy


```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.48.19
```



Don't want to maintain your internal proxy? Get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 170.0 | 197.18394088461483 | 6.170329569339292 | 0.0 | 1846 | 0 | 127.89470899997468 | 5195.441569000024 |
| Aggregated | Passed ✅ | 170.0 | 197.18394088461483 | 6.170329569339292 | 0.0 | 1846 | 0 | 127.89470899997468 | 5195.441569000024 |
