LiteLLM


1.42.1

What's Changed
* Add Llama 3.1 for Bedrock by Manouchehri in https://github.com/BerriAI/litellm/pull/4848
* (test_embedding.py) - Re-enable embedding test with Azure OIDC. by Manouchehri in https://github.com/BerriAI/litellm/pull/4857
* [Feat] - Support Logging tags on langsmith by ishaan-jaff in https://github.com/BerriAI/litellm/pull/4853
* [Fix-litellm python] Raise correct error for UnsupportedParams Error by ishaan-jaff in https://github.com/BerriAI/litellm/pull/4862
* doc example using litellm proxy with groq by ishaan-jaff in https://github.com/BerriAI/litellm/pull/4864
* feat: add support for friendliai dedicated endpoint by pocca2048 in https://github.com/BerriAI/litellm/pull/4638
* [Feat] Add Groq/llama3.1 by ishaan-jaff in https://github.com/BerriAI/litellm/pull/4871


**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.42.0...v1.42.1



Docker Run LiteLLM Proxy


docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.42.1



Don't want to maintain your internal proxy? Get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 110.0 | 131.39329430193612 | 6.3862073107191915 | 0.0 | 1911 | 0 | 96.32532500006619 | 1137.7997399999913 |
| Aggregated | Passed ✅ | 110.0 | 131.39329430193612 | 6.3862073107191915 | 0.0 | 1911 | 0 | 96.32532500006619 | 1137.7997399999913 |

v1.42.0-stable
What's Changed
* Add Llama 3.1 for Bedrock by Manouchehri in https://github.com/BerriAI/litellm/pull/4848
* (test_embedding.py) - Re-enable embedding test with Azure OIDC. by Manouchehri in https://github.com/BerriAI/litellm/pull/4857
* [Feat] - Support Logging tags on langsmith by ishaan-jaff in https://github.com/BerriAI/litellm/pull/4853


**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.42.0...v1.42.0-stable



Docker Run LiteLLM Proxy


docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.42.0-stable



Don't want to maintain your internal proxy? Get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 100.0 | 121.24051527541529 | 6.441002345574648 | 0.0 | 1928 | 0 | 82.52180400000952 | 2385.7228059999898 |
| Aggregated | Passed ✅ | 100.0 | 121.24051527541529 | 6.441002345574648 | 0.0 | 1928 | 0 | 82.52180400000952 | 2385.7228059999898 |

1.42.0

What's Changed
* Add test for Azure OIDC auth by Manouchehri in https://github.com/BerriAI/litellm/pull/4839
* feat(vertex_ai_llama.py): vertex ai llama3.1 api support by krrishdholakia in https://github.com/BerriAI/litellm/pull/4845 (see the sketch after this list)
* Check existence of multiple views in 1 query by msabramo in https://github.com/BerriAI/litellm/pull/4846
* feat - Add Azure_AI Llama v3.1 API deployments to the model prices json file by elabbarw in https://github.com/BerriAI/litellm/pull/4843
* OIDC azure tests 3 by Manouchehri in https://github.com/BerriAI/litellm/pull/4854
* (test_secret_manager.py) - Improve and add CircleCI v1 test with Amazon. by Manouchehri in https://github.com/BerriAI/litellm/pull/4855
* Fix `test_prompt_factory` flake8 warning by msabramo in https://github.com/BerriAI/litellm/pull/4856


**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.41.28...v1.42.0



Docker Run LiteLLM Proxy


docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.42.0



Don't want to maintain your internal proxy? Get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 140.0 | 166.95795460392316 | 6.303633980794741 | 0.0 | 1886 | 0 | 114.9558199999774 | 3113.0540860000056 |
| Aggregated | Passed ✅ | 140.0 | 166.95795460392316 | 6.303633980794741 | 0.0 | 1886 | 0 | 114.9558199999774 | 3113.0540860000056 |

1.41.28

What's Changed
* feat(auth_checks.py): Allow admin to disable team from turning on/off guardrails by krrishdholakia in https://github.com/BerriAI/litellm/pull/4810
* Braintrust logging integration by krrishdholakia in https://github.com/BerriAI/litellm/pull/4830
* [Feat-Proxy] Disable Logging per Team by ishaan-jaff in https://github.com/BerriAI/litellm/pull/4835
* Fix + Docs - slack alerting separate alerts by webhook url by ishaan-jaff in https://github.com/BerriAI/litellm/pull/4837
* Litellm triton chatcompletion support - Resubmit of 3895 by giritatavarty-8451 in https://github.com/BerriAI/litellm/pull/3905
* (docs): Add OIDC doc. by Manouchehri in https://github.com/BerriAI/litellm/pull/4836
* [Fix-Proxy] Spend Tracking - accept null values for api_base (optional fields) in SpendLogs by ishaan-jaff in https://github.com/BerriAI/litellm/pull/4844
* [Feat] - /v1/messages support usage tracking on spendLogs by ishaan-jaff in https://github.com/BerriAI/litellm/pull/4841
* doc - using anthropic with litellm proxy server by ishaan-jaff in https://github.com/BerriAI/litellm/pull/4838


**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.41.27...v1.41.28



Docker Run LiteLLM Proxy


docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.41.28



Don't want to maintain your internal proxy? Get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 120.0 | 145.0982082533176 | 6.291604409024448 | 0.0 | 1883 | 0 | 102.05236299998433 | 2429.1477409999516 |
| Aggregated | Passed ✅ | 120.0 | 145.0982082533176 | 6.291604409024448 | 0.0 | 1883 | 0 | 102.05236299998433 | 2429.1477409999516 |

1.41.27

What's Changed
* [Fix-Proxy] Allow non admin keys to access /v1/messages Anthropic Routes by ishaan-jaff in https://github.com/BerriAI/litellm/pull/4827
* [Feat] Add Arize AI callbacks on LiteLLM by ishaan-jaff in https://github.com/BerriAI/litellm/pull/4826 (see the sketch after this list)
* Fix errors with docker-compose file by elabbarw in https://github.com/BerriAI/litellm/pull/4821
* fix raise correct provider on streaming content policy violation errors by ishaan-jaff in https://github.com/BerriAI/litellm/pull/4828
* [Feat] - API Endpoints to control logging callbacks per Team by ishaan-jaff in https://github.com/BerriAI/litellm/pull/4831

New Contributors
* elabbarw made their first contribution in https://github.com/BerriAI/litellm/pull/4821

**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.41.26...v1.41.27



Docker Run LiteLLM Proxy


docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.41.27



Don't want to maintain your internal proxy? Get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 130.0 | 146.5514615299953 | 6.406028445312135 | 0.0 | 1917 | 0 | 93.37116099999321 | 1550.851582000007 |
| Aggregated | Passed ✅ | 130.0 | 146.5514615299953 | 6.406028445312135 | 0.0 | 1917 | 0 | 93.37116099999321 | 1550.851582000007 |

1.41.26

What's Changed
* fix(utils.py): support dynamic params for openai-compatible providers by krrishdholakia in https://github.com/BerriAI/litellm/pull/4801
* fix(factory.py): refactor factory to use httpx client by krrishdholakia in https://github.com/BerriAI/litellm/pull/4796
* docs - show how to do spend tracking with OpenAI JS + Proxy by ishaan-jaff in https://github.com/BerriAI/litellm/pull/4805
* fix(user_api_key_auth.py): update valid token cache with updated team object cache by krrishdholakia in https://github.com/BerriAI/litellm/pull/4799
* feat - add mistral `open-codestral-mamba` `open-mistral-nemo` by ishaan-jaff in https://github.com/BerriAI/litellm/pull/4808
* fix(openai.py): drop invalid params if `drop_params: true` for azure ai by krrishdholakia in https://github.com/BerriAI/litellm/pull/4806
* [Ui] add together AI, Mistral, PerplexityAI, OpenRouter models on Admin UI by ishaan-jaff in https://github.com/BerriAI/litellm/pull/4802
* router - use verbose logger when using litellm.Router by ishaan-jaff in https://github.com/BerriAI/litellm/pull/4809
* Revert "[Ui] add together AI, Mistral, PerplexityAI, OpenRouter models on Admin UI " by ishaan-jaff in https://github.com/BerriAI/litellm/pull/4811
* [Feat] Return response headers on `litellm.completion`, `litellm.embedding` by ishaan-jaff in https://github.com/BerriAI/litellm/pull/4807 (see the sketch after this list)
* Fix: use Bedrock region from environment variables before other region definitions by petermuller in https://github.com/BerriAI/litellm/pull/4613
* Revert "Fix: use Bedrock region from environment variables before other region definitions" by krrishdholakia in https://github.com/BerriAI/litellm/pull/4819


**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.41.25...v1.41.26



Docker Run LiteLLM Proxy


docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.41.26



Don't want to maintain your internal proxy? Get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 100.0 | 122.89826346409966 | 6.42178976324838 | 0.0 | 1922 | 0 | 84.74049599999489 | 2107.5484990000177 |
| Aggregated | Passed ✅ | 100.0 | 122.89826346409966 | 6.42178976324838 | 0.0 | 1922 | 0 | 84.74049599999489 | 2107.5484990000177 |

1.41.26.dev1

What's Changed
* [Fix-Proxy] Allow non admin keys to access /v1/messages Anthropic Routes by ishaan-jaff in https://github.com/BerriAI/litellm/pull/4827
* [Feat] Add Arize AI callbacks on LiteLLM by ishaan-jaff in https://github.com/BerriAI/litellm/pull/4826


**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.41.26...v1.41.26.dev1



Docker Run LiteLLM Proxy


docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.41.26.dev1



Don't want to maintain your internal proxy? Get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 79 | 88.6102844019309 | 6.574606664855584 | 0.0 | 1968 | 0 | 65.93751000002612 | 1205.6313560000262 |
| Aggregated | Passed ✅ | 79 | 88.6102844019309 | 6.574606664855584 | 0.0 | 1968 | 0 | 65.93751000002612 | 1205.6313560000262 |

