LiteLLM

Latest version: v1.52.14


v1.37.7

What's Changed
* [Feat] send weekly spend reports by Team/Tag by ishaan-jaff in https://github.com/BerriAI/litellm/pull/3609
* build(deps): bump next from 14.1.0 to 14.1.1 in /ui/litellm-dashboard by dependabot in https://github.com/BerriAI/litellm/pull/3550


**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.37.6...v1.37.7



Docker Run LiteLLM Proxy


```
docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.37.7
```


Don't want to maintain your internal proxy? Get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

v1.37.6

What's Changed
* [Feat] Use csv values for proxy batch completions (OpenAI Python compatible; see the sketch after this list) by ishaan-jaff in https://github.com/BerriAI/litellm/pull/3587
* Add gpt-4o metadata by ConnorDoyle in https://github.com/BerriAI/litellm/pull/3613
* Update FastAPI to update starlette to fix warnings by msabramo in https://github.com/BerriAI/litellm/pull/3601
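
A minimal sketch of the CSV batch-completions feature from PR #3587: pass a comma-separated list of deployed model names in the `model` field and the proxy fans the same request out to each one. The model names, API key, and proxy URL below are placeholders.

```python
import openai

# Point the standard OpenAI client at a running LiteLLM proxy
# (placeholder key and URL).
client = openai.OpenAI(api_key="sk-1234", base_url="http://0.0.0.0:4000")

# A comma-separated `model` value asks the proxy to send the same
# request to every listed deployment and return one response per model.
response = client.chat.completions.create(
    model="gpt-3.5-turbo,claude-3-haiku",  # placeholder deployment names
    messages=[{"role": "user", "content": "Say hello in one word."}],
)
print(response)
```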

New Contributors
* ConnorDoyle made their first contribution in https://github.com/BerriAI/litellm/pull/3613

**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.37.5-stable...v1.37.6

v1.37.5-stable
**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.37.5...v1.37.5-stable



Docker Run LiteLLM Proxy


```
docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.37.5-stable
```



Don't want to maintain your internal proxy? Get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

v1.37.5

What's Changed
* add additional models from openrouter by Merlinvt in https://github.com/BerriAI/litellm/pull/3545
* Initial OIDC support (Google/GitHub/CircleCI -> Amazon Bedrock & Azure OpenAI) by Manouchehri in https://github.com/BerriAI/litellm/pull/3507
* Fix tool calls tracking with Lunary by vincelwt in https://github.com/BerriAI/litellm/pull/3424
* ✨ feat: Add Azure Content-Safety Proxy hooks by Lunik in https://github.com/BerriAI/litellm/pull/3407
* fix(exceptions.py): import openai Exceptions by nobu007 in https://github.com/BerriAI/litellm/pull/3399
* Clarifai-LiteLLM : Added clarifai as LLM Provider. by mogith-pn in https://github.com/BerriAI/litellm/pull/3369
* (fix) Fixed linting and other bugs with watsonx provider by simonsanvil in https://github.com/BerriAI/litellm/pull/3561
* feat(router.py): allow setting model_region in litellm_params by krrishdholakia in https://github.com/BerriAI/litellm/pull/3582
* [UI] Show Token ID/Hash on Admin UI by ishaan-jaff in https://github.com/BerriAI/litellm/pull/3583
* [Litellm Proxy + litellm.Router] - Pass the same message/prompt to N models by ishaan-jaff in https://github.com/BerriAI/litellm/pull/3585
* [Feat] - log metadata on traces + allow users to log metadata when `existing_trace_id` exists by ishaan-jaff in https://github.com/BerriAI/litellm/pull/3581
* Set fake env vars for `client_no_auth` fixture by msabramo in https://github.com/BerriAI/litellm/pull/3588
* [Feat] Proxy + Router - retry on RateLimitErrors when fallbacks or other deployments exist (see the router sketch after this list) by ishaan-jaff in https://github.com/BerriAI/litellm/pull/3590
* Make `test_load_router_config` pass by msabramo in https://github.com/BerriAI/litellm/pull/3589
* feat(bedrock_httpx.py): Make Bedrock-Cohere calls Async + Command-R support by krrishdholakia in https://github.com/BerriAI/litellm/pull/3586
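
The retry-on-RateLimitError behavior from PR #3590 builds on router fallbacks. A minimal sketch, assuming two deployments with placeholder keys: on a rate-limit error for `gpt-3.5-turbo`, the router retries and then falls back to the second deployment instead of failing the request.

```python
from litellm import Router

router = Router(
    model_list=[
        {
            "model_name": "gpt-3.5-turbo",
            "litellm_params": {"model": "gpt-3.5-turbo", "api_key": "sk-..."},  # placeholder
        },
        {
            "model_name": "claude-3-haiku",
            "litellm_params": {"model": "claude-3-haiku-20240307", "api_key": "sk-ant-..."},  # placeholder
        },
    ],
    fallbacks=[{"gpt-3.5-turbo": ["claude-3-haiku"]}],  # fallback route on errors
    num_retries=2,  # retry the original deployment before falling back
)

response = router.completion(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "hi"}],
)
```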

New Contributors
* Merlinvt made their first contribution in https://github.com/BerriAI/litellm/pull/3545
* mogith-pn made their first contribution in https://github.com/BerriAI/litellm/pull/3369

**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.37.3-stable...v1.37.5



Docker Run LiteLLM Proxy


```
docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.37.5
```



Don't want to maintain your internal proxy? Get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat


v1.37.3-stable
**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.37.3...v1.37.3-stable



Docker Run LiteLLM Proxy


```
docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.37.3-stable
```



Don't want to maintain your internal proxy? Get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

v1.37.3

BETA support for Triton Inference Server embeddings 👉 Start here: https://docs.litellm.ai/docs/providers/triton-inference-server
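
A hedged sketch of the beta Triton embeddings support, following the docs linked above; the model name and `api_base` are placeholders for your own Triton Inference Server deployment.

```python
import asyncio
import litellm

async def main():
    # "triton/" selects the provider; the rest is your Triton model name
    # (placeholder below), and api_base points at your server's endpoint.
    response = await litellm.aembedding(
        model="triton/my-embedding-model",
        api_base="http://localhost:8000/triton/embeddings",  # placeholder
        input=["good morning from litellm"],
    )
    print(response.data[0]["embedding"][:5])  # first few embedding values

asyncio.run(main())
```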

⚡️ [Feat] Use team-based callbacks for failure_callbacks: https://docs.litellm.ai/docs/proxy/team_based_routing#logging--caching

🛠️ [Test] Added testing to ensure the Proxy reuses the same OpenAI client after 1 minute

🛠️ [Fix] Fixed an upsert-deployment bug on the LiteLLM Proxy

🔥 Improved LiteLLM-stable load tests - added testing for Azure OpenAI and for running 50+ deployments on a proxy server

🚀 [Feat] Support stream_options on litellm.text_completion

![codeimage-snippet_11 (3)](https://github.com/BerriAI/litellm/assets/29436595/a95ddd98-26b2-4b31-bd40-249549c7e35d)
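
A small sketch of `stream_options` on `litellm.text_completion`, mirroring the OpenAI parameter; with `include_usage`, the final streamed chunk carries token usage. The model name is a placeholder.

```python
import litellm

response = litellm.text_completion(
    model="gpt-3.5-turbo-instruct",  # placeholder model
    prompt="Say this is a test",
    stream=True,
    stream_options={"include_usage": True},
)

# Chunks stream back as they are generated; with include_usage the
# last chunk includes a `usage` field with token counts.
for chunk in response:
    print(chunk)
```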



What's Changed
* [Fix] Upsert deployment bug by ishaan-jaff in https://github.com/BerriAI/litellm/pull/3569
* [Test] Proxy - uses the same OpenAI Client after 1 min by ishaan-jaff in https://github.com/BerriAI/litellm/pull/3570
* [Feat] Use Team based callbacks with litellm.failure_callbacks by ishaan-jaff in https://github.com/BerriAI/litellm/pull/3573
* enforce unique key and team aliases in the ui by powerhouseofthecell in https://github.com/BerriAI/litellm/pull/3572
* Huggingface classifier support by krrishdholakia in https://github.com/BerriAI/litellm/pull/3571
* [Feat] Add Triton Embeddings to LiteLLM by ishaan-jaff in https://github.com/BerriAI/litellm/pull/3577
* fix(langfuse.py): fix logging user_id in trace param on new trace creation by krrishdholakia in https://github.com/BerriAI/litellm/pull/3576


**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.37.2...v1.37.3



Docker Run LiteLLM Proxy


```
docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.37.3
```



Don't want to maintain your internal proxy? Get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

v1.37.2

What's Changed
* feat(proxy_server.py): return litellm version in response headers (see the sketch after this list) by krrishdholakia in https://github.com/BerriAI/litellm/pull/3535
* [Fix] `litellm.completion_cost(model="bedrock/anthropic.claude-instant-v1"..)` by ishaan-jaff in https://github.com/BerriAI/litellm/pull/3534
* [UI] show `End-User` Usage on Usage Tab by ishaan-jaff in https://github.com/BerriAI/litellm/pull/3530
* Add support for async streaming to watsonx provider by simonsanvil in https://github.com/BerriAI/litellm/pull/3479
* feat(proxy_server.py): add CRUD endpoints for 'end_user' management by krrishdholakia in https://github.com/BerriAI/litellm/pull/3536
* Revert "Add support for async streaming to watsonx provider " by krrishdholakia in https://github.com/BerriAI/litellm/pull/3546
* [Feat] support `stream_options` param for OpenAI by ishaan-jaff in https://github.com/BerriAI/litellm/pull/3537
* Fix/error on get user role by powerhouseofthecell in https://github.com/BerriAI/litellm/pull/3551
* Globally filtering pydantic conflict warnings by CyanideByte in https://github.com/BerriAI/litellm/pull/3555
* [Feat] support `stream_options` on `litellm.text_completion` by ishaan-jaff in https://github.com/BerriAI/litellm/pull/3547
* feat(predibase.py): add support for predibase provider by krrishdholakia in https://github.com/BerriAI/litellm/pull/3552
* Expand access for other jwt algorithms by duckboy81 in https://github.com/BerriAI/litellm/pull/3378
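
A quick sketch of reading the version header added in PR #3535; the header name `x-litellm-version` and the unauthenticated health endpoint are assumptions based on the feature description.

```python
import requests

# Any proxy response should carry the version header (name assumed).
resp = requests.get("http://0.0.0.0:4000/health/liveliness")
print(resp.headers.get("x-litellm-version"))
```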

New Contributors
* powerhouseofthecell made their first contribution in https://github.com/BerriAI/litellm/pull/3551
* duckboy81 made their first contribution in https://github.com/BerriAI/litellm/pull/3378

**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.37.0.dev2_completion_cost...v1.37.2



Docker Run LiteLLM Proxy


```
docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.37.2
```



Don't want to maintain your internal proxy? Get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Failed ❌ | 24 | 28.60 | 1.52 | 1.52 | 455 | 455 | 22.67 | 184.81 |
| /health/liveliness | Failed ❌ | 23 | 27.67 | 15.57 | 15.57 | 4661 | 4661 | 21.45 | 1771.88 |
| /health/readiness | Failed ❌ | 23 | 28.36 | 15.65 | 15.65 | 4686 | 4686 | 21.43 | 1998.66 |
| Aggregated | Failed ❌ | 23 | 28.04 | 32.74 | 32.74 | 9802 | 9802 | 21.43 | 1998.66 |

v1.37.0

What's Changed
* Add support for async streaming to watsonx provider by simonsanvil in https://github.com/BerriAI/litellm/pull/3479
* feat(proxy_server.py): add CRUD endpoints for 'end_user' management by krrishdholakia in https://github.com/BerriAI/litellm/pull/3536
* Revert "Add support for async streaming to watsonx provider " by krrishdholakia in https://github.com/BerriAI/litellm/pull/3546
* [Feat] support `stream_options` param for OpenAI by ishaan-jaff in https://github.com/BerriAI/litellm/pull/3537


**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.36.4-stable...v1.37.0



Docker Run LiteLLM Proxy


```
docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.37.0
```



Don't want to maintain your internal proxy? Get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Failed ❌ | 24 | 30.92 | 1.60 | 1.60 | 480 | 480 | 22.73 | 1106.27 |
| /health/liveliness | Failed ❌ | 23 | 27.54 | 15.53 | 15.53 | 4650 | 4650 | 21.78 | 1163.80 |
| /health/readiness | Failed ❌ | 23 | 27.39 | 15.80 | 15.80 | 4730 | 4730 | 21.73 | 370.48 |
| Aggregated | Failed ❌ | 23 | 27.63 | 32.93 | 32.93 | 9860 | 9860 | 21.73 | 1163.80 |

v1.36.4-stable
**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.36.4...v1.36.4-stable



Docker Run LiteLLM Proxy


```
docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.36.4-stable
```



Don't want to maintain your internal proxy? Get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 240.44 | 240.44 | 0.60 | 0.0 | 1 | 0 | 240.44 | 240.44 |
| /health/liveliness | Passed ✅ | 160.0 | 148.33 | 5.43 | 0.0 | 9 | 0 | 61.81 | 230.12 |
| /health/readiness | Passed ✅ | 190.0 | 183.89 | 7.84 | 0.0 | 13 | 0 | 62.17 | 249.26 |
| Aggregated | Passed ✅ | 180.0 | 172.44 | 13.88 | 0.0 | 23 | 0 | 61.81 | 249.26 |

