LiteLLM


1.35.16

What's Changed
* (feat) Langfuse - Add location logging, and add cache_hit to metadata. by Manouchehri in https://github.com/BerriAI/litellm/pull/2961
* [FEAT] Add `groq/llama3` by ishaan-jaff in https://github.com/BerriAI/litellm/pull/3176
* [UI] Show teams as dropdown in invite user flow by ishaan-jaff in https://github.com/BerriAI/litellm/pull/3180
* [FEAT] Log team alias to langfuse by ishaan-jaff in https://github.com/BerriAI/litellm/pull/3175
* fix: Stream completion responses from anthropic. (Fix 3129) by jmandel in https://github.com/BerriAI/litellm/pull/3174
* [Fix] - Langfuse log proxy_base_url to langfuse as a tag (if set by user) by ishaan-jaff in https://github.com/BerriAI/litellm/pull/3183
* fix(caching.py): dual cache async_batch_get_cache fix + testing by krrishdholakia in https://github.com/BerriAI/litellm/pull/3179
* fix(caching.py): fix redis url parsing logic to work with ssl urls by krrishdholakia in https://github.com/BerriAI/litellm/pull/3173
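The Redis URL fix above concerns TLS (`rediss://`) connection strings. As an illustrative sketch of the idea, not LiteLLM's actual implementation, parsing such a URL with the standard library might look like:

```python
from urllib.parse import urlparse

def parse_redis_url(url: str) -> dict:
    # Hypothetical helper: the "rediss" scheme signals TLS, "redis" is plaintext.
    parsed = urlparse(url)
    return {
        "host": parsed.hostname,
        "port": parsed.port or 6379,
        "password": parsed.password,
        "ssl": parsed.scheme == "rediss",
    }

config = parse_redis_url("rediss://:secret@cache.example.com:6380")
```

Treating the scheme as the SSL flag is what lets one URL carry both the endpoint and the transport-security choice.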

![codeimage-snippet_20 (1)](https://github.com/BerriAI/litellm/assets/29436595/74de84a9-f331-4b52-869f-12d75df250c4)


New Contributors
* jmandel made their first contribution in https://github.com/BerriAI/litellm/pull/3174

**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.35.15...v1.35.16

Don't want to maintain your internal proxy? get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 77 | 85.19 | 1.53 | 0.0 | 459 | 0 | 70.45 | 1436.74 |
| /health/liveliness | Passed ✅ | 61 | 65.13 | 15.04 | 0.0 | 4504 | 0 | 58.82 | 1409.04 |
| /health/readiness | Passed ✅ | 61 | 64.03 | 15.60 | 0.0033 | 4670 | 1 | 59.11 | 1196.94 |
| Aggregated | Passed ✅ | 61 | 65.55 | 32.17 | 0.0033 | 9633 | 1 | 58.82 | 1436.74 |

v1.35.15-stable
**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.35.20...v1.35.15-stable

Don't want to maintain your internal proxy? get in touch πŸŽ‰
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat




Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 43 | 49.55 | 1.58 | 0.0 | 474 | 0 | 35.98 | 1311.98 |
| /health/liveliness | Passed ✅ | 26 | 28.40 | 15.69 | 0.0 | 4697 | 0 | 23.33 | 1245.46 |
| /health/readiness | Passed ✅ | 26 | 29.23 | 15.53 | 0.0 | 4648 | 0 | 23.28 | 1275.50 |
| Aggregated | Passed ✅ | 26 | 29.81 | 32.80 | 0.0 | 9819 | 0 | 23.28 | 1311.98 |

1.35.15

What's Changed
* usage based routing v2 improvements - unit testing + *NEW* async + sync 'pre_call_checks' by krrishdholakia in https://github.com/BerriAI/litellm/pull/3153
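The 'pre_call_checks' mentioned above filter out over-limit deployments before a request is routed. A minimal sketch of the concept, with hypothetical field names rather than the router's real data model:

```python
def pre_call_check(deployments: list, current_tpm: dict) -> list:
    # Keep only deployments whose tokens-per-minute usage is under their limit.
    return [
        d for d in deployments
        if current_tpm.get(d["id"], 0) < d["tpm_limit"]
    ]

deployments = [
    {"id": "azure-gpt4", "tpm_limit": 10_000},
    {"id": "openai-gpt4", "tpm_limit": 5_000},
]
usage = {"azure-gpt4": 12_000, "openai-gpt4": 1_000}
eligible = pre_call_check(deployments, usage)
```

Running the check once per request (sync or async) means an over-budget deployment is skipped instead of failing mid-call.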


**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.35.14...v1.35.15

Don't want to maintain your internal proxy? get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 61 | 64.27 | 1.41 | 0.0 | 423 | 0 | 55.83 | 518.05 |
| /health/liveliness | Passed ✅ | 45 | 47.48 | 15.55 | 0.0 | 4655 | 0 | 42.33 | 1026.15 |
| /health/readiness | Passed ✅ | 45 | 47.48 | 15.54 | 0.0 | 4654 | 0 | 42.33 | 1229.56 |
| Aggregated | Passed ✅ | 45 | 48.21 | 32.50 | 0.0 | 9732 | 0 | 42.33 | 1229.56 |

1.35.14

**Full Changelog**: https://github.com/BerriAI/litellm/compare/1.35.13.dev1...v1.35.14

**Don't want to maintain your internal proxy? get in touch πŸŽ‰**

Hosted Internal Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 61 | 69.02 | 1.64 | 0.0 | 491 | 0 | 54.21 | 1289.33 |
| /health/liveliness | Passed ✅ | 45 | 48.20 | 15.75 | 0.0 | 4715 | 0 | 42.48 | 1189.91 |
| /health/readiness | Passed ✅ | 45 | 48.80 | 15.30 | 0.0 | 4582 | 0 | 42.55 | 1153.19 |
| Aggregated | Passed ✅ | 45 | 49.53 | 32.69 | 0.0 | 9788 | 0 | 42.48 | 1289.33 |

1.35.13

What's Changed
* fix(_types.py): hash api key in UserAPIKeyAuth by krrishdholakia in https://github.com/BerriAI/litellm/pull/3105
* Fix missing comma in Gemini document. by kittinan in https://github.com/BerriAI/litellm/pull/3123
* [FIX] - show vertex_project, vertex_location in Vertex AI exceptions by ishaan-jaff in https://github.com/BerriAI/litellm/pull/3130
* [Fix] Slack Alerting - trim messages to first 100 chars by ishaan-jaff in https://github.com/BerriAI/litellm/pull/3140
* [Fix] Show `model` passed on `"400: {'error': 'Invalid model name passed in mode` errors πŸ‘» by ishaan-jaff in https://github.com/BerriAI/litellm/pull/3142
* (feat) - Add seed to Cohere Chat. by Manouchehri in https://github.com/BerriAI/litellm/pull/3136
* [Feat] Allow user to select slack alert types to Opt In to by ishaan-jaff in https://github.com/BerriAI/litellm/pull/3112
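The first fix in the list above hashes the API key before it is stored on the auth object. A generic sketch of that pattern with the standard library (not LiteLLM's exact code):

```python
import hashlib

def hash_api_key(api_key: str) -> str:
    # Log and persist only the SHA-256 digest, never the raw key.
    return hashlib.sha256(api_key.encode("utf-8")).hexdigest()

digest = hash_api_key("sk-test-123")
```

The digest can still be used as a stable lookup identifier, while a leaked log line no longer exposes a usable credential.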

Don't want to maintain your internal proxy? get in touch πŸŽ‰

Hosted Internal Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat


New Contributors
* kittinan made their first contribution in https://github.com/BerriAI/litellm/pull/3123

**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.35.12...v1.35.13

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 93 | 98.69 | 1.57 | 0.0 | 470 | 0 | 86.12 | 777.28 |
| /health/liveliness | Passed ✅ | 78 | 80.07 | 15.30 | 0.0067 | 4581 | 2 | 73.95 | 1522.72 |
| /health/readiness | Passed ✅ | 78 | 80.78 | 15.14 | 0.0 | 4533 | 0 | 73.96 | 1365.28 |
| Aggregated | Passed ✅ | 78 | 81.32 | 32.01 | 0.0067 | 9584 | 2 | 73.95 | 1522.72 |

1.35.13.dev1

What's Changed
* [UI]- show user_emails in Users Tab by ishaan-jaff in https://github.com/BerriAI/litellm/pull/3146
* feat(prometheus_services.py): emit proxy latency for successful llm api requests by krrishdholakia in https://github.com/BerriAI/litellm/pull/3144
* [UI] - Models Page - Place litellm params in an accordion by ishaan-jaff in https://github.com/BerriAI/litellm/pull/3149
* ui - show key_aliases on `Users` Tab by ishaan-jaff in https://github.com/BerriAI/litellm/pull/3148
* [UI] View all alert types by ishaan-jaff in https://github.com/BerriAI/litellm/pull/3151
* [Fix] show api base hanging request alerts by ishaan-jaff in https://github.com/BerriAI/litellm/pull/3152
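The Prometheus change above emits proxy latency for successful LLM API requests. A stdlib-only sketch of timing a call, with a hypothetical `observe` sink standing in for a real Prometheus histogram:

```python
import time

latency_observations = []

def observe(seconds: float) -> None:
    # Stand-in for Histogram.observe() in a real Prometheus client.
    latency_observations.append(seconds)

def timed_call(fn, *args, **kwargs):
    # Record wall-clock latency whether the call succeeds or raises.
    start = time.perf_counter()
    try:
        return fn(*args, **kwargs)
    finally:
        observe(time.perf_counter() - start)

result = timed_call(sum, [1, 2, 3])
```

Wrapping the call in `try`/`finally` keeps the metric honest: failed requests are timed too, rather than silently dropped.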

Don't want to maintain your internal proxy? get in touch πŸŽ‰

Hosted Internal Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.35.13...1.35.13.dev1

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 93 | 99.64 | 1.52 | 0.0 | 455 | 0 | 86.29 | 664.74 |
| /health/liveliness | Passed ✅ | 78 | 80.75 | 15.29 | 0.0 | 4577 | 0 | 73.76 | 1310.67 |
| /health/readiness | Passed ✅ | 78 | 80.61 | 15.39 | 0.0 | 4606 | 0 | 73.75 | 1012.95 |
| Aggregated | Passed ✅ | 78 | 81.57 | 32.20 | 0.0 | 9638 | 0 | 73.75 | 1310.67 |

1.35.12

What's Changed
* fix(vertex_ai.py): fix faulty async call tool calling check by krrishdholakia in https://github.com/BerriAI/litellm/pull/3102
* Support for Claude 3 Opus on vertex_ai by Dev-Khant in https://github.com/BerriAI/litellm/pull/3026
* [FIX] Repeat Slack Alerts triggered for "User Crossed Budget" by ishaan-jaff in https://github.com/BerriAI/litellm/pull/3114

Don't want to maintain your internal proxy? get in touch πŸŽ‰

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.35.11...v1.35.12

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 81 | 90.18 | 1.58 | 0.0 | 472 | 0 | 75.01 | 1097.21 |
| /health/liveliness | Passed ✅ | 66 | 68.86 | 15.46 | 0.0 | 4628 | 0 | 63.27 | 1008.06 |
| /health/readiness | Passed ✅ | 66 | 69.48 | 15.01 | 0.0033 | 4493 | 1 | 63.54 | 1354.88 |
| Aggregated | Passed ✅ | 66 | 70.20 | 32.04 | 0.0033 | 9593 | 1 | 63.27 | 1354.88 |
