LiteLLM

Latest version: v1.52.14

Page 49 of 93

1.35.33.dev4

What's Changed
* [UI] Polish viewing Model Latencies by ishaan-jaff in https://github.com/BerriAI/litellm/pull/3380


**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.35.33.dev3...1.35.33.dev4

Don't want to maintain your internal proxy? get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat



Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 78 | 86.64201385977066 | 1.4529460265759184 | 0.0 | 435 | 0 | 71.46099299995967 | 1225.4950830000553 |
| /health/liveliness | Passed ✅ | 62 | 65.89392386120038 | 15.64171549989661 | 0.003340105808220502 | 4683 | 1 | 59.578546000011556 | 1532.8349000000117 |
| /health/readiness | Passed ✅ | 62 | 65.34362750907113 | 15.464689892060925 | 0.0 | 4630 | 0 | 59.670154999992064 | 1263.970363999988 |
| Aggregated | Passed ✅ | 62 | 66.5584239677881 | 32.55935141853345 | 0.003340105808220502 | 9748 | 1 | 59.578546000011556 | 1532.8349000000117 |

1.35.33.dev1

What's Changed
* [UI] show exceptions by model deployments + model latencies - v0 by ishaan-jaff in https://github.com/BerriAI/litellm/pull/3373
* [UI] Polish viewing Model Latencies by ishaan-jaff in https://github.com/BerriAI/litellm/pull/3380
* changing ollama response parsing to expected behaviour by TheDiscoMole in https://github.com/BerriAI/litellm/pull/1526
* Added cost & context metadata for openrouter/anthropic/claude-3-opus by paul-gauthier in https://github.com/BerriAI/litellm/pull/3382
* fix - error sending details to log on sentry by ishaan-jaff in https://github.com/BerriAI/litellm/pull/3384


**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.35.33...v1.35.33.dev1

Don't want to maintain your internal proxy? get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 77 | 81.75372134047257 | 1.5592233014701338 | 0.0 | 467 | 0 | 70.36502400001154 | 514.3882039999994 |
| /health/liveliness | Passed ✅ | 61 | 63.30112607910893 | 15.278385069651675 | 0.0 | 4576 | 0 | 58.941096000012294 | 1077.0577769999932 |
| /health/readiness | Passed ✅ | 61 | 64.04839979952813 | 15.555506127514676 | 0.0 | 4659 | 0 | 59.04779399998006 | 1356.8010579999736 |
| Aggregated | Passed ✅ | 61 | 64.54817928983753 | 32.393114498636486 | 0.0 | 9702 | 0 | 58.941096000012294 | 1356.8010579999736 |

1.35.32

What's Changed
* feat(utils.py): unify common auth params across azure/vertex_ai/bedrock/watsonx by krrishdholakia in https://github.com/BerriAI/litellm/pull/3331
* protected_namespaces warning fixed for model_name & model_info by CyanideByte in https://github.com/BerriAI/litellm/pull/3334
* fix(utils.py): replicate now also has token based pricing for some models by krrishdholakia in https://github.com/BerriAI/litellm/pull/3354
* docs - update track cost with custom callbacks by ishaan-jaff in https://github.com/BerriAI/litellm/pull/3359
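Per PR #3354, some Replicate models are now priced per token rather than per second. As a rough sketch of what token-based pricing means (the function and parameter names below are illustrative, not LiteLLM's internal API):

```python
def token_based_cost(prompt_tokens: int, completion_tokens: int,
                     input_cost_per_token: float,
                     output_cost_per_token: float) -> float:
    """Compute request cost from token counts and per-token prices.

    Illustrative sketch only; names are assumptions, not LiteLLM's
    actual cost-tracking internals.
    """
    return (prompt_tokens * input_cost_per_token
            + completion_tokens * output_cost_per_token)

# e.g. 1,000 prompt tokens at $0.50/1M and 200 completion tokens at $1.50/1M
cost = token_based_cost(1000, 200, 0.50e-6, 1.50e-6)  # 0.0008 USD
```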

New Contributors
* CyanideByte made their first contribution in https://github.com/BerriAI/litellm/pull/3334

**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.35.31...v1.35.32

Don't want to maintain your internal proxy? get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 92 | 97.6487390778697 | 1.6300101877251756 | 0.0 | 488 | 0 | 85.23569400000497 | 1043.820304999997 |
| /health/liveliness | Passed ✅ | 77 | 81.78410904708275 | 15.394911793494536 | 0.010020554432736735 | 4609 | 3 | 73.86755400000311 | 1392.7269499999966 |
| /health/readiness | Passed ✅ | 78 | 79.75181014769441 | 14.994089616185068 | 0.003340184810912245 | 4489 | 1 | 74.02469199999473 | 1287.0916979999834 |
| Aggregated | Passed ✅ | 78 | 81.64003953901504 | 32.01901159740478 | 0.01336073924364898 | 9586 | 4 | 73.86755400000311 | 1392.7269499999966 |

1.35.32.dev1

What's Changed
* [Fix] Lowest Latency routing - random pick deployments when all latencies=0 by ishaan-jaff in https://github.com/BerriAI/litellm/pull/3360
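The fix in PR #3360 addresses the cold-start case in lowest-latency routing: before any latency data has been collected, every deployment reports 0, and a plain minimum would always pin traffic to the first deployment. A minimal sketch of the intended behaviour (function name and data shapes are assumptions, not LiteLLM's router code):

```python
import random

def pick_deployment(deployments: list, latencies: dict):
    """Pick the deployment with the lowest recorded latency.

    When no latency data exists yet (all values are 0), fall back to a
    random choice so traffic is spread instead of pinned to whichever
    deployment sorts first. Sketch of the fix in PR #3360.
    """
    if all(latencies.get(d, 0) == 0 for d in deployments):
        return random.choice(deployments)
    return min(deployments, key=lambda d: latencies.get(d, float("inf")))
```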


**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.35.32...v1.35.32.dev1

Don't want to maintain your internal proxy? get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 56 | 60.03167063674528 | 1.6000338631233044 | 0.0 | 479 | 0 | 49.292497999999796 | 574.5523299999888 |
| /health/liveliness | Passed ✅ | 40 | 42.727974751455534 | 15.495943822607535 | 0.0 | 4639 | 0 | 37.59355699997968 | 1116.836471000056 |
| /health/readiness | Passed ✅ | 40 | 42.65787459895715 | 15.37569075565046 | 0.0 | 4603 | 0 | 37.80075800000304 | 1220.3729720000354 |
| Aggregated | Passed ✅ | 41 | 43.54741712642739 | 32.4716684413813 | 0.0 | 9721 | 0 | 37.59355699997968 | 1220.3729720000354 |

1.35.31

What's Changed
* Fix Anthropic Messages Prompt Template function to add a third condition: list of text-content dictionaries by eercanayar in https://github.com/BerriAI/litellm/pull/2780
* add safety_settings parameters to Vertex AI async mode by hellof20 in https://github.com/BerriAI/litellm/pull/3312
* (feat) add IBM watsonx.ai as an llm provider by simonsanvil in https://github.com/BerriAI/litellm/pull/3270
* fix: incorrect analysis for vertex ai env vars causing confusion by suptejas in https://github.com/BerriAI/litellm/pull/3324
* Revert "Fix Anthropic Messages Prompt Template function to add a third condition: list of text-content dictionaries" by krrishdholakia in https://github.com/BerriAI/litellm/pull/3328
* Add watsonx to list of model providers and fixed typo in colab notebook by simonsanvil in https://github.com/BerriAI/litellm/pull/3326
* fix(router.py): fix default retry logic by krrishdholakia in https://github.com/BerriAI/litellm/pull/3302
* [Feat] Redact Logging Messages/Response content on Logging Providers with `litellm.turn_off_message_logging=True` by ishaan-jaff in https://github.com/BerriAI/litellm/pull/3330
* Fix - Admin UI stops working if proxy budget has been exceeded by ishaan-jaff in https://github.com/BerriAI/litellm/pull/3335
* docs - alerting by ishaan-jaff in https://github.com/BerriAI/litellm/pull/3337
* UI - fix bug showing models to pick by ishaan-jaff in https://github.com/BerriAI/litellm/pull/3339
* [Fix] - Link Langfuse Traces to Slack Alerts by ishaan-jaff in https://github.com/BerriAI/litellm/pull/3336
* Fix - slack alerting show deployment latencies in sorted order by ishaan-jaff in https://github.com/BerriAI/litellm/pull/3338
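Among these changes, the `litellm.turn_off_message_logging=True` flag (PR #3330) redacts message and response content before it reaches logging providers such as Langfuse, while keeping metadata intact. A minimal sketch of that behaviour, assuming a hypothetical helper and log-payload shape (neither is LiteLLM's internal API):

```python
def redact_message_logging(log_entry: dict) -> dict:
    """Strip message/response content from a log payload while keeping
    metadata (model name, token counts) intact, roughly what
    `litellm.turn_off_message_logging=True` enables for logging
    providers. Helper name and dict layout are assumptions.
    """
    redacted = dict(log_entry)
    if "messages" in redacted:
        # Replace each message's content, preserving role and other keys.
        redacted["messages"] = [
            {**m, "content": "redacted"} for m in redacted["messages"]
        ]
    if "response" in redacted:
        redacted["response"] = "redacted"
    return redacted
```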

New Contributors
* eercanayar made their first contribution in https://github.com/BerriAI/litellm/pull/2780
* hellof20 made their first contribution in https://github.com/BerriAI/litellm/pull/3312
* simonsanvil made their first contribution in https://github.com/BerriAI/litellm/pull/3270
* suptejas made their first contribution in https://github.com/BerriAI/litellm/pull/3324

**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.35.30...v1.35.31

Don't want to maintain your internal proxy? get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 40 | 52.40982617634495 | 1.6101070107145294 | 0.0 | 482 | 0 | 34.925257999987025 | 1028.7042879999717 |
| /health/liveliness | Passed ✅ | 25 | 27.778927652744667 | 15.75700159655692 | 0.003340470976586161 | 4717 | 1 | 23.23458099999698 | 1168.9134119999949 |
| /health/readiness | Passed ✅ | 25 | 27.787623363912648 | 15.402911673038787 | 0.0 | 4611 | 0 | 23.29059499993491 | 957.4790820000203 |
| Aggregated | Passed ✅ | 25 | 28.993218071967036 | 32.77002028031024 | 0.003340470976586161 | 9810 | 1 | 23.23458099999698 | 1168.9134119999949 |

1.35.30

What's Changed
* docs improvement - deploying litellm-database by ishaan-jaff in https://github.com/BerriAI/litellm/pull/3317


**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.35.29...v1.35.30

Don't want to maintain your internal proxy? get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 77 | 90.17568118244809 | 1.4462740106743606 | 0.0 | 433 | 0 | 71.0471599999778 | 1333.5664809999912 |
| /health/liveliness | Passed ✅ | 61 | 65.05894558607277 | 15.444736779118344 | 0.0033401247359685 | 4624 | 1 | 58.92467099999976 | 1412.1900170000004 |
| /health/readiness | Passed ✅ | 61 | 64.18023432293225 | 15.307791664943636 | 0.0 | 4583 | 0 | 59.158654000015076 | 1329.4413670000154 |
| Aggregated | Passed ✅ | 61 | 65.76936185103749 | 32.19880245473634 | 0.0033401247359685 | 9640 | 1 | 58.92467099999976 | 1412.1900170000004 |
