LiteLLM

Latest version: v1.52.14


1.40.14.dev1

What's Changed
* [Fix] Redacting messages from OTEL + Refactor `utils.py` to use `litellm_core_utils` by ishaan-jaff in https://github.com/BerriAI/litellm/pull/4176


**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.40.10...v1.40.14.dev1

1.40.13

What's Changed
* fix(parallel_request_limiter.py): rate limit keys across instances by krrishdholakia in https://github.com/BerriAI/litellm/pull/4150
* Langfuse integration support for `parent_observation_id` parameter by hburrichter in https://github.com/BerriAI/litellm/pull/3559
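The rate-limiting fix above concerns enforcing a key's limits when several proxy instances share traffic: per-process counters undercount, so the counts must live in a shared store. A minimal sketch of that idea, using a plain dict as a stand-in for the shared store (Redis in a real deployment); the function and variable names here are illustrative, not LiteLLM's actual internals:

```python
import time
from collections import defaultdict

# Stand-in for a shared store such as Redis. In-process dicts do NOT
# work across instances -- that is exactly the bug class fixed here.
SHARED_COUNTS = defaultdict(int)

def check_rate_limit(api_key, rpm_limit, window_s=60, now=None):
    """Increment the per-key counter for the current time window and
    return True if the request is within the limit."""
    now = time.time() if now is None else now
    window = int(now // window_s)  # every instance agrees on the window
    bucket = (api_key, window)
    SHARED_COUNTS[bucket] += 1
    return SHARED_COUNTS[bucket] <= rpm_limit
```

Because the window index is derived from wall-clock time, any instance that can reach the shared store enforces the same limit.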

New Contributors
* hburrichter made their first contribution in https://github.com/BerriAI/litellm/pull/3559

**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.40.12...v1.40.13



Docker Run LiteLLM Proxy


```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.40.13
```
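With the container above listening on port 4000, the proxy accepts OpenAI-compatible requests. The model name and API key below are placeholders for whatever is configured on your instance:

```shell
curl http://localhost:4000/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer sk-1234" \
  -d '{
    "model": "gpt-3.5-turbo",
    "messages": [{"role": "user", "content": "Hello"}]
  }'
```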



Don't want to maintain your internal proxy? Get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 74 | 85.12421177852299 | 6.470441187117138 | 0.0 | 1937 | 0 | 63.80303100002038 | 1377.5951729999178 |
| Aggregated | Passed ✅ | 74 | 85.12421177852299 | 6.470441187117138 | 0.0 | 1937 | 0 | 63.80303100002038 | 1377.5951729999178 |

1.40.13.dev1

What's Changed
* ui - fix team based usage crashing by ishaan-jaff in https://github.com/BerriAI/litellm/pull/4198
* [Fix + Refactor] - Router Alerting for llm exceptions + use separate util for sending alert by ishaan-jaff in https://github.com/BerriAI/litellm/pull/4197
* [Bug fix] Don't cache team, user, customer budget after calling /update, /delete by ishaan-jaff in https://github.com/BerriAI/litellm/pull/4203
* feat(router.py): support content policy fallbacks by krrishdholakia in https://github.com/BerriAI/litellm/pull/4207
* fix(slack_alerting.py): allow new 'alerting_metadata' arg by krrishdholakia in https://github.com/BerriAI/litellm/pull/4205
* build(pyproject.toml): require pydantic v2 by krrishdholakia in https://github.com/BerriAI/litellm/pull/4151


**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.40.13...v1.40.13.dev1



Docker Run LiteLLM Proxy


```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.40.13.dev1
```



Don't want to maintain your internal proxy? Get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 130.0 | 150.82119762323373 | 6.385731572640492 | 0.0 | 1911 | 0 | 110.03221099997518 | 539.267988000006 |
| Aggregated | Passed ✅ | 130.0 | 150.82119762323373 | 6.385731572640492 | 0.0 | 1911 | 0 | 110.03221099997518 | 539.267988000006 |

1.40.12

What's Changed
* add llama 3 family from deepinfra by themrzmaster in https://github.com/BerriAI/litellm/pull/4191
* feat(proxy/utils.py): allow budget duration in months (`1mo`) by krrishdholakia in https://github.com/BerriAI/litellm/pull/4188
* fix(utils.py): check if model info is for model with correct provider by krrishdholakia in https://github.com/BerriAI/litellm/pull/4186
* Retry on connection disconnect by krrishdholakia in https://github.com/BerriAI/litellm/pull/4178
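Supporting a `1mo` budget duration means computing the next reset on a calendar-month boundary rather than a fixed number of seconds, since months vary in length. A stdlib sketch of that calculation (a hypothetical helper, not the proxy's actual code):

```python
import calendar
from datetime import date

def add_months(start, months):
    """Advance `start` by whole calendar months, clamping the day so
    e.g. Jan 31 + 1 month lands on Feb 28/29 instead of overflowing."""
    total = start.month - 1 + months
    year = start.year + total // 12
    month = total % 12 + 1
    day = min(start.day, calendar.monthrange(year, month)[1])
    return date(year, month, day)
```

The day-clamping step is the part a naive `timedelta(days=30)` approach gets wrong: budgets created on the 31st would otherwise drift earlier each month.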


**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.40.11...v1.40.12



Docker Run LiteLLM Proxy


```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.40.12
```



Don't want to maintain your internal proxy? Get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 94 | 113.78855698077908 | 6.432303146239259 | 0.0 | 1925 | 0 | 80.02467099998967 | 1025.8250419999513 |
| Aggregated | Passed ✅ | 94 | 113.78855698077908 | 6.432303146239259 | 0.0 | 1925 | 0 | 80.02467099998967 | 1025.8250419999513 |

1.40.11

What's Changed
* [Fix] Redacting messages from OTEL + Refactor `utils.py` to use `litellm_core_utils` by ishaan-jaff in https://github.com/BerriAI/litellm/pull/4176
* Stop throwing constant S3 spam on cache misses by Manouchehri in https://github.com/BerriAI/litellm/pull/4177
* [Feat] - Prometheus add remaining_team_budget gauge by ishaan-jaff in https://github.com/BerriAI/litellm/pull/4179
* fix - clean up swagger spend endpoints 🧹🧹🧹🧹 by ishaan-jaff in https://github.com/BerriAI/litellm/pull/4180
* [Fix] Fix bug when updating team budgets on UI + display budget =0.0 correctly on UI by ishaan-jaff in https://github.com/BerriAI/litellm/pull/4183
* Reset budget option for teams by krrishdholakia in https://github.com/BerriAI/litellm/pull/4185
* feat(__init__.py): allow setting drop_params as an env by krrishdholakia in https://github.com/BerriAI/litellm/pull/4187
* [Doc] Setting Team budgets by ishaan-jaff in https://github.com/BerriAI/litellm/pull/4189
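`drop_params`, when enabled, silently drops OpenAI parameters that a given provider does not support instead of raising an error; this release lets it be toggled via an environment variable. A sketch of what env-driven behavior like that can look like — the variable name and both functions are illustrative assumptions, not LiteLLM's actual implementation:

```python
import os

def env_flag(name, default=False):
    """Interpret common truthy strings from the environment."""
    val = os.getenv(name)
    if val is None:
        return default
    return val.strip().lower() in ("1", "true", "yes", "on")

def filter_params(params, supported, drop_params):
    """Drop unsupported keys when drop_params is on; otherwise raise
    so the caller learns the provider cannot honor the request."""
    unsupported = set(params) - supported
    if unsupported and not drop_params:
        raise ValueError(f"unsupported params: {sorted(unsupported)}")
    return {k: v for k, v in params.items() if k in supported}
```

Reading the flag from the environment lets deployments flip the behavior without code changes, which is the usual motivation for this kind of setting on a proxy.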


**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.40.10...v1.40.11



Docker Run LiteLLM Proxy


```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.40.11
```



Don't want to maintain your internal proxy? Get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 120.0 | 140.50671503682315 | 6.351765918831661 | 0.0 | 1901 | 0 | 96.28972799998792 | 1490.2560670000184 |
| Aggregated | Passed ✅ | 120.0 | 140.50671503682315 | 6.351765918831661 | 0.0 | 1901 | 0 | 96.28972799998792 | 1490.2560670000184 |

1.40.10

What's Changed
* [Feat] add VertexAI `vertex_ai/text-embedding-004`, `vertex_ai/text-multilingual-embedding-002` by ishaan-jaff in https://github.com/BerriAI/litellm/pull/4145
* Add IAM cred caching for OIDC flow by Manouchehri in https://github.com/BerriAI/litellm/pull/3712
* feat(util.py/azure.py): Add OIDC support when running LiteLLM on Azure + Azure Upstream caching by Manouchehri in https://github.com/BerriAI/litellm/pull/3861
* [Feat] Support `task_type`, `auto_truncate` params by ishaan-jaff in https://github.com/BerriAI/litellm/pull/4152
* [Feat] support `dimensions` for vertex embeddings by ishaan-jaff in https://github.com/BerriAI/litellm/pull/4149
* docs - run proxy on custom root path by ishaan-jaff in https://github.com/BerriAI/litellm/pull/4154
* [Fix] `user` was inserted in Proxy Server embedding requests + added param mapping for mistral by ishaan-jaff in https://github.com/BerriAI/litellm/pull/4156
* [Fix] Add ClarifAI support for LiteLLM Proxy by ishaan-jaff in https://github.com/BerriAI/litellm/pull/4158
* [Admin UI] Fix error Internal Users see when using SSO by ishaan-jaff in https://github.com/BerriAI/litellm/pull/4164
* [Fix] - Error selecting model provider from UI by ishaan-jaff in https://github.com/BerriAI/litellm/pull/4166
* [UI] add Azure AI studio models on UI by ishaan-jaff in https://github.com/BerriAI/litellm/pull/4167
* feat(vertex_httpx.py): Support Vertex AI system messages, JSON Schema, etc. by krrishdholakia in https://github.com/BerriAI/litellm/pull/4160
* Fix errors in the Vertex AI documentation by yamitzky in https://github.com/BerriAI/litellm/pull/4171
* feat(prometheus): add api_team_alias to exported labels by bcvanmeurs in https://github.com/BerriAI/litellm/pull/4169
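The `dimensions` parameter added for Vertex embeddings requests a shorter output vector from the model. One common way providers implement this is to truncate the full embedding and renormalize it to unit length; the sketch below illustrates that idea only, and is an assumption for illustration, not necessarily how Vertex AI computes it:

```python
import math

def shorten_embedding(vector, dimensions):
    """Truncate an embedding to `dimensions` entries and renormalize
    to unit length so cosine similarities remain comparable."""
    head = vector[:dimensions]
    norm = math.sqrt(sum(x * x for x in head))
    return [x / norm for x in head] if norm else head
```

Renormalizing matters because downstream code typically assumes unit-length vectors when computing cosine similarity via a plain dot product.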

New Contributors
* yamitzky made their first contribution in https://github.com/BerriAI/litellm/pull/4171

**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.40.9...v1.40.10



Docker Run LiteLLM Proxy


```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.40.10
```



Don't want to maintain your internal proxy? Get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 140.0 | 172.37660025809805 | 6.297822628765798 | 0.0 | 1883 | 0 | 114.60945100003528 | 3651.5153230000124 |
| Aggregated | Passed ✅ | 140.0 | 172.37660025809805 | 6.297822628765798 | 0.0 | 1883 | 0 | 114.60945100003528 | 3651.5153230000124 |

v1.40.9-stable
**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.40.9...v1.40.9-stable



Docker Run LiteLLM Proxy


```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.40.9-stable
```



Don't want to maintain your internal proxy? Get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 120.0 | 145.47575085580996 | 6.442239890874918 | 0.0 | 1928 | 0 | 104.64309999997568 | 1708.8100789999885 |
| Aggregated | Passed ✅ | 120.0 | 145.47575085580996 | 6.442239890874918 | 0.0 | 1928 | 0 | 104.64309999997568 | 1708.8100789999885 |
