LiteLLM

Latest version: v1.52.14


1.40.5

What's Changed
* Table format fix and Typo by SujanShilakar in https://github.com/BerriAI/litellm/pull/4037
* feat: add langfuse metadata via proxy request headers by ndrsfel in https://github.com/BerriAI/litellm/pull/3990
* Add Ollama as a provider in proxy ui by sha-ahammed in https://github.com/BerriAI/litellm/pull/4020
* modified docs proxy->logging->langfuse by syGOAT in https://github.com/BerriAI/litellm/pull/4035
* fix tool usage null content using vertexai by themrzmaster in https://github.com/BerriAI/litellm/pull/4039
* Fixed openai token counter bug by Raymond1415926 in https://github.com/BerriAI/litellm/pull/4036
* feat(router.py): enable setting 'order' for a deployment in model list by krrishdholakia in https://github.com/BerriAI/litellm/pull/4046 (see the config sketch after this list)
* docs: add llmcord.py to projects by jakobdylanc in https://github.com/BerriAI/litellm/pull/4060
* Fix log message in Custom Callbacks doc by iwamot in https://github.com/BerriAI/litellm/pull/4061
* refactor: replace 'traceback.print_exc()' with logging library by krrishdholakia in https://github.com/BerriAI/litellm/pull/4049
* feat(aws_secret_manager.py): Support AWS KMS for Master Key encryption by krrishdholakia in https://github.com/BerriAI/litellm/pull/4054
* [Feat] Enterprise - Enforce Params in request to LiteLLM Proxy by ishaan-jaff in https://github.com/BerriAI/litellm/pull/4043 (see the config sketch after this list)
* feat - OTEL set custom service names and custom tracer names by ishaan-jaff in https://github.com/BerriAI/litellm/pull/4048
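
As a hedged illustration of the new deployment `order` setting and the enterprise `enforced_params` option above, here is a minimal proxy config sketch. The YAML keys are assumptions based on the LiteLLM proxy docs, not taken from these release notes.

```shell
# Hedged sketch: 'order' picks which deployment the router tries first;
# 'enforced_params' (enterprise) rejects requests that omit the listed fields.
cat > litellm_config.yaml <<'EOF'
model_list:
  - model_name: gpt-3.5-turbo
    litellm_params:
      model: openai/gpt-3.5-turbo
      api_key: os.environ/OPENAI_API_KEY
      order: 1                 # tried first
  - model_name: gpt-3.5-turbo
    litellm_params:
      model: openai/gpt-3.5-turbo-0125
      api_key: os.environ/OPENAI_API_KEY
      order: 2                 # fallback deployment
general_settings:
  enforced_params:
    - user                     # reject requests that do not send a 'user' field
EOF

docker run \
  -e OPENAI_API_KEY \
  -v "$(pwd)/litellm_config.yaml:/app/config.yaml" \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.40.5 --config /app/config.yaml
```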

New Contributors
* ndrsfel made their first contribution in https://github.com/BerriAI/litellm/pull/3990
* sha-ahammed made their first contribution in https://github.com/BerriAI/litellm/pull/4020
* syGOAT made their first contribution in https://github.com/BerriAI/litellm/pull/4035
* Raymond1415926 made their first contribution in https://github.com/BerriAI/litellm/pull/4036
* jakobdylanc made their first contribution in https://github.com/BerriAI/litellm/pull/4060
* iwamot made their first contribution in https://github.com/BerriAI/litellm/pull/4061

**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.40.4...v1.40.5



Docker Run LiteLLM Proxy


```shell
# STORE_MODEL_IN_DB=True lets the proxy persist models added via the UI/API in its database
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.40.5
```
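
Once the container is up, the proxy exposes an OpenAI-compatible API. A quick smoke test against the same `/chat/completions` route the load tests below exercise (the model name is a placeholder for whatever you configured):

```shell
# Add -H "Authorization: Bearer $LITELLM_MASTER_KEY" if a master key is set
curl http://localhost:4000/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "gpt-3.5-turbo", "messages": [{"role": "user", "content": "ping"}]}'
```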



Don't want to maintain your internal proxy? Get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 98 | 123.75303621190369 | 6.512790176735744 | 0.0 | 1949 | 0 | 80.83186400000386 | 1991.117886999973 |
| Aggregated | Passed ✅ | 98 | 123.75303621190369 | 6.512790176735744 | 0.0 | 1949 | 0 | 80.83186400000386 | 1991.117886999973 |

1.40.4

What's Changed
* feat: clarify slack alerting message by nibalizer in https://github.com/BerriAI/litellm/pull/4023
* [Admin UI] Analytics - fix div by 0 error on /model/metrics by ishaan-jaff in https://github.com/BerriAI/litellm/pull/4021
* Use DEBUG level for curl command logging by grav in https://github.com/BerriAI/litellm/pull/2980
* feat(create_user_button.tsx): allow admin to invite user to proxy via user-email/pwd invite-links by krrishdholakia in https://github.com/BerriAI/litellm/pull/4028
* [FIX] Proxy redirect to `PROXY_BASE_URL/ui` after logging in by ishaan-jaff in https://github.com/BerriAI/litellm/pull/4027
* [Feat] Audit Logs for Key, User, ProxyModel CRUD operations by ishaan-jaff in https://github.com/BerriAI/litellm/pull/4030
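
For the new audit-log feature, a sketch of what enabling it might look like; the `store_audit_logs` key is an assumption based on the LiteLLM proxy docs, not on these release notes.

```shell
# Assumed key name (store_audit_logs) -- verify against the proxy docs for your version
cat > litellm_config.yaml <<'EOF'
litellm_settings:
  store_audit_logs: true   # record create/update/delete events for keys, users, and proxy models
EOF
```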

New Contributors
* nibalizer made their first contribution in https://github.com/BerriAI/litellm/pull/4023

**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.40.3...v1.40.4



Docker Run LiteLLM Proxy


```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.40.4
```



Don't want to maintain your internal proxy? Get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 74 | 89.43947919222931 | 6.450062450815326 | 0.0 | 1930 | 0 | 64.37952199996744 | 1143.0389689999743 |
| Aggregated | Passed ✅ | 74 | 89.43947919222931 | 6.450062450815326 | 0.0 | 1930 | 0 | 64.37952199996744 | 1143.0389689999743 |

v1.40.3-stable
What's Changed
* feat: clarify slack alerting message by nibalizer in https://github.com/BerriAI/litellm/pull/4023

New Contributors
* nibalizer made their first contribution in https://github.com/BerriAI/litellm/pull/4023

**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.40.3...v1.40.3-stable



Docker Run LiteLLM Proxy


```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.40.3-stable
```



Don't want to maintain your internal proxy? Get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 140.0 | 166.81647102860174 | 6.3100225495221665 | 0.0 | 1888 | 0 | 109.54055500008053 | 2288.330084999984 |
| Aggregated | Passed ✅ | 140.0 | 166.81647102860174 | 6.3100225495221665 | 0.0 | 1888 | 0 | 109.54055500008053 | 2288.330084999984 |

1.40.3

What's Changed
* [FIX] Proxy - only log cache credentials in debug mode by ishaan-jaff in https://github.com/BerriAI/litellm/pull/4024
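
Since cache credentials are now emitted only at debug level, they should appear only when the proxy runs with debug logging enabled, e.g. via the `--detailed_debug` flag:

```shell
# Start the proxy with verbose debug logging; cache credentials are logged only in this mode
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.40.3 --detailed_debug
```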


**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.40.2...v1.40.3



Docker Run LiteLLM Proxy


```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.40.3
```



Don't want to maintain your internal proxy? Get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 130.0 | 168.35103872813087 | 6.385058663866248 | 0.0 | 1909 | 0 | 109.50845100001061 | 8353.559378 |
| Aggregated | Passed ✅ | 130.0 | 168.35103872813087 | 6.385058663866248 | 0.0 | 1909 | 0 | 109.50845100001061 | 8353.559378 |

v1.40.2-stable
**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.40.1.dev4...v1.40.2-stable



Docker Run LiteLLM Proxy


```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.40.2-stable
```



Don't want to maintain your internal proxy? Get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 100.0 | 135.25610868094057 | 6.399866394760457 | 0.0 | 1915 | 0 | 82.61822200000779 | 2219.8920350000435 |
| Aggregated | Passed ✅ | 100.0 | 135.25610868094057 | 6.399866394760457 | 0.0 | 1915 | 0 | 82.61822200000779 | 2219.8920350000435 |

1.40.3.dev4

What's Changed
* fix(main.py): log hidden params for text completion calls by krrishdholakia in https://github.com/BerriAI/litellm/pull/5061
* feat(proxy_cli.py): support iam-based auth to rds by krrishdholakia in https://github.com/BerriAI/litellm/pull/5057
* add OpenAI gpt-4o-2024-08-06 by ishaan-jaff in https://github.com/BerriAI/litellm/pull/5070
* [Fix-Proxy] allow forwarding headers from request by ishaan-jaff in https://github.com/BerriAI/litellm/pull/5062
* feat(proxy_server.py): allow restricting allowed email domains for UI by krrishdholakia in https://github.com/BerriAI/litellm/pull/5071
* [Fix] Fix testing emails through Admin UI by ishaan-jaff in https://github.com/BerriAI/litellm/pull/5077
* [Feat] /audio/transcription use file checksum for cache key by ishaan-jaff in https://github.com/BerriAI/litellm/pull/5075
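
For the checksum-based transcription cache, sending the same file twice should let the second call be served from cache, since the cache key is derived from the file's checksum. The model name and bearer token below are placeholders for your configured values.

```shell
# Repeat this call with an unchanged sample.wav to exercise the checksum-keyed cache
curl http://localhost:4000/v1/audio/transcriptions \
  -H "Authorization: Bearer sk-1234" \
  -F model="whisper-1" \
  -F file="@sample.wav"
```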


**Full Changelog**: https://github.com/BerriAI/litellm/compare/1.40.3.dev2...v1.40.3.dev4



Docker Run LiteLLM Proxy


```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.40.3.dev4
```



Don't want to maintain your internal proxy? Get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 140.0 | 157.33571723020768 | 6.420250360279191 | 0.0 | 1920 | 0 | 111.65160899997773 | 1454.6697590000122 |
| Aggregated | Passed ✅ | 140.0 | 157.33571723020768 | 6.420250360279191 | 0.0 | 1920 | 0 | 111.65160899997773 | 1454.6697590000122 |

1.40.3.dev2

**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.43.0.dev1...1.40.3.dev2



Docker Run LiteLLM Proxy


```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-1.40.3.dev2
```



Don't want to maintain your internal proxy? Get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 120.0 | 142.59505546791863 | 6.408804591996749 | 0.0 | 1917 | 0 | 97.07074900001089 | 1960.4952580000372 |
| Aggregated | Passed ✅ | 120.0 | 142.59505546791863 | 6.408804591996749 | 0.0 | 1917 | 0 | 97.07074900001089 | 1960.4952580000372 |

1.40.2

What's Changed
* Add simple OpenTelemetry tracer by yujonglee in https://github.com/BerriAI/litellm/pull/3974
* [FEAT] Add native OTEL logging to LiteLLM by ishaan-jaff in https://github.com/BerriAI/litellm/pull/4010
* [Docs] Use OTEL logging on LiteLLM Proxy by ishaan-jaff in https://github.com/BerriAI/litellm/pull/4011
* fix(bedrock): raise nested error response by pharindoko in https://github.com/BerriAI/litellm/pull/3989
* [Feat] Admin UI - Add, Edit all LiteLLM callbacks on UI by ishaan-jaff in https://github.com/BerriAI/litellm/pull/4014
* feat(assistants/main.py): add assistants api streaming support by krrishdholakia in https://github.com/BerriAI/litellm/pull/4012
* feat(utils.py): Support `stream_options` param across all providers by krrishdholakia in https://github.com/BerriAI/litellm/pull/4015 (see the curl sketch after this list)
* fix(utils.py): fix cost calculation for openai-compatible streaming object by krrishdholakia in https://github.com/BerriAI/litellm/pull/4009
* [Fix] Admin UI Internal Users by ishaan-jaff in https://github.com/BerriAI/litellm/pull/4016
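
The `stream_options` parameter follows the OpenAI API shape; a sketch of requesting a trailing usage chunk on a streamed completion (the model name is a placeholder):

```shell
# stream_options.include_usage asks for a final usage chunk at the end of the stream
curl http://localhost:4000/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gpt-3.5-turbo",
    "messages": [{"role": "user", "content": "hello"}],
    "stream": true,
    "stream_options": {"include_usage": true}
  }'
```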


**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.40.1...v1.40.2



Docker Run LiteLLM Proxy


```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.40.2
```



Don't want to maintain your internal proxy? Get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 72 | 86.0339053382131 | 6.392727588765549 | 0.0 | 1913 | 0 | 61.2748209999836 | 896.4834699999642 |
| Aggregated | Passed ✅ | 72 | 86.0339053382131 | 6.392727588765549 | 0.0 | 1913 | 0 | 61.2748209999836 | 896.4834699999642 |
