LiteLLM

Latest version: v1.52.14


1.20.9

Admin UI is now on the Proxy Server
- When you start the proxy, you'll find your Admin UI link on the Swagger docs
- The UI is a static web app (h/t Manouchehri for this suggestion)
- Doc on getting started: https://docs.litellm.ai/docs/proxy/ui
- cc bsu3338: this change impacts you - the UI is on the proxy server by default (the GIF below shows how to get the UI link). Let me know if you have any questions
![litellm_ui_3](https://github.com/BerriAI/litellm/assets/29436595/8a8220b0-99bf-458a-8ae5-ccc233e078ef)

Admin UI uses JWTs
- The UI never shows a Proxy API key in the URL params (we've moved to JWTs in the query params) cc Manouchehri
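For background, a JWT carries its claims as a base64url-encoded JSON payload, so the UI can pass a short-lived token in the query string instead of the raw API key. A minimal sketch of decoding (without verifying) a token's claims; the token below is fabricated for illustration only:

```python
import base64
import json

# A fabricated, unsigned example token (header.payload.signature segments).
token = (
    "eyJhbGciOiJIUzI1NiJ9."
    + base64.urlsafe_b64encode(json.dumps({"user_id": "demo"}).encode()).decode().rstrip("=")
    + ".sig"
)

def jwt_claims(tok: str) -> dict:
    """Decode the (unverified) payload segment of a JWT."""
    payload = tok.split(".")[1]
    payload += "=" * (-len(payload) % 4)  # restore stripped base64 padding
    return json.loads(base64.urlsafe_b64decode(payload))

print(jwt_claims(token))  # → {'user_id': 'demo'}
```

In production the proxy verifies the token's signature server-side; this sketch only shows why a JWT in a query param does not expose the underlying key.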

Admin UI - Removed the need for setting `allow_user_auth: True` if the user is logged in with SSO

* [Fix] UI - Use jwts by ishaan-jaff in https://github.com/BerriAI/litellm/pull/1730
* [Feat] Add Admin UI on Proxy Server (Static Web App) by ishaan-jaff in https://github.com/BerriAI/litellm/pull/1726
* [Fix-UI] If user is already logged in using SSO, set allow_user_auth: True by ishaan-jaff in https://github.com/BerriAI/litellm/pull/1728

**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.20.8...v1.20.9

1.52.12

What's Changed
* LiteLLM Minor Fixes & Improvements (11/19/2024) by krrishdholakia in https://github.com/BerriAI/litellm/pull/6820
* Add gpt-4o-2024-11-20 by Manouchehri in https://github.com/BerriAI/litellm/pull/6832
* LiteLLM Minor Fixes & Improvements (11/20/2024) by krrishdholakia in https://github.com/BerriAI/litellm/pull/6831
* Litellm dev 11 20 2024 by krrishdholakia in https://github.com/BerriAI/litellm/pull/6838
* (refactor) anthropic - move _process_response in transformation.py by ishaan-jaff in https://github.com/BerriAI/litellm/pull/6834
* (feat) add usage / cost tracking for Anthropic passthrough routes by ishaan-jaff in https://github.com/BerriAI/litellm/pull/6835
* (testing) - add e2e tests for anthropic pass through endpoints by ishaan-jaff in https://github.com/BerriAI/litellm/pull/6840
* (fix) don't block proxy startup if license check fails & using prometheus by ishaan-jaff in https://github.com/BerriAI/litellm/pull/6839


**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.52.11...v1.52.12



Docker Run LiteLLM Proxy


```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.52.12
```
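The container started above serves an OpenAI-compatible API on port 4000. A minimal sketch of the JSON body you would POST to its `/chat/completions` endpoint (the model name `gpt-4o` is an assumption for illustration; it must match a model configured on your proxy):

```python
import json

# OpenAI-compatible chat completion request body.
payload = {
    "model": "gpt-4o",
    "messages": [{"role": "user", "content": "Hello!"}],
}

# POST this to http://localhost:4000/chat/completions with an
# `Authorization: Bearer <virtual key>` header.
body = json.dumps(payload)
print(body)
```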



Don't want to maintain your internal proxy? Get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 260.0 | 288.3101155320099 | 6.056613494123171 | 0.0 | 1812 | 0 | 231.241644000022 | 2338.7360799999897 |
| Aggregated | Passed ✅ | 260.0 | 288.3101155320099 | 6.056613494123171 | 0.0 | 1812 | 0 | 231.241644000022 | 2338.7360799999897 |
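As a quick sanity check on the table above, request count divided by requests-per-second gives the implied load-test duration, roughly five minutes:

```python
# Numbers taken from the v1.52.12 load-test table above.
request_count = 1812
requests_per_s = 6.056613494123171

# Implied test duration: total requests / throughput.
duration_s = request_count / requests_per_s
print(f"{duration_s:.0f} s (~{duration_s / 60:.1f} min)")  # → 299 s (~5.0 min)
```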

1.52.11

What's Changed
* (docs improvement) remove emojis, use `guides` section, categorize uncategorized docs by ishaan-jaff in https://github.com/BerriAI/litellm/pull/6796
* (docs) simplify left nav names + use a section for `making llm requests` by ishaan-jaff in https://github.com/BerriAI/litellm/pull/6799
* Bump cross-spawn from 7.0.3 to 7.0.5 in /ui by dependabot in https://github.com/BerriAI/litellm/pull/6779
* Docs - use 1 page for all logging integrations on proxy + add logging features at top level by ishaan-jaff in https://github.com/BerriAI/litellm/pull/6805
* (docs) add docstrings for all /key, /user, /team, /customer endpoints by ishaan-jaff in https://github.com/BerriAI/litellm/pull/6804
* LiteLLM Minor Fixes & Improvements (11/15/2024) by krrishdholakia in https://github.com/BerriAI/litellm/pull/6746
* (Proxy) add support for DOCS_URL and REDOC_URL by ishaan-jaff in https://github.com/BerriAI/litellm/pull/6806
* feat - add `fireworks_ai/qwen2p5-coder-32b-instruct` by ishaan-jaff in https://github.com/BerriAI/litellm/pull/6818
* Litellm stable pr 10 30 2024 by krrishdholakia in https://github.com/BerriAI/litellm/pull/6821
* (Feat) Add provider specific budget routing by ishaan-jaff in https://github.com/BerriAI/litellm/pull/6817
* (feat) provider budget routing improvements by ishaan-jaff in https://github.com/BerriAI/litellm/pull/6827


**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.52.10...v1.52.11



Docker Run LiteLLM Proxy


```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.52.11
```



Don't want to maintain your internal proxy? Get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Failed ❌ | 270.0 | 300.82403009385007 | 6.198177352347725 | 0.0 | 1854 | 0 | 229.45128300000306 | 3106.586268000001 |
| Aggregated | Failed ❌ | 270.0 | 300.82403009385007 | 6.198177352347725 | 0.0 | 1854 | 0 | 229.45128300000306 | 3106.586268000001 |

1.52.10

What's Changed
* add openrouter/qwen/qwen-2.5-coder-32b-instruct by paul-gauthier in https://github.com/BerriAI/litellm/pull/6731
* Update routing references by emmanuel-ferdman in https://github.com/BerriAI/litellm/pull/6758
* (Doc) Add section on what is stored in the DB + Add clear section on key/team based logging by ishaan-jaff in https://github.com/BerriAI/litellm/pull/6769
* (Admin UI) - Remain on Current Tab when user clicks refresh by ishaan-jaff in https://github.com/BerriAI/litellm/pull/6777
* (UI) fix - allow editing key alias on Admin UI by ishaan-jaff in https://github.com/BerriAI/litellm/pull/6776
* (docs) add doc string for /key/update by ishaan-jaff in https://github.com/BerriAI/litellm/pull/6778
* (patch) using image_urls with `vertex/anthropic` models by ishaan-jaff in https://github.com/BerriAI/litellm/pull/6775
* (fix) Azure AI Studio - using `image_url` in content with both text and image_url by ishaan-jaff in https://github.com/BerriAI/litellm/pull/6774
* build: add gemini-exp-1114 by krrishdholakia in https://github.com/BerriAI/litellm/pull/6786
* (fix) httpx handler - bind to ipv4 for httpx handler by ishaan-jaff in https://github.com/BerriAI/litellm/pull/6785


**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.52.9...v1.52.10



Docker Run LiteLLM Proxy


```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.52.10
```



Don't want to maintain your internal proxy? Get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 240.0 | 271.7799367877801 | 6.1248828197277065 | 0.0 | 1833 | 0 | 213.09577699997817 | 2144.701510999994 |
| Aggregated | Passed ✅ | 240.0 | 271.7799367877801 | 6.1248828197277065 | 0.0 | 1833 | 0 | 213.09577699997817 | 2144.701510999994 |

1.52.9

What's Changed
* (feat) add bedrock/stability.stable-image-ultra-v1:0 by ishaan-jaff in https://github.com/BerriAI/litellm/pull/6723
* [Feature]: Stop swallowing up AzureOpenAi exception responses in litellm's implementation for a BadRequestError by ishaan-jaff in https://github.com/BerriAI/litellm/pull/6745
* [Feature]: json_schema in response support for Anthropic by ishaan-jaff in https://github.com/BerriAI/litellm/pull/6748
* fix: import audio check by IamRash-7 in https://github.com/BerriAI/litellm/pull/6740
* (fix) Cost tracking for `vertex_ai/imagen3` by ishaan-jaff in https://github.com/BerriAI/litellm/pull/6752
* (feat) Vertex AI - add support for fine tuned embedding models by ishaan-jaff in https://github.com/BerriAI/litellm/pull/6749
* LiteLLM Minor Fixes & Improvements (11/13/2024) by krrishdholakia in https://github.com/BerriAI/litellm/pull/6729
* feat - add us.llama 3.1 models by ishaan-jaff in https://github.com/BerriAI/litellm/pull/6760
* (Feat) Add Vertex Model Garden llama 3.1 models by ishaan-jaff in https://github.com/BerriAI/litellm/pull/6763
* (fix) Fix - don't allow `viewer` roles to create virtual keys by ishaan-jaff in https://github.com/BerriAI/litellm/pull/6764
* (feat) Use `litellm/` prefix when storing virtual keys in AWS secret manager by ishaan-jaff in https://github.com/BerriAI/litellm/pull/6765

New Contributors
* IamRash-7 made their first contribution in https://github.com/BerriAI/litellm/pull/6740

**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.52.8...v1.52.9



Docker Run LiteLLM Proxy


```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.52.9
```



Don't want to maintain your internal proxy? Get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Failed ❌ | 280.0 | 314.28547197285235 | 6.039371468840217 | 0.0 | 1805 | 0 | 226.56484299994872 | 2776.9337409999935 |
| Aggregated | Failed ❌ | 280.0 | 314.28547197285235 | 6.039371468840217 | 0.0 | 1805 | 0 | 226.56484299994872 | 2776.9337409999935 |

1.52.9.dev1

What's Changed
* add openrouter/qwen/qwen-2.5-coder-32b-instruct by paul-gauthier in https://github.com/BerriAI/litellm/pull/6731
* Update routing references by emmanuel-ferdman in https://github.com/BerriAI/litellm/pull/6758
* (Doc) Add section on what is stored in the DB + Add clear section on key/team based logging by ishaan-jaff in https://github.com/BerriAI/litellm/pull/6769
* (Admin UI) - Remain on Current Tab when user clicks refresh by ishaan-jaff in https://github.com/BerriAI/litellm/pull/6777
* (UI) fix - allow editing key alias on Admin UI by ishaan-jaff in https://github.com/BerriAI/litellm/pull/6776
* (docs) add doc string for /key/update by ishaan-jaff in https://github.com/BerriAI/litellm/pull/6778
* (patch) using image_urls with `vertex/anthropic` models by ishaan-jaff in https://github.com/BerriAI/litellm/pull/6775
* (fix) Azure AI Studio - using `image_url` in content with both text and image_url by ishaan-jaff in https://github.com/BerriAI/litellm/pull/6774


**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.52.9...v1.52.9.dev1



Docker Run LiteLLM Proxy


```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.52.9.dev1
```



Don't want to maintain your internal proxy? Get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 200.0 | 222.33484302855678 | 6.204541843497746 | 0.0033429643553328373 | 1856 | 1 | 62.294459999975516 | 2005.856768000001 |
| Aggregated | Passed ✅ | 200.0 | 222.33484302855678 | 6.204541843497746 | 0.0033429643553328373 | 1856 | 1 | 62.294459999975516 | 2005.856768000001 |
