## What's Changed
* docs: update README.md API key and model example typos by colesmcintosh in https://github.com/BerriAI/litellm/pull/8590
* Fix typo in main readme by scosman in https://github.com/BerriAI/litellm/pull/8574
* (UI) Allow adding models for a Team by ishaan-jaff in https://github.com/BerriAI/litellm/pull/8598
* feat(ui): alert when adding model without STORE_MODEL_IN_DB by Aditya8840 in https://github.com/BerriAI/litellm/pull/8591
* Revert "(UI) Allow adding models for a Team" by ishaan-jaff in https://github.com/BerriAI/litellm/pull/8600
* Litellm stable UI 02 17 2025 p1 by krrishdholakia in https://github.com/BerriAI/litellm/pull/8599
* (UI) Allow adding models for a Team (#8598) by ishaan-jaff in https://github.com/BerriAI/litellm/pull/8601
* (UI) Refactor Add Models for Specific Teams by ishaan-jaff in https://github.com/BerriAI/litellm/pull/8592
* (UI) Improvements to Add Team Model Flow by ishaan-jaff in https://github.com/BerriAI/litellm/pull/8603
## New Contributors
* colesmcintosh made their first contribution in https://github.com/BerriAI/litellm/pull/8590
* scosman made their first contribution in https://github.com/BerriAI/litellm/pull/8574
* Aditya8840 made their first contribution in https://github.com/BerriAI/litellm/pull/8591
**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.61.6-nightly...v1.61.6.dev1
## Docker Run LiteLLM Proxy

```shell
docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.61.6.dev1
```
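As an alternative to `STORE_MODEL_IN_DB`, models can be declared in a config file mounted into the container. A minimal sketch, assuming the standard `model_list` config schema — the model alias and key reference below are placeholders:

```yaml
model_list:
  - model_name: gpt-4o                    # alias the proxy exposes to clients
    litellm_params:
      model: openai/gpt-4o                # provider/model routed by litellm
      api_key: os.environ/OPENAI_API_KEY  # resolved from the container's environment
```

Mount it and point the proxy at it, e.g. `-v $(pwd)/config.yaml:/app/config.yaml` plus `--config /app/config.yaml` appended to the run command.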
Don't want to maintain your internal proxy? Get in touch.
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat
## Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Failed | 170.0 | 197.04 | 6.32 | 6.32 | 1890 | 1890 | 142.71 | 2646.32 |
| Aggregated | Failed | 170.0 | 197.04 | 6.32 | 6.32 | 1890 | 1890 | 142.71 | 2646.32 |
# v1.61.6-nightly

## What's Changed
* refactor(teams.tsx): refactor to display all teams, across all orgs by krrishdholakia in https://github.com/BerriAI/litellm/pull/8565
**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.61.5-nightly...v1.61.6-nightly
## Docker Run LiteLLM Proxy

```shell
docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.61.6-nightly
```
Don't want to maintain your internal proxy? Get in touch.
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat
## Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Failed | 170.0 | 197.38 | 6.17 | 6.17 | 1847 | 1847 | 139.81 | 3194.17 |
| Aggregated | Failed | 170.0 | 197.38 | 6.17 | 6.17 | 1847 | 1847 | 139.81 | 3194.17 |
# v1.61.5-nightly

## What's Changed
* Optimize Alpine Dockerfile by removing redundant apk commands by PeterDaveHello in https://github.com/BerriAI/litellm/pull/5016
* fix(main.py): fix key leak error when unknown provider given by krrishdholakia in https://github.com/BerriAI/litellm/pull/8556
* (Feat) - return `x-litellm-attempted-fallbacks` in responses from litellm proxy by ishaan-jaff in https://github.com/BerriAI/litellm/pull/8558
* Add remaining org CRUD endpoints + support deleting orgs on UI by krrishdholakia in https://github.com/BerriAI/litellm/pull/8561
* Enable update/delete org members on UI by krrishdholakia in https://github.com/BerriAI/litellm/pull/8560
* (Bug Fix) - Add Regenerate Key on Virtual Keys Tab by ishaan-jaff in https://github.com/BerriAI/litellm/pull/8567
* (Bug Fix + Better Observability) - BudgetResetJob: for resetting key, team, user budgets by ishaan-jaff in https://github.com/BerriAI/litellm/pull/8562
* (Patch/bug fix) - UI, filter out litellm ui session tokens on Virtual Keys Page by ishaan-jaff in https://github.com/BerriAI/litellm/pull/8568
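The new `x-litellm-attempted-fallbacks` response header can be inspected client-side to detect when the proxy had to fall back from the primary deployment. A minimal sketch, assuming the header carries an integer count of fallback attempts (the exact value semantics may differ — check the PR for details):

```python
def attempted_fallbacks(headers: dict) -> int:
    """Return how many fallbacks the proxy reported for this request.

    Assumes the header value is an integer count; a missing header means 0.
    """
    return int(headers.get("x-litellm-attempted-fallbacks", 0))

# Example with a hypothetical response-header dict:
print(attempted_fallbacks({"x-litellm-attempted-fallbacks": "2"}))  # 2
print(attempted_fallbacks({}))  # 0
```

In practice the header dict would come from your HTTP client's response object (e.g. `response.headers`).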
## New Contributors
* PeterDaveHello made their first contribution in https://github.com/BerriAI/litellm/pull/5016
**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.61.3.dev1...v1.61.5-nightly
## Docker Run LiteLLM Proxy

```shell
docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.61.5-nightly
```
Don't want to maintain your internal proxy? Get in touch.
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat
## Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Failed | 150.0 | 169.93 | 6.23 | 6.23 | 1865 | 1865 | 130.23 | 1515.57 |
| Aggregated | Failed | 150.0 | 169.93 | 6.23 | 6.23 | 1865 | 1865 | 130.23 | 1515.57 |
# v1.61.3-stable
**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.61.3...v1.61.3-stable
## Docker Run LiteLLM Proxy

```shell
docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:litellm_stable_release_branch-v1.61.3-stable
```
Don't want to maintain your internal proxy? Get in touch.
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat
## Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Failed | 150.0 | 178.81 | 6.24 | 6.24 | 1867 | 1867 | 130.88 | 2746.11 |
| Aggregated | Failed | 150.0 | 178.81 | 6.24 | 6.24 | 1867 | 1867 | 130.88 | 2746.11 |
# v1.61.4-nightly

## What's Changed
* docs(perplexity.md): removing `return_citations` documentation by miraclebakelaser in https://github.com/BerriAI/litellm/pull/8527
* (docs - cookbook) litellm proxy x langfuse by ishaan-jaff in https://github.com/BerriAI/litellm/pull/8541
* UI Fixes and Improvements (02/14/2025) p1 by krrishdholakia in https://github.com/BerriAI/litellm/pull/8546
* (Feat) - Add `/bedrock/meta.llama3-3-70b-instruct-v1:0` tool calling support + cost tracking + base llm unit test for tool calling by ishaan-jaff in https://github.com/BerriAI/litellm/pull/8545
* fix(general_settings.tsx): filter out empty dictionaries post fallbac… by krrishdholakia in https://github.com/BerriAI/litellm/pull/8550
* (perf) Fix memory leak on `/completions` route by ishaan-jaff in https://github.com/BerriAI/litellm/pull/8551
* Org Flow Improvements by krrishdholakia in https://github.com/BerriAI/litellm/pull/8549
* feat(openai/o_series_transformation.py): support native streaming for o1 by krrishdholakia in https://github.com/BerriAI/litellm/pull/8552
* fix(team_endpoints.py): fix team info check to handle team keys by krrishdholakia in https://github.com/BerriAI/litellm/pull/8529
* build: ui build update by krrishdholakia in https://github.com/BerriAI/litellm/pull/8553
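With native o1 streaming enabled on the proxy (#8552), clients consume the response as incremental chunks rather than one final message. A minimal sketch of assembling the streamed text, assuming OpenAI-style chunks with `choices[0].delta.content` fragments — the sample chunks below are illustrative, not captured proxy output:

```python
def assemble(chunks) -> str:
    """Concatenate streamed delta fragments into the full message text."""
    parts = []
    for chunk in chunks:
        delta = chunk.get("choices", [{}])[0].get("delta", {})
        if delta.get("content"):
            parts.append(delta["content"])
    return "".join(parts)

stream = [
    {"choices": [{"delta": {"content": "Hel"}}]},
    {"choices": [{"delta": {"content": "lo"}}]},
    {"choices": [{"delta": {}}]},  # the final chunk typically carries no content
]
print(assemble(stream))  # Hello
```

In a real client the chunks would arrive from the SDK's streaming iterator instead of a list.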
**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.61.3...v1.61.4-nightly
## Docker Run LiteLLM Proxy

```shell
docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.61.4-nightly
```
Don't want to maintain your internal proxy? Get in touch.
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat
## Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Failed | 190.0 | 216.89 | 6.26 | 6.26 | 1874 | 1874 | 143.53 | 3508.22 |
| Aggregated | Failed | 190.0 | 216.89 | 6.26 | 6.26 | 1874 | 1874 | 143.53 | 3508.22 |