## What's Changed
* Litellm ruff linting enforcement by krrishdholakia in https://github.com/BerriAI/litellm/pull/5992
**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.48.8...v1.48.9
## Docker Run LiteLLM Proxy

```shell
docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.48.9
```
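Once the container is up, the proxy serves an OpenAI-compatible API on port 4000. A minimal sketch of the request body a client would POST to the `/chat/completions` endpoint exercised in the load test below; the model name `gpt-3.5-turbo` is a placeholder and must match a model actually configured on your proxy:

```python
import json

# Endpoint exposed by the proxy container started above.
url = "http://localhost:4000/chat/completions"

# OpenAI-compatible request body. "gpt-3.5-turbo" is a placeholder;
# replace it with a model name configured on the proxy.
payload = {
    "model": "gpt-3.5-turbo",
    "messages": [{"role": "user", "content": "Hello from LiteLLM proxy"}],
}

# Serialize to the JSON body you would send (e.g. with curl or requests),
# along with an "Authorization: Bearer <your-litellm-key>" header.
body = json.dumps(payload)
print(url)
print(body)
```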
Don't want to maintain your internal proxy? Get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat
## Load Test LiteLLM Proxy Results
| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 94 | 110.62 | 6.52 | 0.0 | 1949 | 0 | 75.32 | 2736.75 |
| Aggregated | Passed ✅ | 94 | 110.62 | 6.52 | 0.0 | 1949 | 0 | 75.32 | 2736.75 |
# v1.48.8-stable

## What's Changed
* Fixed minor typo in bash command to prevent overwriting .env file by sdaoudi in https://github.com/BerriAI/litellm/pull/5902
* (docs) fix health check documentation language problems by ishaan-jaff in https://github.com/BerriAI/litellm/pull/5984
* (docs) add example using Azure OpenAI Entra ID (client_id, tenant_id) with litellm by ishaan-jaff in https://github.com/BerriAI/litellm/pull/5985
* (docs) document all Prometheus metrics by ishaan-jaff in https://github.com/BerriAI/litellm/pull/5989
* [Bug] Skip slack alert if there was no spend by pazcuturi in https://github.com/BerriAI/litellm/pull/5998
* (feat proxy slack alerting) - allow opting in to getting key / internal user alerts by ishaan-jaff in https://github.com/BerriAI/litellm/pull/5990
* (performance improvement - vertex embeddings) ~111.11% faster by ishaan-jaff in https://github.com/BerriAI/litellm/pull/6000
## New Contributors
* sdaoudi made their first contribution in https://github.com/BerriAI/litellm/pull/5902
* pazcuturi made their first contribution in https://github.com/BerriAI/litellm/pull/5998
**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.48.7...v1.48.8-stable
## Docker Run LiteLLM Proxy

```shell
docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.48.8-stable
```
Don't want to maintain your internal proxy? Get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat
## Load Test LiteLLM Proxy Results
| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 130 | 155.34 | 6.43 | 0.0033 | 1921 | 1 | 93.83 | 2569.50 |
| Aggregated | Passed ✅ | 130 | 155.34 | 6.43 | 0.0033 | 1921 | 1 | 93.83 | 2569.50 |