What's Changed
* Add azure/gpt-4o-2024-08-06 pricing. by Manouchehri in https://github.com/BerriAI/litellm/pull/5510
* [Fix] get_llm_provider, return provider as `cohere_chat` for cohere chat models by ishaan-jaff in https://github.com/BerriAI/litellm/pull/5511
* fix proxy server - always read redis for rate limiting logic by ishaan-jaff in https://github.com/BerriAI/litellm/pull/5484
* Helicone stream response logging by maamalama in https://github.com/BerriAI/litellm/pull/5516
* security - Prevent sql injection in `/team/update` query by krrishdholakia in https://github.com/BerriAI/litellm/pull/5513
* [Fix-Refactor] support presidio on new guardrails config by ishaan-jaff in https://github.com/BerriAI/litellm/pull/5514
* [Fix - Proxy] show error from /spend/tags and /spend/logs on client side by ishaan-jaff in https://github.com/BerriAI/litellm/pull/5520
* [Feat] log request / response on pass through endpoints by ishaan-jaff in https://github.com/BerriAI/litellm/pull/5518
* [Fix-Proxy] show more descriptive error messages on /health checks by ishaan-jaff in https://github.com/BerriAI/litellm/pull/5521
**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.44.16...v1.44.17
Docker Run LiteLLM Proxy
```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.44.17
```
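Once the container is up, the proxy serves an OpenAI-compatible `/chat/completions` endpoint on port 4000 (the same route exercised in the load test below). A minimal sketch of a client request — the model name and API key here are placeholders for whatever your proxy is actually configured with:

```python
import json
import urllib.request

# Placeholder values: substitute the model and key your proxy is configured with.
PROXY_URL = "http://localhost:4000/chat/completions"
API_KEY = "sk-1234"

payload = {
    "model": "gpt-4o-2024-08-06",
    "messages": [{"role": "user", "content": "Hello!"}],
}

# Build the request with the proxy's expected JSON body and bearer auth header.
req = urllib.request.Request(
    PROXY_URL,
    data=json.dumps(payload).encode(),
    headers={
        "Content-Type": "application/json",
        "Authorization": f"Bearer {API_KEY}",
    },
)

# Uncomment once the proxy container is running:
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```

Any OpenAI SDK can be pointed at the same endpoint by setting its base URL to `http://localhost:4000`.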
Don't want to maintain your internal proxy? Get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat
Load Test LiteLLM Proxy Results
| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 110.0 | 144.75 | 6.42 | 0.0 | 1923 | 0 | 86.77 | 3621.23 |
| Aggregated | Passed ✅ | 110.0 | 144.75 | 6.42 | 0.0 | 1923 | 0 | 86.77 | 3621.23 |
v1.44.16-stable
What's Changed
* update canary by yujonglee in https://github.com/BerriAI/litellm/pull/5459
* Bump pagefind from 1.1.0 to 1.1.1 in /docs/my-website by dependabot in https://github.com/BerriAI/litellm/pull/5491
* [Feat] Add Google Secret Manager Support by ishaan-jaff in https://github.com/BerriAI/litellm/pull/5489
* [Feat-Proxy] Enterprise - allow controlling allowed private, public, admin only routes by ishaan-jaff in https://github.com/BerriAI/litellm/pull/5494
* [Feat-Proxy] bump langfuse sdk version on docker by ishaan-jaff in https://github.com/BerriAI/litellm/pull/5497
* LiteLLM Minor fixes + improvements (08/03/2024) by krrishdholakia in https://github.com/BerriAI/litellm/pull/5488
**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.44.15...v1.44.16-stable
Docker Run LiteLLM Proxy
```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.44.16-stable
```
Don't want to maintain your internal proxy? Get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat
Load Test LiteLLM Proxy Results
| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 140.0 | 168.79 | 6.36 | 0.0 | 1903 | 0 | 113.66 | 2153.20 |
| Aggregated | Passed ✅ | 140.0 | 168.79 | 6.36 | 0.0 | 1903 | 0 | 113.66 | 2153.20 |