What's Changed
* (feat) Langfuse - Add location logging, and add cache_hit to metadata. by Manouchehri in https://github.com/BerriAI/litellm/pull/2961
* [FEAT] Add `groq/llama3` by ishaan-jaff in https://github.com/BerriAI/litellm/pull/3176
* [UI] Show teams as dropdown in invite user flow by ishaan-jaff in https://github.com/BerriAI/litellm/pull/3180
* [FEAT] Log team alias to langfuse by ishaan-jaff in https://github.com/BerriAI/litellm/pull/3175
* fix: Stream completion responses from anthropic. (Fix 3129) by jmandel in https://github.com/BerriAI/litellm/pull/3174
* [Fix] - Langfuse log proxy_base_url to langfuse as a tag (if set by user) by ishaan-jaff in https://github.com/BerriAI/litellm/pull/3183
* fix(caching.py): dual cache async_batch_get_cache fix + testing by krrishdholakia in https://github.com/BerriAI/litellm/pull/3179
* fix(caching.py): fix redis url parsing logic to work with ssl urls by krrishdholakia in https://github.com/BerriAI/litellm/pull/3173
![codeimage-snippet_20 (1)](https://github.com/BerriAI/litellm/assets/29436595/74de84a9-f331-4b52-869f-12d75df250c4)
New Contributors
* jmandel made their first contribution in https://github.com/BerriAI/litellm/pull/3174
**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.35.15...v1.35.16
Don't want to maintain your internal proxy? get in touch 👉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat
Load Test LiteLLM Proxy Results
| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 77 | 85.19056921350604 | 1.5329734178477303 | 0.0 | 459 | 0 | 70.44526700002507 | 1436.744482999984 |
| /health/liveliness | Passed ✅ | 61 | 65.12887882126974 | 15.042510400841344 | 0.0 | 4504 | 0 | 58.82374399999435 | 1409.0413549999994 |
| /health/readiness | Passed ✅ | 61 | 64.03151302740865 | 15.596919087906102 | 0.003339811367859979 | 4670 | 1 | 59.10709400001224 | 1196.9403539999917 |
| Aggregated | Passed ✅ | 61 | 65.55279843433988 | 32.172402906595174 | 0.003339811367859979 | 9633 | 1 | 58.82374399999435 | 1436.744482999984 |
v1.35.15-stable
**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.35.20...v1.35.15-stable
Don't want to maintain your internal proxy? get in touch 👉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat
Load Test LiteLLM Proxy Results
| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 43 | 49.55053653586523 | 1.583453330366775 | 0.0 | 474 | 0 | 35.98067499999047 | 1311.9795399999816 |
| /health/liveliness | Passed ✅ | 26 | 28.40283839727455 | 15.690886693528993 | 0.0 | 4697 | 0 | 23.32956700001887 | 1245.456331000014 |
| /health/readiness | Passed ✅ | 26 | 29.229367997202925 | 15.527196370347617 | 0.0 | 4648 | 0 | 23.27850199998238 | 1275.4970010000193 |
| Aggregated | Passed ✅ | 26 | 29.814969825949255 | 32.801536394243385 | 0.0 | 9819 | 0 | 23.27850199998238 | 1311.9795399999816 |