## What's Changed
* Support for Greenscale AI logging by greenscale-nandesh in https://github.com/BerriAI/litellm/pull/3098
* [UI] V0 - Edit Model tpm, rpm, api_base by ishaan-jaff in https://github.com/BerriAI/litellm/pull/3277
* [Feat] Show model, api base in APITimeoutError exceptions by ishaan-jaff in https://github.com/BerriAI/litellm/pull/3279
* [UI-Polish] filter by models on dropdown by ishaan-jaff in https://github.com/BerriAI/litellm/pull/3280
* fix(proxy_server.py): fix `/config/update` by krrishdholakia in https://github.com/BerriAI/litellm/pull/3282
## New Contributors
* greenscale-nandesh made their first contribution in https://github.com/BerriAI/litellm/pull/3098
**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.35.24...1.35.24.dev6
Don't want to maintain your internal proxy? Get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat
## Load Test LiteLLM Proxy Results
| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 94 | 104.28 | 1.44 | 0.0 | 432 | 0 | 86.36 | 763.56 |
| /health/liveliness | Passed ✅ | 78 | 80.55 | 15.39 | 0.0 | 4609 | 0 | 73.85 | 696.18 |
| /health/readiness | Passed ✅ | 78 | 82.08 | 15.48 | 0.0033 | 4635 | 1 | 74.05 | 1385.17 |
| Aggregated | Passed ✅ | 78 | 82.34 | 32.32 | 0.0033 | 9676 | 1 | 73.85 | 1385.17 |