**Full Changelog**: https://github.com/BerriAI/litellm/compare/1.35.5.dev2...v1.35.5
Call 100+ LLMs, run /health checks on the Admin UI
👉 Edit + test Langfuse and SlackHQ configurations on the LiteLLM UI
🛠️ UI fix: adding Azure OpenAI on the Admin UI
⚡️ [Fix] Load proxy models when the proxy starts up
✅ [LiteLLM UI] Show error messages for 10-20s (h/t Graham Neubig for this request)
😇 QA: added tests for the /health endpoints on the Proxy
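As a quick illustration of what the new /health tests exercise, here is a minimal sketch of probing the proxy's health endpoints. The base URL, port, and API key below are placeholders (assumptions), not values from this release; the proxy must be running for the network call to succeed.

```python
import requests

BASE_URL = "http://localhost:4000"  # assumed local proxy address; adjust to your deployment


def health_url(base: str, probe: str = "") -> str:
    """Build the URL for a health probe: '' -> /health, else /health/<probe>."""
    path = "/health" if not probe else f"/health/{probe}"
    return base.rstrip("/") + path


def check(probe: str = "", api_key: str = "sk-1234") -> bool:
    """Return True if the given probe responds with HTTP 200."""
    resp = requests.get(
        health_url(BASE_URL, probe),
        headers={"Authorization": f"Bearer {api_key}"},  # placeholder key
        timeout=5,
    )
    return resp.status_code == 200
```

For example, `check("liveliness")` and `check("readiness")` hit the same two endpoints that appear in the load-test table below, while `check()` hits the aggregate `/health` route.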
<img width="623" alt="Screenshot 2024-04-13 at 10 12 53 PM" src="https://github.com/BerriAI/litellm/assets/29436595/536719af-cd08-4a7c-bc97-6127f23f8666">
**Load Test LiteLLM Proxy Results**
| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 92 | 97.29 | 1.50 | 0.0 | 450 | 0 | 86.18 | 926.68 |
| /health/liveliness | Passed ✅ | 78 | 80.18 | 15.30 | 0.003340 | 4580 | 1 | 74.22 | 1033.39 |
| /health/readiness | Passed ✅ | 78 | 80.95 | 15.34 | 0.0 | 4593 | 0 | 74.08 | 1307.74 |
| Aggregated | Passed ✅ | 78 | 81.35 | 32.15 | 0.003340 | 9623 | 1 | 74.08 | 1307.74 |
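The columns in the table above are internally consistent: the run duration can be recovered as Request Count / Requests/s, and Failures/s should then equal Failure Count / duration. A quick check using the /health/liveliness row:

```python
# Values taken (approximately) from the /health/liveliness row above.
request_count = 4580
requests_per_s = 15.30
failure_count = 1

duration_s = request_count / requests_per_s   # ≈ 299 s, i.e. roughly a 5-minute run
failures_per_s = failure_count / duration_s   # ≈ 0.00334, matching the table

print(f"duration ≈ {duration_s:.1f}s, failures/s ≈ {failures_per_s:.6f}")
```

The single failure over the whole run is what produces the small nonzero Failures/s in both that row and the Aggregated row.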