## What's Changed
* add additional models from openrouter by Merlinvt in https://github.com/BerriAI/litellm/pull/3545
* Initial OIDC support (Google/GitHub/CircleCI -> Amazon Bedrock & Azure OpenAI) by Manouchehri in https://github.com/BerriAI/litellm/pull/3507
* Fix tool calls tracking with Lunary by vincelwt in https://github.com/BerriAI/litellm/pull/3424
* ✨ feat: Add Azure Content-Safety Proxy hooks by Lunik in https://github.com/BerriAI/litellm/pull/3407
* fix(exceptions.py): import openai Exceptions by nobu007 in https://github.com/BerriAI/litellm/pull/3399
* Clarifai-LiteLLM : Added clarifai as LLM Provider. by mogith-pn in https://github.com/BerriAI/litellm/pull/3369
* (fix) Fixed linting and other bugs with watsonx provider by simonsanvil in https://github.com/BerriAI/litellm/pull/3561
* feat(router.py): allow setting model_region in litellm_params by krrishdholakia in https://github.com/BerriAI/litellm/pull/3582
* [UI] Show Token ID/Hash on Admin UI by ishaan-jaff in https://github.com/BerriAI/litellm/pull/3583
* [Litellm Proxy + litellm.Router] - Pass the same message/prompt to N models by ishaan-jaff in https://github.com/BerriAI/litellm/pull/3585
* [Feat] - log metadata on traces + allow users to log metadata when `existing_trace_id` exists by ishaan-jaff in https://github.com/BerriAI/litellm/pull/3581
* Set fake env vars for `client_no_auth` fixture by msabramo in https://github.com/BerriAI/litellm/pull/3588
* [Feat] Proxy + Router - Retry on RateLimitErrors when fallbacks or other deployments exist by ishaan-jaff in https://github.com/BerriAI/litellm/pull/3590
* Make `test_load_router_config` pass by msabramo in https://github.com/BerriAI/litellm/pull/3589
* feat(bedrock_httpx.py): Make Bedrock-Cohere calls Async + Command-R support by krrishdholakia in https://github.com/BerriAI/litellm/pull/3586
## New Contributors
* Merlinvt made their first contribution in https://github.com/BerriAI/litellm/pull/3545
* mogith-pn made their first contribution in https://github.com/BerriAI/litellm/pull/3369
**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.37.3-stable...v1.37.5
## Docker Run LiteLLM Proxy

```shell
docker run \
    -e STORE_MODEL_IN_DB=True \
    -p 4000:4000 \
    ghcr.io/berriai/litellm:main-v1.37.5
```
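With the container up, models can be registered through the Admin UI (persisted when `STORE_MODEL_IN_DB=True`) or declared in a proxy config file. A minimal sketch of such a config — the model alias, provider model, and environment variable below are illustrative placeholders, not taken from this release:

```yaml
# config.yaml — minimal LiteLLM proxy config sketch (adjust to your providers)
model_list:
  - model_name: gpt-3.5-turbo            # alias that clients will request
    litellm_params:
      model: openai/gpt-3.5-turbo        # actual provider/model routed to
      api_key: os.environ/OPENAI_API_KEY # read key from the environment
```

Typically this file is mounted into the container (e.g. `-v $(pwd)/config.yaml:/app/config.yaml`) and passed via `--config /app/config.yaml`; check the LiteLLM proxy docs for the exact flags supported by this image.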
Don't want to maintain your internal proxy? Get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat
## v1.37.3-stable
**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.37.3...v1.37.3-stable
## Docker Run LiteLLM Proxy

```shell
docker run \
    -e STORE_MODEL_IN_DB=True \
    -p 4000:4000 \
    ghcr.io/berriai/litellm:main-v1.37.3-stable
```
Don't want to maintain your internal proxy? Get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat