[BETA] Thrilled to launch support for Cohere/Command-R on LiteLLM and the LiteLLM Proxy Server 👉 Start here: https://docs.litellm.ai/docs/providers/cohere
☎️ Use Cohere tool calling in the OpenAI format: https://github.com/BerriAI/litellm/pull/2479
⚡️ High traffic: LiteLLM Proxy + Langfuse supports 80+ requests per second with Langfuse logging enabled https://docs.litellm.ai/docs/proxy/logging
⚡️ New models: Azure GPT-Instruct https://docs.litellm.ai/docs/providers/azure#azure-instruct-models
🛠️ Fix for using DynamoDB + LiteLLM Virtual Keys
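For the proxy-side Langfuse logging, a sketch of a proxy `config.yaml` that turns on the Langfuse success callback (the model entry is an example; `LANGFUSE_PUBLIC_KEY` and `LANGFUSE_SECRET_KEY` are assumed to be set in the environment):

```yaml
model_list:
  - model_name: gpt-3.5-turbo      # example model entry
    litellm_params:
      model: gpt-3.5-turbo

litellm_settings:
  success_callback: ["langfuse"]   # log successful requests to Langfuse
```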
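A minimal sketch of calling Command-R through LiteLLM with OpenAI-format tool definitions, assuming `litellm` is installed and `COHERE_API_KEY` is set. The `get_current_weather` tool and its schema are hypothetical, made up for illustration:

```python
import os

# Tool definition in the OpenAI function-calling format; LiteLLM
# translates this to Cohere's tool format under the hood.
tools = [
    {
        "type": "function",
        "function": {
            "name": "get_current_weather",  # hypothetical tool
            "description": "Get the current weather for a city",
            "parameters": {
                "type": "object",
                "properties": {
                    "city": {"type": "string", "description": "City name"},
                },
                "required": ["city"],
            },
        },
    }
]

messages = [{"role": "user", "content": "What's the weather in Paris?"}]

if __name__ == "__main__":
    # Only attempt the network call when a Cohere key is configured.
    if os.environ.get("COHERE_API_KEY"):
        from litellm import completion

        response = completion(
            model="command-r",  # LiteLLM routes this model name to Cohere
            messages=messages,
            tools=tools,
        )
        print(response.choices[0].message)
```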
What's Changed
* (feat) support azure/gpt-instruct models by ishaan-jaff in https://github.com/BerriAI/litellm/pull/2471
* [New-Model] Cohere/command-r by ishaan-jaff in https://github.com/BerriAI/litellm/pull/2474
* (fix) patch dynamoDB team_model_alias bug by ishaan-jaff in https://github.com/BerriAI/litellm/pull/2478
* fix(azure.py): support cost tracking for azure/dall-e-3 by krrishdholakia in https://github.com/BerriAI/litellm/pull/2475
* fix(openai.py): return model name with custom llm provider for openai-compatible endpoints (e.g. mistral, together ai, etc.) by krrishdholakia in https://github.com/BerriAI/litellm/pull/2473
![Group 5750](https://github.com/BerriAI/litellm/assets/29436595/c669b0e9-76a3-4cdd-885f-8489b80db122)
**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.30.2...v1.31.4