## Codestral API: https://docs.litellm.ai/docs/providers/codestral
<img width="428" alt="Xnapper-2024-06-17-22 54 32" src="https://github.com/BerriAI/litellm/assets/29436595/481ec4ec-e247-426b-95e1-83e0fad1e579">
## What's Changed
* Langfuse Integration ignore Embedding Output by hburrichter in https://github.com/BerriAI/litellm/pull/4226
* [Refactor Proxy] - refactor proxy place internal user, customer endpoints in separate file by ishaan-jaff in https://github.com/BerriAI/litellm/pull/4223
* Add Gemini context window pricing by krrishdholakia in https://github.com/BerriAI/litellm/pull/4243
* Update utils.py (fix dangerous code) by CodeVigilanteOfficial in https://github.com/BerriAI/litellm/pull/4228
* fix: lunary callback tags by hughcrt in https://github.com/BerriAI/litellm/pull/4141
* build(deps): bump urllib3 from 2.2.1 to 2.2.2 by dependabot in https://github.com/BerriAI/litellm/pull/4251
* build(deps): bump ws from 7.5.9 to 7.5.10 in /docs/my-website by dependabot in https://github.com/BerriAI/litellm/pull/4252
* VertexAI/Gemini: Calculate cost based on context window by krrishdholakia in https://github.com/BerriAI/litellm/pull/4245
* [Feat] Add Codestral FIM API by ishaan-jaff in https://github.com/BerriAI/litellm/pull/4247
* [Feat] - Add Codestral Chat API by ishaan-jaff in https://github.com/BerriAI/litellm/pull/4242
* build(deps): bump sharp from 0.30.7 to 0.32.6 in /docs/my-website by dependabot in https://github.com/BerriAI/litellm/pull/4253
* [SECURITY] `/model/info`: strip out llm credential before returning by ushuz in https://github.com/BerriAI/litellm/pull/4244
* [New Feature] Add mock_tool_calls to `main.py` by jacquesyvesgl in https://github.com/BerriAI/litellm/pull/4195
* Fix file type handling of uppercase extensions by nick-rackauckas in https://github.com/BerriAI/litellm/pull/4182
* [Fix] Refactor Logfire to use LiteLLM OTEL Class by ishaan-jaff in https://github.com/BerriAI/litellm/pull/4254
* feat(main.py): Gemini (Google AI Studio) - Support Function Calling, Inline images, etc. by krrishdholakia in https://github.com/BerriAI/litellm/pull/4246
## New Contributors
* CodeVigilanteOfficial made their first contribution in https://github.com/BerriAI/litellm/pull/4228
* hughcrt made their first contribution in https://github.com/BerriAI/litellm/pull/4141
* jacquesyvesgl made their first contribution in https://github.com/BerriAI/litellm/pull/4195
**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.40.15...v1.40.16
## Docker Run LiteLLM Proxy

```
docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.40.16
```
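Once the proxy is up, the new Codestral FIM support (PR #4247) is reachable through the proxy's OpenAI-compatible text-completion interface. A minimal sketch of the request shape, using only the standard library; the `/completions` route, the `sk-1234` key, and the exact field names are illustrative assumptions based on the linked provider docs, not taken from this release:

```python
import json
import urllib.request

PROXY_URL = "http://localhost:4000"  # the Docker proxy started above


def build_fim_request(prompt: str, suffix: str) -> dict:
    # Fill-in-the-middle body: the model completes the gap
    # between `prompt` (code before the cursor) and `suffix` (code after).
    return {
        "model": "codestral-latest",
        "prompt": prompt,
        "suffix": suffix,
        "max_tokens": 64,
    }


def send(path: str, body: dict) -> dict:
    # POST a JSON body to the proxy and decode the JSON response.
    req = urllib.request.Request(
        f"{PROXY_URL}{path}",
        data=json.dumps(body).encode(),
        headers={
            "Content-Type": "application/json",
            "Authorization": "Bearer sk-1234",  # assumed proxy key
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)


# Example (requires the proxy from the Docker command above to be running):
# result = send("/completions",
#               build_fim_request("def fib(n):", "    return fib(n - 1) + fib(n - 2)"))
```

The same `send` helper works for the Codestral chat endpoint (PR #4242) by posting an OpenAI-style `messages` body to `/chat/completions` instead.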
Don't want to maintain your internal proxy? Get in touch:
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat
## Load Test LiteLLM Proxy Results
| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 120.0 | 147.67 | 6.32 | 0.0 | 1890 | 0 | 102.75 | 1718.08 |
| Aggregated | Passed ✅ | 120.0 | 147.67 | 6.32 | 0.0 | 1890 | 0 | 102.75 | 1718.08 |