## 🚨 Relevant Changes
- **LiteLLM Proxy Virtual Keys**: unique key aliases are now enforced on `/key/generate` and `/key/update` requests (see the example request after the benefits list below)
- The Datadog integration now uses [StandardLoggingPayload](https://docs.litellm.ai/docs/proxy/logging#what-gets-logged) (from LiteLLM v1.53.0+) and also supports logging failures: https://github.com/BerriAI/litellm/pull/6929
If you need the v1 payload (not recommended), you can set this in your config:
```yaml
litellm_settings:
  datadog_use_v1: True
```
### Benefits of using StandardLoggingPayload for Datadog
- It's a standard logging object, so it stays consistent over time and across our logging integrations
- Adds support for logging LLM failures
- Includes additional info like `cache_hit`, `request_tags`, etc. The full payload is documented at https://docs.litellm.ai/docs/proxy/logging#what-gets-logged
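
For the key alias enforcement called out above, here's a minimal sketch of a request that is now rejected if the alias is already taken; the proxy URL and master key are placeholder values:

```shell
# Generate a key with an alias; with this release, reusing an existing
# key_alias on /key/generate or /key/update returns an error.
# http://localhost:4000 and sk-1234 are placeholders for your proxy URL and master key.
curl -X POST 'http://localhost:4000/key/generate' \
  -H 'Authorization: Bearer sk-1234' \
  -H 'Content-Type: application/json' \
  -d '{"key_alias": "prod-app-1"}'
```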
## What's Changed
* LiteLLM Minor Fixes & Improvements (11/24/2024) by @krrishdholakia in https://github.com/BerriAI/litellm/pull/6890
* (feat) pass through llm endpoints - add `PATCH` support (vertex context caching requires it for update ops) by @ishaan-jaff in https://github.com/BerriAI/litellm/pull/6924
* sonnet supports pdf, haiku does not by @paul-gauthier in https://github.com/BerriAI/litellm/pull/6928
* (feat) DataDog Logger - add failure logging + use Standard Logging payload by @ishaan-jaff in https://github.com/BerriAI/litellm/pull/6929
* (feat) log proxy auth errors on Datadog by @ishaan-jaff in https://github.com/BerriAI/litellm/pull/6931
* (feat) Allow using `include` to include external YAML files in a config.yaml by @ishaan-jaff in https://github.com/BerriAI/litellm/pull/6922 (see the config sketch after this list)
* (feat) dd logger - set tags according to the values set by those env vars by @ishaan-jaff in https://github.com/BerriAI/litellm/pull/6933 (see the env var sketch after this list)
* LiteLLM Minor Fixes & Improvements (11/26/2024) by @krrishdholakia in https://github.com/BerriAI/litellm/pull/6913
* LiteLLM Minor Fixes & Improvements (11/27/2024) by @krrishdholakia in https://github.com/BerriAI/litellm/pull/6943
* Update Argilla integration documentation by @sdiazlor in https://github.com/BerriAI/litellm/pull/6923
* (bug fix) /key/update was not storing `budget_duration` in the DB by @ishaan-jaff in https://github.com/BerriAI/litellm/pull/6941
* (fix) handle JSON decode errors for DD exception logging by @ishaan-jaff in https://github.com/BerriAI/litellm/pull/6934
* (docs + fix) Add docs on Moderations endpoint, Text Completion by @ishaan-jaff in https://github.com/BerriAI/litellm/pull/6947
* (feat) add enforcement for unique key aliases on /key/update and /key/generate by @ishaan-jaff in https://github.com/BerriAI/litellm/pull/6944
* (fix) tag merging / aggregation logic by @ishaan-jaff in https://github.com/BerriAI/litellm/pull/6932
* (feat) Allow disabling ErrorLogs written to the DB by @ishaan-jaff in https://github.com/BerriAI/litellm/pull/6940 (see the config sketch after this list)
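
For the new `include` support (PR #6922 above), a minimal config sketch; the file name `model_config.yaml` is illustrative:

```yaml
# config.yaml -- pull model definitions in from an external YAML file.
# model_config.yaml is an illustrative file name, not a required one.
include:
  - model_config.yaml

litellm_settings:
  callbacks: ["datadog"]
```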
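For the Datadog tag support (PR #6933), a sketch assuming the logger follows Datadog's standard `DD_*` environment variables; the exact variable set is an assumption based on Datadog conventions, so check the linked PR:

```shell
# Assumed env vars the Datadog logger reads to tag emitted logs.
export DD_ENV="production"
export DD_SERVICE="litellm-proxy"
export DD_VERSION="1.53.1"
```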
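And for disabling ErrorLogs writes to the DB (PR #6940), a sketch assuming the flag is exposed under `general_settings` as `disable_error_logs`; verify the exact key against the linked PR:

```yaml
general_settings:
  disable_error_logs: True  # assumed flag name; stops writing LLM exceptions to the DB
```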
## New Contributors
* @sdiazlor made their first contribution in https://github.com/BerriAI/litellm/pull/6923
**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.52.16...v1.53.1
## Docker Run LiteLLM Proxy

```shell
docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.53.1
```
Don't want to maintain your internal proxy? Get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat
## Load Test LiteLLM Proxy Results
| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 200.0 | 215.7709455547284 | 6.292082946554957 | 0.0 | 1882 | 0 | 178.3981389999667 | 2851.1550680000255 |
| Aggregated | Passed ✅ | 200.0 | 215.7709455547284 | 6.292082946554957 | 0.0 | 1882 | 0 | 178.3981389999667 | 2851.1550680000255 |
## v1.52.15.staging1
**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.52.15...v1.52.15.staging1
## Docker Run LiteLLM Proxy

```shell
docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.52.15.staging1
```
Don't want to maintain your internal proxy? Get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat
## Load Test LiteLLM Proxy Results
| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 250.0 | 278.6646185965574 | 6.211416620977041 | 0.0033412676820747935 | 1859 | 1 | 217.41687699994827 | 3149.612769999976 |
| Aggregated | Passed ✅ | 250.0 | 278.6646185965574 | 6.211416620977041 | 0.0033412676820747935 | 1859 | 1 | 217.41687699994827 | 3149.612769999976 |
## v1.52.15-stable
**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.52.15...v1.52.15-stable
## Docker Run LiteLLM Proxy

```shell
docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:litellm_stable_nov27-stable
```
Don't want to maintain your internal proxy? Get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat
## Load Test LiteLLM Proxy Results
| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 200.0 | 221.52349298020383 | 6.244722043862887 | 0.0 | 1869 | 0 | 181.6640519999737 | 2200.3593760000513 |
| Aggregated | Passed ✅ | 200.0 | 221.52349298020383 | 6.244722043862887 | 0.0 | 1869 | 0 | 181.6640519999737 | 2200.3593760000513 |