Litellm

Latest version: v1.61.11


1.17.3

**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.17.2...v1.17.3

Supports health checks for text completion models. Set `mode: completion` under `model_info`:


```yaml
model_list:
  - model_name: azure-text-completion
    litellm_params:
      model: azure/text-davinci-003
      api_base: os.environ/AZURE_API_BASE
      api_key: os.environ/AZURE_API_KEY
      api_version: "2023-07-01-preview"
    model_info:
      mode: completion # 👈 ADD THIS
```
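
Once the proxy is running with this config, a minimal sketch of verifying the health check (the host, port, and master key below are assumptions; adjust them to your deployment):

```python
# Query the LiteLLM proxy /health endpoint to confirm the
# completion-mode model is picked up by the health check.
import requests

resp = requests.get(
    "http://0.0.0.0:8000/health",  # assumed local proxy address
    headers={"Authorization": "Bearer sk-1234"},  # hypothetical master key
)
resp.raise_for_status()
print(resp.json())  # reports healthy/unhealthy endpoints
```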

1.17.2

What's Changed
* [Feat] Improve LiteLLM Verbose Logs - show args passed to litellm function by ishaan-jaff in https://github.com/BerriAI/litellm/pull/1419
* (fix) create httpx.Request instead of httpx.request by dleen in https://github.com/BerriAI/litellm/pull/1422
* Add explicit dependency on requests library by dleen in https://github.com/BerriAI/litellm/pull/1421
* [Fix] Bedrock embeddings - support str `input` by ishaan-jaff in https://github.com/BerriAI/litellm/pull/1423
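
For PR #1423, a minimal sketch of the new behavior: `litellm.embedding` on a Bedrock model now accepts a plain string `input` instead of requiring a list. The model id is illustrative, and AWS credentials are assumed to be set in the environment:

```python
import litellm

# Previously Bedrock embeddings expected input=["..."]; a bare str now works too.
response = litellm.embedding(
    model="bedrock/amazon.titan-embed-text-v1",  # example Bedrock embedding model
    input="good morning from litellm",
)
print(response.data[0]["embedding"][:5])  # first few dimensions of the vector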

New Contributors
* dleen made their first contribution in https://github.com/BerriAI/litellm/pull/1422

**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.17.1...v1.17.2

1.17.1

What's Changed
LiteLLM Proxy - Log Responses to s3 Buckets
Docs on setting up s3 logging on the LiteLLM Proxy: https://docs.litellm.ai/docs/proxy/logging#logging-proxy-inputoutput---s3-buckets (cc Manouchehri)
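
The proxy wires this up through its config; as a rough library-level sketch of the same callback (the `s3_callback_params` keys here are assumptions drawn from the linked docs, and the bucket name is hypothetical):

```python
import litellm

litellm.success_callback = ["s3"]  # log successful responses to s3
litellm.s3_callback_params = {
    "s3_bucket_name": "my-litellm-logs",  # hypothetical bucket
    "s3_region_name": "us-west-2",
}

# Any subsequent completion call is logged to the configured bucket.
response = litellm.completion(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "hi"}],
)
```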

* LiteLLM Proxy Add s3 Logging by ishaan-jaff in https://github.com/BerriAI/litellm/pull/1408
* feat: added explicit args to acomplete by MateoCamara in https://github.com/BerriAI/litellm/pull/1200
* fix(router.py): bump httpx pool limits by krrishdholakia in https://github.com/BerriAI/litellm/pull/1415
* [Feat] Proxy - Log Cache Hits on success callbacks + Testing by ishaan-jaff in https://github.com/BerriAI/litellm/pull/1413

New Contributors
* MateoCamara made their first contribution in https://github.com/BerriAI/litellm/pull/1200

**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.17.0...v1.17.1

1.17.0

Major Changes
- LiteLLM Proxy now uses Gunicorn by default
If you use the LiteLLM Dockerfile or images, no changes are required. If you're using the litellm pip package, run `pip install 'litellm[proxy]' -U`
- Support for `litellm.ContentPolicyViolationError`, so you can catch content-policy errors raised by image generation models (see the sketch below)
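
A minimal sketch of catching the new exception around an image generation call (the model name and prompt are illustrative only):

```python
import litellm

try:
    litellm.image_generation(
        model="dall-e-3",  # hypothetical model/deployment name
        prompt="an example prompt the provider rejects on content-policy grounds",
    )
except litellm.ContentPolicyViolationError as e:
    # New in v1.17.0: a dedicated exception type for content-policy rejections
    print(f"Blocked by content policy: {e}")
```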

LiteLLM Proxy Dockerfiles
- Dockerfile.database uses `litellm` as the entrypoint, so you can pass litellm CLI args: https://github.com/BerriAI/litellm/blob/b103ca3960a8c42de09dd8c9ecfdf379bf298bba/Dockerfile.database#L59 (cc Manouchehri)
- Use https://github.com/BerriAI/litellm/pkgs/container/litellm for calling LLM APIs (without Virtual Keys)
- Use https://github.com/BerriAI/litellm/pkgs/container/litellm-database for calling LLM APIs + Virtual Keys (this build optimizes cold boot when using Prisma as the DB provider)


What's Changed
* [Feat] Add litellm.ContentPolicyViolationError by ishaan-jaff in https://github.com/BerriAI/litellm/pull/1381
* (fix): Self-serve UI, AUTH link generation error by speedyankur in https://github.com/BerriAI/litellm/pull/1385
* (fix): Self-serve UI, AUTH link generation error by speedyankur in https://github.com/BerriAI/litellm/pull/1386
* (fix): Self-serve UI, AUTH link generation error by speedyankur in https://github.com/BerriAI/litellm/pull/1391
* (caching) Fix incorrect usage of str, which created invalid JSON. by Manouchehri in https://github.com/BerriAI/litellm/pull/1390
* Litellm dockerfile testing by krrishdholakia in https://github.com/BerriAI/litellm/pull/1402
* fix(lowest_latency.py): add back tpm/rpm checks, configurable time window support, improved latency tracking by krrishdholakia in https://github.com/BerriAI/litellm/pull/1403
* LiteLLM Proxy - Use Gunicorn with Uvicorn workers by ishaan-jaff in https://github.com/BerriAI/litellm/pull/1399


**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.16.21...v1.17.0

1.16.21

What's Changed
* [Test+Fix] Use deployed proxy with Prisma by ishaan-jaff in https://github.com/BerriAI/litellm/pull/1376 (cc Manouchehri: fixed here, and we now test against a deployed LiteLLM proxy in our CI/CD)
* build(deps): bump follow-redirects from 1.15.2 to 1.15.4 in /docs/my-website by dependabot in https://github.com/BerriAI/litellm/pull/1380


**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.16.20...v1.16.21

1.16.20

What's Changed
* [Feat] Improve Proxy Logging by ishaan-jaff in https://github.com/BerriAI/litellm/pull/1356
* Control the proxy debug level using environment variables: https://docs.litellm.ai/docs/proxy/quick_start#debugging-proxy


* fix(proxy_server.py): add support for passing in config file via worker_config directly + testing by krrishdholakia in https://github.com/BerriAI/litellm/pull/1367
* Update deepinfra models by ichernev in https://github.com/BerriAI/litellm/pull/1368
* Updated Gemini AI Documentation by haseeb-heaven in https://github.com/BerriAI/litellm/pull/1370
* feat(lowest_latency.py): support expanded time window for latency based routing by krrishdholakia in https://github.com/BerriAI/litellm/pull/1369

New Contributors
* haseeb-heaven made their first contribution in https://github.com/BerriAI/litellm/pull/1370

**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.16.19...v1.16.20

v1.16-test2
What's Changed
* [Feat] Add litellm.ContentPolicyViolationError by ishaan-jaff in https://github.com/BerriAI/litellm/pull/1381
* (fix): Self-serve UI, AUTH link generation error by speedyankur in https://github.com/BerriAI/litellm/pull/1385


**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.16.21...v1.16-test2

