Litellm

Latest version: v1.52.14


1.17.13

Not secure
What's Changed
Proxy Virtual Keys Improvements
* [Test] Add better testing for /key/generate, /user/new by ishaan-jaff in https://github.com/BerriAI/litellm/pull/1465

Added testing + minor fixes for the following scenarios:
1. Generate a key and use it to make a call
2. Make a call with an invalid key - expect it to fail
3. Make a call to a key with an invalid model - expect it to fail
4. Make a call to a key with a valid model - expect it to pass
5. Make a call with a key over budget - expect it to fail
6. Make a streaming chat/completions call with a key over budget - expect it to fail


**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.17.12...v1.17.13

1.17.12

Not secure
What's Changed
LiteLLM Proxy:
https://docs.litellm.ai/docs/proxy/virtual_keys
- /key/generate, user_auth: fixed a bug in how key expiry time was checked
- user_auth: requests now fail when a user crosses their budget
- user_auth: requests now fail when a user crosses their budget on streaming requests too

PRs with fixes
* [Fix] Fixes for /key/gen, /user/new by ishaan-jaff in https://github.com/BerriAI/litellm/pull/1464
* [Fix] Proxy - Budget Tracking for Streaming Requests for /key/generate by ishaan-jaff in https://github.com/BerriAI/litellm/pull/1466


**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.17.10...v1.17.12

1.17.10

Not secure
What's Changed
LiteLLM Proxy:
* [FEAT] Proxy set NUM_WORKERS, HOST, PORT as env variable by ishaan-jaff in https://github.com/BerriAI/litellm/pull/1461

Usage
```shell
export NUM_WORKERS=4
litellm --config config.yaml
```

https://docs.litellm.ai/docs/proxy/cli


**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.17.9...v1.17.10

1.17.9

Not secure
**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.17.8...v1.17.9

1.17.8

Not secure
🪨 Major improvements to Bedrock, Sagemaker exception mapping, catch litellm.ContextWindowError
📖 Improved docs for litellm image generation, thanks to cmungall (https://www.linkedin.com/in/ACoAAABOqa4BzkdZWMYCCB3xnNBP21KI9Ngyw_Q)
⭐️ LiteLLM Proxy - now you can reject LLM Responses if they violate your policies: https://docs.litellm.ai/docs/proxy/rules
🛠️Added testing for LiteLLM Proxy exception mapping, Sagemaker + Bedrock exception mapping for ContextWindowError
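The rules feature linked above lets the proxy reject an LLM response that violates a policy. The real registration interface is described at https://docs.litellm.ai/docs/proxy/rules; the sketch below is purely hypothetical (banned-phrase list, function name, and check logic are all illustrative) and only shows the reject-on-policy-violation idea:

```python
# Hypothetical policy check illustrating the "reject LLM responses" idea.
# See docs.litellm.ai/docs/proxy/rules for LiteLLM's actual rule interface.
BANNED_PHRASES = ["ssn", "credit card number"]

def response_violates_policy(response_text: str) -> bool:
    """Return True if the response contains a banned phrase (case-insensitive)."""
    text = response_text.lower()
    return any(phrase in text for phrase in BANNED_PHRASES)

print(response_violates_policy("Here is my credit card number"))  # True
print(response_violates_policy("Hello, world"))                   # False
```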

1.17.7

Not secure
What's Changed
* [Feat] Litellm Proxy improve exception mapping by ishaan-jaff in https://github.com/BerriAI/litellm/pull/1449
All exceptions from the LiteLLM Proxy are returned in the following format, which matches the OpenAI API exception format exactly:
```json
{
  "error": {
    "message": "'[{role: user, content: hi}]' is not of type 'array' - 'messages'",
    "type": "invalid_request_error",
    "param": null,
    "code": null
  }
}
```
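Because proxy errors follow the OpenAI exception format, clients can parse any failure uniformly. A minimal sketch using the payload shown above:

```python
import json

# The error payload shown above, as returned by the proxy.
raw = '''{
  "error": {
    "message": "'[{role: user, content: hi}]' is not of type 'array' - 'messages'",
    "type": "invalid_request_error",
    "param": null,
    "code": null
  }
}'''

err = json.loads(raw)["error"]
print(err["type"])  # invalid_request_error
```

Clients written against the OpenAI SDK can therefore reuse their existing error handling unchanged when pointed at the proxy.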


* Fix minor typos in image_generation.md by cmungall in https://github.com/BerriAI/litellm/pull/1450

New Contributors
* cmungall made their first contribution in https://github.com/BerriAI/litellm/pull/1450

**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.17.6...v1.17.7

