LiteLLM

Latest version: v1.52.14


1.22.5

Not secure
What's Changed
* Re-raise exception in async ollama streaming by vanpelt in https://github.com/BerriAI/litellm/pull/1750
* Add a Helm chart for deploying LiteLLM Proxy by ShaunMaher in https://github.com/BerriAI/litellm/pull/1602
* Update Perplexity models in model_prices_and_context_window.json by toniengelhardt in https://github.com/BerriAI/litellm/pull/1826
* (feat) Add sessionId for Langfuse. by Manouchehri in https://github.com/BerriAI/litellm/pull/1828
* [Feat] Sync model_prices_and_context_window.json and litellm/model_prices_and_context_window_backup.json by ishaan-jaff in https://github.com/BerriAI/litellm/pull/1834
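The re-raise fix in PR #1750 addresses a common pitfall with async streaming wrappers: an error in the underlying stream can be swallowed, making the stream appear to end cleanly. A minimal sketch of the pattern (all names here are illustrative, not LiteLLM's actual internals):

```python
import asyncio

async def stream_chunks(source):
    """Yield chunks from an async iterator, re-raising any failure."""
    try:
        async for chunk in source:
            yield chunk
    except Exception:
        # Re-raise so the caller's error handling (retries, logging) fires
        # instead of the stream silently ending early.
        raise

async def broken_source():
    # Hypothetical stream that fails after one chunk.
    yield "hello"
    raise ValueError("connection dropped")

async def main():
    received, error = [], None
    try:
        async for chunk in stream_chunks(broken_source()):
            received.append(chunk)
    except ValueError as e:
        error = str(e)
    return received, error

received, error = asyncio.run(main())
print(received, error)  # ['hello'] connection dropped
```

The caller still receives every chunk produced before the failure, and the exception surfaces with its original type and message.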

New Contributors
* vanpelt made their first contribution in https://github.com/BerriAI/litellm/pull/1750

**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.22.3...v1.22.5

1.22.3

Not secure
What's Changed
* feat(utils.py): support cost tracking for openai/azure image gen models by krrishdholakia in https://github.com/BerriAI/litellm/pull/1805


**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.22.2...v1.22.3

1.22.2

Not secure
Admin UI 🤠
- view spend and budget for the signed-in user
- view daily spend and top users for a key
![ui_3](https://github.com/BerriAI/litellm/assets/29436595/e379f469-7948-475d-a7b4-4c6c1ee1b392)
What's Changed
* Litellm vertex ai gecko support by krrishdholakia in https://github.com/BerriAI/litellm/pull/1794
* [Feat] Allow setting user roles for UserTable by ishaan-jaff in https://github.com/BerriAI/litellm/pull/1795
* fix(vertex_ai.py): add async embedding support for vertex ai by krrishdholakia in https://github.com/BerriAI/litellm/pull/1797
* [UI] Show UserID, user_role on UI by ishaan-jaff in https://github.com/BerriAI/litellm/pull/1800
* feat(bedrock.py): add stable diffusion image generation support by krrishdholakia in https://github.com/BerriAI/litellm/pull/1799
* [UI] view role on ui by ishaan-jaff in https://github.com/BerriAI/litellm/pull/1801
* [Feat] UI view spend / budget per user by ishaan-jaff in https://github.com/BerriAI/litellm/pull/1803
* fix(vertex_ai.py): treat vertex ai high-traffic error as a rate limit error - allows user-controlled backoff logic to work here by krrishdholakia in https://github.com/BerriAI/litellm/pull/1802
* [UI] View Key Spend Reports 🤠 by ishaan-jaff in https://github.com/BerriAI/litellm/pull/1807
* Support caching individual items in embedding list (Async embedding only) by krrishdholakia in https://github.com/BerriAI/litellm/pull/1809
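The last item above (PR #1809) caches embeddings per input item rather than per request: each text is looked up individually, only misses hit the backend, and results are merged back in input order. A hedged sketch of the idea, using a stand-in embedding function rather than LiteLLM's real API:

```python
cache = {}   # text -> embedding vector
calls = []   # record of backend calls, to show the cache working

def fake_embed(texts):
    """Stand-in for a real embedding API call (one vector per text)."""
    calls.append(list(texts))
    return [[float(len(t))] for t in texts]

def embed_with_item_cache(texts):
    """Embed a list of texts, fetching only cache misses from the backend."""
    misses = [t for t in texts if t not in cache]
    if misses:
        for text, vec in zip(misses, fake_embed(misses)):
            cache[text] = vec
    # Merge cached and fresh results back into the original input order.
    return [cache[t] for t in texts]

first = embed_with_item_cache(["hi", "hello"])
second = embed_with_item_cache(["hi", "world"])  # "hi" served from cache
```

After the second call, only `"world"` was sent to the backend; the overlapping item was served from the cache.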


**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.21.7...v1.22.2

1.21.7

Not secure
**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.21.6...v1.21.7

1.21.6

Not secure
What's Changed
* Litellm cost tracking caching fixes (should be 0.0) by krrishdholakia in https://github.com/BerriAI/litellm/pull/1786
* [Fix] /key/delete + add delete cache keys by ishaan-jaff in https://github.com/BerriAI/litellm/pull/1788


**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.21.5...v1.21.6

1.21.5

Not secure
What's Changed
⭐️ [Feat] Show correct provider in exceptions - for Mistral API, PerplexityAPI, Anyscale, XInference by ishaan-jaff in https://github.com/BerriAI/litellm/pull/1765, https://github.com/BerriAI/litellm/pull/1776
(Thanks dhruv-anand-aintech for the issue/help)
Exceptions for the Mistral API, Perplexity API, Anyscale, and XInference now show the correct provider name; previously, errors would claim `OPENAI_API_KEY` was missing even when using Perplexity AI.

```shell
exception: PerplexityException - Traceback (most recent call last):
  File "/Users/ishaanjaffer/Github/litellm/litellm/llms/perplexity.py", line 349, in completion
    raise e
  File "/Users/ishaanjaffer/Github/litellm/litellm/llms/perplexity.py", line 292, in completion
    perplexity_client = perplexity(
  File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/site-packages/perplexity/_client.py", line 98, in __init__
    raise perplexityError(
perplexity.perplexityError: The api_key client option must be set either by passing api_key to the client or by setting the PERPLEXITY_API_KEY environment variable
```

* fix(view_key_table_tsx): show abbreviated key name instead of hashed token by krrishdholakia in https://github.com/BerriAI/litellm/pull/1782
* fix(main.py): for health checks, don't use cached responses by krrishdholakia in https://github.com/BerriAI/litellm/pull/1785
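The provider-aware exception naming above can be sketched as a small mapping from the model string's provider prefix to an exception label. This is an illustrative sketch (the table and function names are hypothetical, not LiteLLM's actual code):

```python
# Map a provider prefix (e.g. "perplexity/pplx-70b-online") to the
# exception name a caller should see, defaulting to an OpenAI-style one.
PROVIDER_EXCEPTIONS = {
    "mistral": "MistralException",
    "perplexity": "PerplexityException",
    "anyscale": "AnyscaleException",
    "xinference": "XInferenceException",
}

def provider_exception_name(model: str) -> str:
    """Return the exception label for the provider encoded in `model`."""
    provider = model.split("/", 1)[0] if "/" in model else "openai"
    return PROVIDER_EXCEPTIONS.get(provider, "OpenAIException")

print(provider_exception_name("perplexity/pplx-70b-online"))
# PerplexityException
```

With this kind of dispatch, a missing Perplexity key reports a `PerplexityException` rather than a generic OpenAI error.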


**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.21.4...v1.21.5



© 2024 Safety CLI Cybersecurity Inc. All Rights Reserved.