Litellm

Latest version: v1.61.11


1.18.3

Not secure
What's Changed

* [Feat] /key/generate - create keys with `team_id` by ishaan-jaff in https://github.com/BerriAI/litellm/pull/1500
Set `team_id` when creating users, keys
docs: https://docs.litellm.ai/docs/proxy/virtual_keys#request
```shell
curl 'http://0.0.0.0:8000/key/generate' \
--header 'Authorization: Bearer <your-master-key>' \
--header 'Content-Type: application/json' \
--data-raw '{
  "models": ["gpt-3.5-turbo", "gpt-4", "claude-2"],
  "team_id": "core-infra"
}'
```
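The same call can be made from Python with only the standard library. A minimal sketch, assuming the local proxy address used in the docs; `build_key_request` and `generate_key` are illustrative helper names, not part of LiteLLM:

```python
import json
import urllib.request

def build_key_request(models, team_id):
    # Payload mirrors the curl example above: the new key is scoped to a team.
    return {"models": models, "team_id": team_id}

def generate_key(master_key, models, team_id, base_url="http://0.0.0.0:8000"):
    payload = json.dumps(build_key_request(models, team_id)).encode()
    req = urllib.request.Request(
        f"{base_url}/key/generate",
        data=payload,
        headers={
            "Authorization": f"Bearer {master_key}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```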


* (fix) read azure ad token from optional params extra body by krrishdholakia in https://github.com/BerriAI/litellm/commit/e0aaa94f28304e362dabd39b164940834ac0fa50

**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.18.2...v1.18.3

1.18.2

Not secure
What's Changed
* [Test+Fix] /Key/Info, /Key/Update - Litellm unit test key endpoints by ishaan-jaff in https://github.com/BerriAI/litellm/pull/1496
* fix(ollama_chat.py): use tiktoken as backup for prompt token counting by puffo in https://github.com/BerriAI/litellm/pull/1495
* fix(parallel_request_limiter.py): decrement count for failed llm calls by krrishdholakia in https://github.com/BerriAI/litellm/commit/1ea3833ef7dad9d8fb6be32d724a5d81e496d358
* fix(proxy_server.py): show all models user has access to in /models by krrishdholakia in https://github.com/BerriAI/litellm/commit/c8dd36db9ec359fa174dcecdd6ed3c4ccf1d40de
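The ollama_chat fix above uses tiktoken as a backup for prompt token counting. The general fallback pattern looks roughly like this; a sketch of the idea, not LiteLLM's actual code, and the final whitespace heuristic is an illustrative last resort so counting never hard-fails:

```python
def count_prompt_tokens(text: str) -> int:
    # Prefer tiktoken's cl100k_base encoding when the package (and its
    # encoding data) is available; otherwise fall back to a rough
    # whitespace count rather than raising.
    try:
        import tiktoken
        enc = tiktoken.get_encoding("cl100k_base")
        return len(enc.encode(text))
    except Exception:
        return len(text.split())
```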

New Contributors
* puffo made their first contribution in https://github.com/BerriAI/litellm/pull/1495

**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.18.1...v1.18.2

1.18.1

Not secure
What's Changed
* Altered requirements.txt to require pyyaml 6.0.1 which will resolve 1488 by ShaunMaher in https://github.com/BerriAI/litellm/pull/1489
* Update s3 cache to support folder by duarteocarmo in https://github.com/BerriAI/litellm/pull/1494

New Contributors
* ShaunMaher made their first contribution in https://github.com/BerriAI/litellm/pull/1489
* duarteocarmo made their first contribution in https://github.com/BerriAI/litellm/pull/1494

**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.18.0...v1.18.1

1.18.0

Not secure
What's Changed
https://docs.litellm.ai/docs/simple_proxy
* [Feat] Proxy - Access Key metadata in callbacks by ishaan-jaff in https://github.com/BerriAI/litellm/pull/1484
- Access Proxy Key metadata in callbacks
- Access Endpoint URL in callbacks - you can see whether /chat/completions, /embeddings, /image/generation etc. was called
- Support for Langfuse Tags, We log request metadata as langfuse tags

PS. no keys leaked - these are keys to my local proxy
<img width="529" alt="Screenshot 2024-01-17 at 6 10 10 PM" src="https://github.com/BerriAI/litellm/assets/29436595/991744d2-2d83-49a0-bf31-c76d9d7bdaf4">
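A custom callback can then read that metadata. A minimal sketch; the exact location of the metadata dict inside the callback's `kwargs` (here `litellm_params.metadata`) is an assumption to verify against your LiteLLM version:

```python
def extract_proxy_metadata(kwargs: dict) -> dict:
    # Proxy key metadata is assumed to arrive under litellm_params.metadata;
    # return an empty dict when either level is missing.
    return ((kwargs.get("litellm_params") or {}).get("metadata")) or {}

class MetadataLogger:
    """Sketch of a success callback that inspects key metadata,
    including which endpoint was hit (/chat/completions, /embeddings, ...)."""

    def log_success_event(self, kwargs, response_obj, start_time, end_time):
        meta = extract_proxy_metadata(kwargs)
        print("endpoint:", meta.get("endpoint"))
        print("key metadata:", meta)
```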

Support for model access groups
Use this if you have keys with access to specific models, and you want to give them all access to a new model.

You can now assign keys access to model groups, and add new models to that group via the config.yaml - https://docs.litellm.ai/docs/proxy/users#grant-access-to-new-model

```bash
curl --location 'http://localhost:8000/key/generate' \
-H 'Authorization: Bearer <your-master-key>' \
-H 'Content-Type: application/json' \
-d '{
  "models": ["beta-models"],
  "max_budget": 0
}'
```

Here `"beta-models"` is the model access group, not an individual model name.
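On the config side, the linked docs attach access groups to models under `model_info` in config.yaml. A sketch with an illustrative model name; double-check the field names against the docs for your version:

```yaml
model_list:
  - model_name: new-beta-model
    litellm_params:
      model: openai/gpt-4
    model_info:
      access_groups: ["beta-models"]  # keys holding "beta-models" can now call this model
```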


Langfuse Tags logged:
<img width="949" alt="Screenshot 2024-01-17 at 6 11 36 PM" src="https://github.com/BerriAI/litellm/assets/29436595/a28af993-7414-405a-b5f3-63562caecd40">
* feat(proxy_server.py): support model access groups by krrishdholakia in https://github.com/BerriAI/litellm/pull/1483



**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.17.18...v1.18.0


1.17.18

Not secure
What's Changed
* [Fix+Test] /key/delete functions by ishaan-jaff in https://github.com/BerriAI/litellm/pull/1482 Added extensive testing + improved swagger
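For context, the endpoint under test revokes keys by ID. A minimal stdlib client sketch; the request shape (a `"keys"` array) is an assumption based on the virtual-keys docs, and `build_delete_payload`/`delete_keys` are illustrative names:

```python
import json
import urllib.request

def build_delete_payload(keys):
    # /key/delete is assumed to take the key IDs to revoke in a "keys" array.
    return {"keys": list(keys)}

def delete_keys(master_key, keys, base_url="http://0.0.0.0:8000"):
    req = urllib.request.Request(
        f"{base_url}/key/delete",
        data=json.dumps(build_delete_payload(keys)).encode(),
        headers={
            "Authorization": f"Bearer {master_key}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```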


**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.17.17...v1.17.18

1.17.17

Not secure
What's Changed
* [Test] Proxy - Unit Test proxy key gen by ishaan-jaff in https://github.com/BerriAI/litellm/pull/1478

Testing + fixes for: https://docs.litellm.ai/docs/proxy/virtual_keys

1. Generate a Key, and use it to make a call
2. Make a call with invalid key, expect it to fail
3. Make a call to a key with invalid model - expect to fail
4. Make a call to a key with valid model - expect to pass
5. Make a call with key over budget, expect to fail
6. Make a streaming chat/completions call with key over budget, expect to fail
7. Make a call with an key that never expires, expect to pass
8. Make a call with an expired key, expect to fail
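Checks like these can be exercised without a live proxy by injecting the transport. A hedged sketch of that pattern; the status codes and error shapes are assumptions for illustration, not LiteLLM's exact responses:

```python
def call_chat(send, api_key):
    """Make a /chat/completions call through an injected transport.

    send(path, headers, body) -> (status_code, response_dict), so a unit
    test can stub the proxy instead of running one.
    """
    status, data = send(
        "/chat/completions",
        {"Authorization": f"Bearer {api_key}"},
        {"model": "gpt-3.5-turbo",
         "messages": [{"role": "user", "content": "hi"}]},
    )
    if status == 401:
        raise PermissionError("invalid or expired key")
    if status == 400:
        raise RuntimeError(f"request rejected: {data.get('error')}")
    return data

# Case 2 from the list: a call with an invalid key should fail.
def fake_proxy(path, headers, body):
    return (401, {"error": "invalid key"})
```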


**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.17.16...v1.17.17

