LiteLLM

Latest version: v1.52.14


1.15.0

What's Changed
LiteLLM Proxy now maps exceptions for 100+ LLMs to the OpenAI format https://docs.litellm.ai/docs/proxy/quick_start
🧨 Log all LLM input/output to [DynamoDB](https://twitter.com/dynamodb): set `litellm.success_callback = ["dynamodb"]` https://docs.litellm.ai/docs/proxy/logging#logging-proxy-inputoutput---dynamodb
⭐️ Support for [MistralAI](https://twitter.com/MistralAI) API, Gemini PRO
🔎 Set Aliases for model groups on LiteLLM Proxy
🔎 Exception mapping for openai.NotFoundError live now + testing for exception mapping on proxy added to LiteLLM ci/cd https://docs.litellm.ai/docs/exception_mapping
⚙️ Fixes for async + streaming caching https://docs.litellm.ai/docs/proxy/caching
👉 Support for using Async logging with [langfuse](https://twitter.com/langfuse) live on proxy
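The exception-mapping idea above can be sketched in plain Python. The class and provider names below are illustrative stand-ins, not litellm's internal mapping table:

```python
# Sketch of exception mapping: provider-specific errors are translated to a
# single OpenAI-style hierarchy, so callers only handle one set of exceptions.
class OpenAIStyleNotFoundError(Exception):
    """Stands in for openai.NotFoundError."""

class ProviderModelMissing(Exception):
    """Stands in for a provider-specific 'model not found' error."""

EXCEPTION_MAP = {
    ProviderModelMissing: OpenAIStyleNotFoundError,
}

def map_exception(exc: Exception) -> Exception:
    """Return the OpenAI-style equivalent of a provider exception, if known."""
    mapped = EXCEPTION_MAP.get(type(exc))
    return mapped(str(exc)) if mapped else exc

try:
    raise ProviderModelMissing("model 'foo' does not exist")
except Exception as e:
    mapped = map_exception(e)

print(type(mapped).__name__)  # OpenAIStyleNotFoundError
```

Unknown exceptions pass through unchanged, so callers never lose information for providers without a mapping entry.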

AI Generated Release Notes
* Enable setting default `model` value for `LiteLLM`, `Chat`, `Completions` by estill01 in https://github.com/BerriAI/litellm/pull/985
* fix replicate system prompt: forgot to add **optional_params to input data by nbaldwin98 in https://github.com/BerriAI/litellm/pull/1080
* Update factory.py to fix issue when calling from write-the -> langchain -> litellm served ollama by James4Ever0 in https://github.com/BerriAI/litellm/pull/1054
* Update Dockerfile to preinstall Prisma CLI by Manouchehri in https://github.com/BerriAI/litellm/pull/1039
* build(deps): bump aiohttp from 3.8.6 to 3.9.0 by dependabot in https://github.com/BerriAI/litellm/pull/937
* multistage docker build by wallies in https://github.com/BerriAI/litellm/pull/995
* fix: traceloop links by nirga in https://github.com/BerriAI/litellm/pull/1123
* refactor: add CustomStreamWrapper return type for completion by Undertone0809 in https://github.com/BerriAI/litellm/pull/1112
* fix langfuse tests by maxdeichmann in https://github.com/BerriAI/litellm/pull/1097
* Fix 1119, no content when streaming. by emsi in https://github.com/BerriAI/litellm/pull/1122
* docs(projects): add Docq to 'projects built on..' section by janaka in https://github.com/BerriAI/litellm/pull/1142
* docs(projects): add Docq.AI to sidebar nav by janaka in https://github.com/BerriAI/litellm/pull/1143

New Contributors
* James4Ever0 made their first contribution in https://github.com/BerriAI/litellm/pull/1054
* wallies made their first contribution in https://github.com/BerriAI/litellm/pull/995
* maxdeichmann made their first contribution in https://github.com/BerriAI/litellm/pull/1097
* emsi made their first contribution in https://github.com/BerriAI/litellm/pull/1122
* janaka made their first contribution in https://github.com/BerriAI/litellm/pull/1142

**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.11.1...v1.15.0

1.11.1

Proxy
- Bug fix for non-OpenAI LLMs on the proxy
- Major stability improvements and fixes, plus added test cases for the proxy
- Async success/failure loggers
- Support for using custom loggers with `aembedding()`
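An async success logger along the lines described above can be sketched as follows. The hook name, signature, and `fake_aembedding` helper are illustrative, not litellm's exact callback API:

```python
import asyncio

LOGGED = []

async def async_success_logger(kwargs, response, start_time, end_time):
    # A real logger might write to a database or observability backend here.
    LOGGED.append({"model": kwargs.get("model"), "latency": end_time - start_time})

async def fake_aembedding(model, text):
    await asyncio.sleep(0)  # stands in for the provider call
    response = {"embedding": [0.0] * 3}
    # The proxy awaits the logger after a successful call.
    await async_success_logger({"model": model}, response, 0.0, 0.01)
    return response

asyncio.run(fake_aembedding("text-embedding-ada-002", "hello"))
print(LOGGED[0]["model"])  # text-embedding-ada-002
```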


What's Changed
* feat: add docker compose file and running guide by geekyayush in https://github.com/BerriAI/litellm/pull/993
* (feat) Speedup health endpoint by PSU3D0 in https://github.com/BerriAI/litellm/pull/1023
* (pricing) Add Claude v2.1 for Bedrock by Manouchehri in https://github.com/BerriAI/litellm/pull/1042

New Contributors
* geekyayush made their first contribution in https://github.com/BerriAI/litellm/pull/993

**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.10.4...v1.11.1

1.10.4

Note: the Proxy Server on 1.10.4 has a bug for non-OpenAI LLMs; fixed in 1.10.11

Updates Proxy Server
- Use custom callbacks on the proxy https://docs.litellm.ai/docs/proxy/logging
- Set `timeout` and `stream_timeout` per model https://docs.litellm.ai/docs/proxy/load_balancing#custom-timeouts-stream-timeouts---per-model
- Stability: Added testing for reading config.yaml on the proxy
- *NEW* `/model/new` + `/model/info` endpoints - Add new models + Get model info without restarting proxy.
- Custom user auth - https://github.com/BerriAI/litellm/issues/898#issuecomment-1826396106
- Key security: keys are now stored only as hashes in the DB
- User IDs accepted and passed through to OpenAI/Azure
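A hypothetical `config.yaml` fragment for the per-model timeouts mentioned above; the model names and values are placeholders, so see the linked load-balancing docs for the exact schema:

```yaml
model_list:
  - model_name: gpt-4-team
    litellm_params:
      model: azure/gpt-4
      api_base: https://example-endpoint.openai.azure.com/
      timeout: 30          # total request timeout, in seconds
      stream_timeout: 10   # timeout between streamed chunks, in seconds
```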

`litellm` Package
- Specify `kwargs` for Redis Cache https://github.com/BerriAI/litellm/commit/9ba17657ad664a21b5e91259a152db58540be024
- Fixes for Sagemaker + Palm Streaming
- Support for async custom callbacks - https://docs.litellm.ai/docs/observability/custom_callback#async-callback-functions
- Major improvements to stream chunk builder - support for parallel tool calling, system fingerprints, etc.
- Fixes for azure / openai streaming (return complete response object)
- Support for loading keys from azure key vault - https://docs.litellm.ai/docs/secret#azure-key-vault
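The stream chunk builder's job can be illustrated with a minimal sketch that assembles OpenAI-style streaming deltas into one response. The chunk shapes are simplified, and this is not litellm's actual builder:

```python
# Accumulate streamed deltas into one complete chat response.
def build_response(chunks):
    content = "".join(
        c["choices"][0]["delta"].get("content", "") for c in chunks
    )
    return {"choices": [{"message": {"role": "assistant", "content": content}}]}

chunks = [
    {"choices": [{"delta": {"role": "assistant", "content": "Hel"}}]},
    {"choices": [{"delta": {"content": "lo!"}}]},
    {"choices": [{"delta": {}}]},  # the final chunk often has an empty delta
]
print(build_response(chunks)["choices"][0]["message"]["content"])  # Hello!
```

The real builder also has to reassemble parallel tool calls and carry fields like system fingerprints, which is what makes it more involved than this concatenation.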

What's Changed
* docs: adds gpt-3.5-turbo-1106 in supported models by rishabgit in https://github.com/BerriAI/litellm/pull/958
* (feat) Allow installing proxy dependencies explicitly with `pip install litellm[proxy]` by PSU3D0 in https://github.com/BerriAI/litellm/pull/966
* Mention Neon as a database option in docs by Manouchehri in https://github.com/BerriAI/litellm/pull/977
* fix system prompts for replicate by nbaldwin98 in https://github.com/BerriAI/litellm/pull/970

New Contributors
* rishabgit made their first contribution in https://github.com/BerriAI/litellm/pull/958
* PSU3D0 made their first contribution in https://github.com/BerriAI/litellm/pull/966
* nbaldwin98 made their first contribution in https://github.com/BerriAI/litellm/pull/970

**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.7.11...v1.10.4

1.7.11

💥 LiteLLM Proxy + Router now handle 500+ requests/second, load balance Azure + OpenAI deployments, and track spend per user 💥
Try it here: https://docs.litellm.ai/docs/simple_proxy
🔑 Support for `AZURE_OPENAI_API_KEY` on Azure https://docs.litellm.ai/docs/providers/azure h/t [solyarisoftware](https://twitter.com/solyarisoftware)
⚡️ LiteLLM Router can now handle 20% more throughput [https://docs.litellm.ai/docs/routing](https://t.co/aLqlWFE5CM)
📖 Improved litellm debugging docs h/t [solyarisoftware](https://twitter.com/solyarisoftware) [https://docs.litellm.ai/docs/debugging/local_debugging](https://t.co/D4UhfS1AUP)


**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.7.1...v1.7.11

1.7.1

What's Changed
- **🚨 LiteLLM Proxy uses Async completion/embedding calls on this release onwards - this led to 30x more throughput for embedding/completion calls**
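A toy illustration of why switching to async calls raises throughput: concurrent requests overlap their wait time instead of queuing. The sleep stands in for an LLM call; nothing here is litellm code:

```python
import asyncio
import time

async def fake_completion(i):
    await asyncio.sleep(0.01)  # stands in for a 10 ms LLM call
    return i

async def main(n=20):
    # All n requests run concurrently instead of one after another.
    return await asyncio.gather(*(fake_completion(i) for i in range(n)))

start = time.perf_counter()
results = asyncio.run(main())
elapsed = time.perf_counter() - start
# 20 concurrent 10 ms calls finish in roughly 10 ms, not the 200 ms a
# sequential loop would take.
print(len(results))  # 20
```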



New Contributors
* guspan-tanadi made their first contribution in https://github.com/BerriAI/litellm/pull/851
* Manouchehri made their first contribution in https://github.com/BerriAI/litellm/pull/880
* maqsoodshaik made their first contribution in https://github.com/BerriAI/litellm/pull/884
* okotek made their first contribution in https://github.com/BerriAI/litellm/pull/885
* kumaranvpl made their first contribution in https://github.com/BerriAI/litellm/pull/902

**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.1.0...v1.7.1

1.1.0

- Requires `openai>=1.0.0`
- `openai.InvalidRequestError` → `openai.BadRequestError`
- `openai.ServiceUnavailableError` → `openai.APIStatusError`
- *NEW* litellm client, allowing users to pass an `api_key`
- `litellm.Litellm(api_key="sk-123")`
- response objects now inherit from `BaseModel` (prev. `OpenAIObject`)
- *NEW* default exception - `APIConnectionError` (prev. `APIError`)
- `litellm.get_max_tokens()` now returns an int, not a dict

```python
max_tokens = litellm.get_max_tokens("gpt-3.5-turbo")  # returns an int, not a dict
assert max_tokens == 4097
```
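For code written against `openai<1.0`, the renames above suggest a small compatibility alias during migration. A sketch with stand-in classes, not the real `openai` ones:

```python
# Stand-ins for the renamed exception classes listed above.
class BadRequestError(Exception):      # was openai.InvalidRequestError
    pass

class APIStatusError(Exception):       # was openai.ServiceUnavailableError
    pass

# Old code that catches InvalidRequestError keeps working via an alias.
InvalidRequestError = BadRequestError

try:
    raise BadRequestError("bad params")
except InvalidRequestError as e:
    caught = str(e)

print(caught)  # bad params
```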


Other updates
* Update function calling docs by kevinjyee in https://github.com/BerriAI/litellm/pull/673
* Fix data being overwritten by mc-marcocheng in https://github.com/BerriAI/litellm/pull/679
* Updating the docker image builder for GitHub Action by coconut49 in https://github.com/BerriAI/litellm/pull/678
* fix: bugs in traceloop integration by nirga in https://github.com/BerriAI/litellm/pull/647
* Router aembedding by mc-marcocheng in https://github.com/BerriAI/litellm/pull/691
* support release and debug params for langfuse client by SlapDrone in https://github.com/BerriAI/litellm/pull/695
* docs error ==> openai.error instead of openai.errors by josearangos in https://github.com/BerriAI/litellm/pull/700
* refactor Contributing to documentation steps by josearangos in https://github.com/BerriAI/litellm/pull/713
* Fix Router.set_model_list & Avoid overwriting litellm_params by mc-marcocheng in https://github.com/BerriAI/litellm/pull/706
* Update Together AI pricing by dedeswim in https://github.com/BerriAI/litellm/pull/724
* Update README.md by chinmay7016 in https://github.com/BerriAI/litellm/pull/727
* Router.get_available_deployment: Handle empty input edge case by mc-marcocheng in https://github.com/BerriAI/litellm/pull/729
* Fix caching for Router by karvetskiy in https://github.com/BerriAI/litellm/pull/722
* support for custom bedrock runtime endpoint by canada4663 in https://github.com/BerriAI/litellm/pull/717
* Use supplied headers by stanfea in https://github.com/BerriAI/litellm/pull/741
* Docker Hub image is built for ARM64 only by morgendigital in https://github.com/BerriAI/litellm/pull/734
* doc name change by kylehh in https://github.com/BerriAI/litellm/pull/764
* fix: fix bug for the case --model is not specified by clalanliu in https://github.com/BerriAI/litellm/pull/781
* add custom open ai models to asyncio call by PrathamSoni in https://github.com/BerriAI/litellm/pull/789
* Fix bad returns in get_available_deployment by nathankim7 in https://github.com/BerriAI/litellm/pull/790
* Improve message trimming by duc-phamh in https://github.com/BerriAI/litellm/pull/787
* build(deps): bump postcss from 8.4.27 to 8.4.31 in /docs/my-website by dependabot in https://github.com/BerriAI/litellm/pull/804
* build(deps): bump urllib3 from 2.0.5 to 2.0.7 by dependabot in https://github.com/BerriAI/litellm/pull/805
* build(deps): bump babel/traverse from 7.22.10 to 7.23.3 in /docs/my-website by dependabot in https://github.com/BerriAI/litellm/pull/806
* Fix ServiceUnavailableError super.__init__ error by jackmpcollins in https://github.com/BerriAI/litellm/pull/813
* Update Together prices by dedeswim in https://github.com/BerriAI/litellm/pull/814
* need to re-attempt backoff and yaml imports if the first import attempt fails by kfsone in https://github.com/BerriAI/litellm/pull/820
* Fix typo for initial_prompt_value and too many values to unpack error by rodneyxr in https://github.com/BerriAI/litellm/pull/826
* Bedrock llama by dchristian3188 in https://github.com/BerriAI/litellm/pull/811
* build(deps): bump sharp from 0.32.5 to 0.32.6 in /docs/my-website by dependabot in https://github.com/BerriAI/litellm/pull/832

New Contributors
* kevinjyee made their first contribution in https://github.com/BerriAI/litellm/pull/673
* mc-marcocheng made their first contribution in https://github.com/BerriAI/litellm/pull/679
* SlapDrone made their first contribution in https://github.com/BerriAI/litellm/pull/695
* josearangos made their first contribution in https://github.com/BerriAI/litellm/pull/700
* dedeswim made their first contribution in https://github.com/BerriAI/litellm/pull/724
* chinmay7016 made their first contribution in https://github.com/BerriAI/litellm/pull/727
* karvetskiy made their first contribution in https://github.com/BerriAI/litellm/pull/722
* stanfea made their first contribution in https://github.com/BerriAI/litellm/pull/741
* morgendigital made their first contribution in https://github.com/BerriAI/litellm/pull/734
* clalanliu made their first contribution in https://github.com/BerriAI/litellm/pull/781
* PrathamSoni made their first contribution in https://github.com/BerriAI/litellm/pull/789
* nathankim7 made their first contribution in https://github.com/BerriAI/litellm/pull/790
* duc-phamh made their first contribution in https://github.com/BerriAI/litellm/pull/787
* dependabot made their first contribution in https://github.com/BerriAI/litellm/pull/804
* jackmpcollins made their first contribution in https://github.com/BerriAI/litellm/pull/813
* kfsone made their first contribution in https://github.com/BerriAI/litellm/pull/820
* rodneyxr made their first contribution in https://github.com/BerriAI/litellm/pull/826
* dchristian3188 made their first contribution in https://github.com/BerriAI/litellm/pull/811

**Full Changelog**: https://github.com/BerriAI/litellm/compare/v0.11.1...v1.1.0
