llama-cpp-python

Latest version: v0.3.5

0.1.74

Not secure
- (server) OpenAI-style error responses
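
As a client-side sketch of what such a response looks like, assuming a server already running on the default local port; the endpoint, the deliberately invalid parameter, and the exact payload values are illustrative, following the OpenAI error convention:

```python
# Hypothetical client-side view of an OpenAI-style error response.
# Assumes a llama-cpp-python server running on localhost:8000; the
# invalid value below is only meant to trigger a validation error.
import requests

resp = requests.post(
    "http://localhost:8000/v1/completions",
    json={"prompt": "Hello", "max_tokens": -1},  # invalid on purpose
)
print(resp.status_code)
print(resp.json())
# Expected shape (OpenAI convention; exact values are illustrative):
# {"error": {"message": "...", "type": "invalid_request_error",
#            "param": "max_tokens", "code": null}}
```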

0.1.73

Not secure
- (server) Add rope parameters to server settings
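
A minimal launch sketch, assuming the new settings surface as CLI flags with the same names; the model path is hypothetical and a pre-GGUF `.bin` model is assumed for this era of the library:

```python
# Hypothetical server launch using the rope settings added in this release.
# Flag names are assumed to mirror the setting names; check
# `python -m llama_cpp.server --help` for your version.
import subprocess
import sys

subprocess.run([
    sys.executable, "-m", "llama_cpp.server",
    "--model", "./models/llama-2-7b.ggmlv3.q4_0.bin",  # hypothetical path
    "--rope_freq_base", "10000.0",
    "--rope_freq_scale", "0.5",  # linear scaling: 0.5 stretches positions ~2x
])
```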

0.1.72

Not secure
- (llama.cpp) Update llama.cpp; adds `custom_rope` for extended context lengths
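
At the library level, a minimal sketch of the same feature, assuming the bindings expose the rope parameters on the `Llama` constructor (parameter names and the model path are assumptions; check your version's API):

```python
# Hypothetical use of rope scaling to run with a context beyond the
# model's 2048-token training length; parameter names are assumptions.
from llama_cpp import Llama

llm = Llama(
    model_path="./models/llama-2-7b.ggmlv3.q4_0.bin",  # hypothetical path
    n_ctx=4096,              # request an extended context window
    rope_freq_base=10000.0,  # rotary embedding base frequency
    rope_freq_scale=0.5,     # 0.5 stretches positions ~2x (4096 from 2048)
)
out = llm("Q: What does rope scaling do? A:", max_tokens=64)
print(out["choices"][0]["text"])
```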

0.1.71

Not secure
- (llama.cpp) Update llama.cpp
- (server) Fix several pydantic v2 migration bugs
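
For context, the kind of breaking change such bugs stem from (this is generic pydantic v1-to-v2 migration, not the actual llama-cpp-python code; the `Settings` class here is made up):

```python
# Illustrative pydantic v2 idioms that replace their v1 equivalents.
from pydantic import BaseModel, ConfigDict

class Settings(BaseModel):  # hypothetical settings model
    # v1: class Config: allow_population_by_field_name = True
    model_config = ConfigDict(populate_by_name=True)

    n_ctx: int = 2048

s = Settings(n_ctx=4096)
print(s.model_dump())  # v1 equivalent: s.dict()
```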

0.1.70

Not secure
- (Llama.create_completion) Revert change so that `max_tokens` is not truncated to `context_size` in `create_completion` (see the sketch after this list)
- (server) Fixed settings field names that changed in the pydantic v2 migration
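
A sketch of the reverted behavior, assuming a small context window; the model path is hypothetical. After the revert, a `max_tokens` larger than the remaining context is no longer silently clamped; generation simply ends when the context fills or a stop condition hits:

```python
# Hypothetical demonstration: max_tokens may exceed the context size.
from llama_cpp import Llama

llm = Llama(model_path="./models/llama-2-7b.ggmlv3.q4_0.bin", n_ctx=512)
out = llm.create_completion(
    "Write a short poem about llamas.",
    max_tokens=2048,  # no longer truncated to the 512-token context
)
print(out["choices"][0]["text"])
print(out["choices"][0]["finish_reason"])  # e.g. "length" when context fills
```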

0.1.69

Not secure
- (server) Streaming requests can now be interrupted prematurely when a concurrent request is made. This behavior can be controlled with the `interrupt_requests` setting (see the launch sketch after this list).
- (server) Moved to fastapi v0.100.0 and pydantic v2
- (docker) Added a new "simple" image that builds llama.cpp from source when started.
- (server) Performance improvements by avoiding unnecessary memory allocations during sampling
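
A launch sketch with interruption disabled, assuming the server's pydantic settings can also be supplied as environment variables (the variable names mirror the setting names and are assumptions; the model path is hypothetical):

```python
# Hypothetical: run the bundled server with `interrupt_requests` disabled
# so concurrent requests do not cut off in-flight streams.
import os
import subprocess
import sys

env = dict(
    os.environ,
    MODEL="./models/llama-2-7b.ggmlv3.q4_0.bin",  # hypothetical model path
    INTERRUPT_REQUESTS="False",  # assumed env name for the new setting
)
subprocess.run([sys.executable, "-m", "llama_cpp.server"], env=env)
```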
