Llama-cpp-python

Latest version: v0.3.5

0.1.62

Not secure
- Metal support working
- Cache re-enabled (see the usage sketch below)
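
A minimal usage sketch for these two items, assuming a build of this era with Metal enabled at install time and the in-memory `LlamaCache` / `Llama.set_cache` API; the model path, `n_gpu_layers` value, and prompt are placeholders, not values from the release notes:

```python
from llama_cpp import Llama, LlamaCache

# Hypothetical GGML model path; n_gpu_layers=1 offloads work to the Metal GPU
# (the exact value required is an assumption for this release).
llm = Llama(
    model_path="./models/ggml-model-q4_0.bin",
    n_gpu_layers=1,
    n_ctx=2048,
)

# Re-attach the completion cache that this release re-enables.
llm.set_cache(LlamaCache())

out = llm("Q: What is the capital of France? A:", max_tokens=16)
print(out["choices"][0]["text"])
```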

0.1.61

Not secure
- Fix broken pip installation

0.1.60

NOTE: This release was deleted due to a bug with the packaging system that caused pip installations to fail.

- Truncate `max_tokens` in `create_completion` so the requested tokens don't exceed the context size (see the sketch below).
- Temporarily disable the cache for completion requests
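
A rough sketch of the truncation described above, assuming the clamp is taken against the prompt length and the model's context window; `clamp_max_tokens` is a hypothetical name for illustration, not the library's internal function:

```python
def clamp_max_tokens(max_tokens: int, prompt_tokens: int, n_ctx: int) -> int:
    """Clamp the requested completion length so prompt + completion fit within n_ctx."""
    remaining = max(0, n_ctx - prompt_tokens)
    return min(max_tokens, remaining)

# Example: a 2048-token context with a 2000-token prompt leaves room for 48 tokens,
# even if the caller asked for 256.
assert clamp_max_tokens(256, 2000, 2048) == 48
```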

0.1.59

Not secure
- (llama.cpp) k-quants support
- (server) Add mirostat sampling parameters to the server (see the request sketch below)
- Support both `.so` and `.dylib` for `libllama` on macOS
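
A sketch of passing the new mirostat parameters to the OpenAI-compatible server, assuming a local `llama_cpp.server` instance on its default port; the URL, endpoint path, and parameter values are assumptions for illustration:

```python
import requests

resp = requests.post(
    "http://localhost:8000/v1/completions",
    json={
        "prompt": "Q: Name the planets in the solar system. A:",
        "max_tokens": 64,
        # Mirostat sampling parameters exposed by the server in this release.
        "mirostat_mode": 2,   # 0 = off, 1 = Mirostat, 2 = Mirostat 2.0
        "mirostat_tau": 5.0,  # target entropy
        "mirostat_eta": 0.1,  # learning rate
    },
)
print(resp.json()["choices"][0]["text"])
```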

0.1.58

- (llama.cpp) Metal support on Apple Silicon

0.1.57

Not secure
- (llama.cpp) OpenLlama 3B support
