llama-cpp-python

Latest version: v0.3.8

0.1.65

Not secure
- (llama.cpp) Fix struct misalignment bug

0.1.64

Not secure
- (llama.cpp) Update llama.cpp
- Fix docs for seed: set -1 for a random seed (see the sketch below).
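
A minimal sketch of the corrected seed behaviour, assuming the library's Llama constructor; the model path and prompt are placeholders, not part of the release notes:

```python
from llama_cpp import Llama

# seed=-1 requests a random seed, per the corrected docs in this release.
# The model path below is an illustrative placeholder.
llm = Llama(model_path="models/7B/ggml-model.bin", seed=-1)

out = llm("Q: Name the planets in the solar system. A:", max_tokens=32)
print(out["choices"][0]["text"])
```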

0.1.63

Not secure
- (llama.cpp) Add full GPU utilisation in CUDA
- (llama.cpp) Add get_vocab
- (llama.cpp) Add low_vram parameter
- (server) Add logit_bias parameter (see the sketch below)
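
A hedged sketch of exercising the new logit_bias parameter against the server's OpenAI-compatible completions endpoint; the port (8000 is the server's usual default) and the token ID are assumptions for illustration:

```python
import requests

# Assumes a locally running server, e.g. started with `python -m llama_cpp.server`.
url = "http://localhost:8000/v1/completions"

# logit_bias maps token IDs to additive biases; a large negative value
# effectively bans a token. Token ID 15043 is purely illustrative.
payload = {
    "prompt": "The capital of France is",
    "max_tokens": 8,
    "logit_bias": {"15043": -100.0},
}

response = requests.post(url, json=payload)
print(response.json()["choices"][0]["text"])
```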

0.1.62

Not secure
- Metal support working
- Cache re-enabled (see the sketch below)
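
A short sketch touching both items, assuming a Metal-enabled build of the library and its LlamaCache class; the model path is a placeholder:

```python
from llama_cpp import Llama, LlamaCache

# n_gpu_layers=1 offloads work to the GPU on a Metal build (assumption:
# the package was compiled with Metal support). The path is a placeholder.
llm = Llama(model_path="models/7B/ggml-model.bin", n_gpu_layers=1)

# Re-enable the prompt cache so repeated calls sharing a prompt prefix
# can reuse previously evaluated state.
llm.set_cache(LlamaCache())

print(llm("Q: What is 2 + 2? A:", max_tokens=8)["choices"][0]["text"])
```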

0.1.61

Not secure
- Fix broken pip installation

0.1.60

NOTE: This release was deleted due to a bug with the packaging system that caused pip installations to fail.

- Truncate max_tokens in create_completion so the requested tokens don't exceed the context size (see the sketch below).
- Temporarily disable cache for completion requests
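
An illustrative sketch of the truncation rule, not the library's actual code: max_tokens is clamped so that the prompt plus the completion fit within the model's context window n_ctx.

```python
# Hypothetical helper demonstrating the clamping behaviour described above.
def clamp_max_tokens(max_tokens: int, prompt_token_count: int, n_ctx: int) -> int:
    # Never request more tokens than remain in the context window.
    return min(max_tokens, max(n_ctx - prompt_token_count, 0))

print(clamp_max_tokens(256, 1900, 2048))  # -> 148
```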
