Llama-cpp-cffi

Latest version: v0.4.43


0.4.30

Fixed:
- Stop condition in `Model.completions`

0.4.29

Fixed:
- Stop condition in `Model.completions`

0.4.28

Fixed:
- In `server.py`, lock is now used correctly

0.4.27

Added:
- `CompletionsOptions.stop` is an optional string that triggers stopping of generation

Changed:
- In `llama_cpp.py`, use `threading.Lock` instead of `DummyLock`
- In `server.py`, use `asyncio.Lock` instead of `threading.Lock`

0.4.26

Changed:
- Updated all requirements

0.4.25

Changed:
- Reimplemented `is_cuda_available` and `is_vulkan_available`

