Llama-cpp-cffi

Latest version: v0.4.43


0.4.36

Changed:
- `llama.cpp` revision `06c2b1561d8b882bc018554591f8c35eb04ad30e`

0.4.35

Changed:
- CUDA_ARCHITECTURES `50;61;70;75;80;86;89;90;100;101;120`
- `llama.cpp` revision `08d5986290cc42d2c52739e046642b8252f97e4b`

0.4.34

Fixed:
- Memory leak on stopping iteration/completion

0.4.33

Fixed:
- Handle `MemoryError` in `model_init`
- Handle `MemoryError` in `context_init`
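The two fixes above can be illustrated with a short sketch. The `model_init` name comes from the changelog, but its body and signature here are hypothetical; the point is only the pattern of catching an allocation failure and surfacing a clean result instead of crashing the process:

```python
# Hypothetical sketch of the MemoryError handling added in 0.4.33.
# The real llama-cpp-cffi internals differ; this only illustrates the pattern.

def model_init(model_path: str, available_bytes: int, required_bytes: int):
    """Pretend loader: raises MemoryError when the model does not fit."""
    if required_bytes > available_bytes:
        raise MemoryError(f"cannot load {model_path}: needs {required_bytes} bytes")
    return {"path": model_path, "loaded": True}

def safe_model_init(model_path: str, available_bytes: int, required_bytes: int):
    try:
        return model_init(model_path, available_bytes, required_bytes)
    except MemoryError:
        # Fail gracefully so the caller can report the error or retry
        # with a smaller model, rather than taking down the server.
        return None

print(safe_model_init("tiny.gguf", available_bytes=8, required_bytes=4))
print(safe_model_init("huge.gguf", available_bytes=8, required_bytes=16))
```

The same wrapping would apply to `context_init`, which the release notes say received the identical fix.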

0.4.32

Changed:
- `llama.cpp` revision `9626d9351a6dfb665400d9fccbda876a0a96ef67`

0.4.31

Added:
- `CompletionsOptions.force_model_reload` to force model reload on every server request

Fixed:
- Stop condition in `Model.completions`
- GPU memory leaks (Vulkan-related)
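A minimal sketch of the idea behind `CompletionsOptions.force_model_reload`: normally a server keeps the loaded model cached between requests, and the option forces a fresh load on every request. The `Server` class and caching logic below are hypothetical, not the library's actual implementation:

```python
# Hypothetical model cache illustrating a force_model_reload-style option.
class Server:
    def __init__(self):
        self._model = None
        self.load_count = 0  # track how many times the model was (re)loaded

    def _load_model(self, path: str) -> dict:
        self.load_count += 1
        return {"path": path}

    def completions(self, path: str, force_model_reload: bool = False) -> str:
        # Reload on every request when forced; otherwise reuse the cache.
        if force_model_reload or self._model is None:
            self._model = self._load_model(path)
        return f"completion from {self._model['path']}"

server = Server()
server.completions("model.gguf")                            # first call loads
server.completions("model.gguf")                            # served from cache
server.completions("model.gguf", force_model_reload=True)   # forced reload
print(server.load_count)  # → 2
```

Forcing a reload trades latency for a clean model state, which can be useful when a previous request may have left the context in a bad state.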
