Llama-cpp-cffi

Latest version: v0.4.40


0.4.16

Added:
- Dynamically load/unload models while executing prompts in parallel.

Changed:
- `llama.cpp` revision `adc5dd92e8aea98f5e7ac84f6e1bc15de35130b5`
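
The load/unload feature above follows a common concurrency pattern: a lock-guarded registry lets one thread swap models in or out while other threads keep serving prompts. A minimal, library-agnostic sketch of that pattern (this is not the llama-cpp-cffi API; the class and loader are illustrative):

```python
import threading

class ModelRegistry:
    """Toy sketch of dynamic load/unload: swaps happen under a lock,
    so concurrent prompt workers always see a consistent registry."""

    def __init__(self):
        self._lock = threading.Lock()
        self._models = {}

    def load(self, name, loader):
        # `loader` is any zero-argument callable returning a model object.
        with self._lock:
            self._models[name] = loader()

    def unload(self, name):
        with self._lock:
            self._models.pop(name, None)

    def get(self, name):
        with self._lock:
            return self._models.get(name)
```

A real implementation would also need to keep a model alive while in-flight prompts still reference it (e.g. via reference counting), which this sketch omits.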

0.4.15

Changed:
- `llama.cpp` revision `0ccd7f3eb2debe477ffe3c44d5353cc388c9418d`

Fixed:
- CUDA architectures: 61, 70, 75, 80, 86, 89, 90
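
For context, the compute capabilities listed in this fix map to the following NVIDIA GPU generations (these are general CUDA facts, not anything specific to llama-cpp-cffi); a small lookup sketch:

```python
# CUDA compute capability -> GPU generation (with typical example cards).
CUDA_ARCH_GENERATIONS = {
    61: "Pascal (e.g. GTX 10-series)",
    70: "Volta (e.g. V100)",
    75: "Turing (e.g. RTX 20-series, T4)",
    80: "Ampere (e.g. A100)",
    86: "Ampere (e.g. RTX 30-series)",
    89: "Ada Lovelace (e.g. RTX 40-series)",
    90: "Hopper (e.g. H100)",
}

def describe_arch(sm: int) -> str:
    """Return a human-readable description of a compute capability."""
    return CUDA_ARCH_GENERATIONS.get(sm, f"unknown compute capability {sm}")
```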

0.4.14

Changed:
- `llama.cpp` revision `0ccd7f3eb2debe477ffe3c44d5353cc388c9418d`

Fixed:
- CUDA architectures: all (including: 61, 70, 75, 80, 86, 89, 90)

0.4.13

Changed:
- `llama.cpp` revision `bbf3e55e352d309573bdafee01a014b0a2492155`

0.4.12

Changed:
- `llama.cpp` revision `091592d758cb55af7bfadd6c397f61db387aa8f3`

Fixed:
- Missing `gguf_*` symbols in the `_llama_cpp_*` shared libraries
- Default CUDA arch (`-arch=sm_61`)
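
One quick way to verify that a shared library exports the expected symbols is to try resolving them with `ctypes`. A hedged sketch; the library path and symbol in the usage comment are illustrative assumptions, not the package's actual file layout:

```python
import ctypes

def has_symbol(lib: ctypes.CDLL, name: str) -> bool:
    """Return True if `lib` exports `name`; ctypes raises AttributeError
    when a symbol cannot be resolved in the loaded shared object."""
    try:
        getattr(lib, name)
        return True
    except AttributeError:
        return False

# Hypothetical usage against one of the package's shared libraries
# (illustrative path and symbol name):
# lib = ctypes.CDLL("/path/to/_llama_cpp_cpu.so")
# print(has_symbol(lib, "gguf_init_from_file"))
```

On Linux, `nm -D <library.so>` gives the same information from the command line.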

0.4.11

Changed:
- `llama.cpp` revision `44d1e796d08641e7083fcbf37b33c79842a2f01e`
