Llama-cpp-cffi

Latest version: v0.4.43


0.4.24

Changed:
- `llama.server` now uses env vars `LLAMA_CPP_HOST` and `LLAMA_CPP_PORT`
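The entry above says the server now reads its bind address and port from the `LLAMA_CPP_HOST` and `LLAMA_CPP_PORT` environment variables. A minimal sketch of how such env-var handling typically looks (the fallback defaults and the helper name are illustrative assumptions, not part of the changelog):

```python
import os

def server_config() -> tuple[str, int]:
    # LLAMA_CPP_HOST / LLAMA_CPP_PORT are named in the 0.4.24 changelog entry;
    # the fallback defaults below are illustrative assumptions.
    host = os.environ.get("LLAMA_CPP_HOST", "127.0.0.1")
    port = int(os.environ.get("LLAMA_CPP_PORT", "8080"))
    return host, port

# Example: override only the port, fall back on the assumed default host.
os.environ.pop("LLAMA_CPP_HOST", None)
os.environ["LLAMA_CPP_PORT"] = "9000"
print(server_config())  # → ('127.0.0.1', 9000)
```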

0.4.23

Changed:
- Use `DummyLock` instead of `threading.Lock`
- `llama.cpp` revision `63ac12856303108ee46635e6c9e751f81415ee64`
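A "dummy" lock of the kind mentioned above is a common pattern: an object with the same interface as `threading.Lock` but no actual synchronization, useful when locking is optional. A hedged sketch of what such a class typically looks like (this is the general pattern, not the package's actual implementation):

```python
class DummyLock:
    # No-op stand-in for threading.Lock: same context-manager and
    # acquire/release interface, but performs no synchronization.
    def __enter__(self):
        return self

    def __exit__(self, exc_type, exc, tb):
        return False  # do not suppress exceptions

    def acquire(self, blocking=True, timeout=-1):
        return True  # always "succeeds" immediately

    def release(self):
        pass

# Drop-in usage: code written against threading.Lock keeps working.
lock = DummyLock()
with lock:
    print("inside critical section (no real locking)")
```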

0.4.22

Changed:
- Switched the server from an async lock to a sync lock
- `llama.cpp` revision `5137da7b8c3eaa090476a632888ca178ba109f8a`

0.4.21

Changed:
- CUDA 12.8.0 for x86_64
- CUDA_ARCHITECTURES `61;70;75;80;86;89;90;100;101;120`
- `llama.cpp` revision `73e2ed3ce3492d3ed70193dd09ae8aa44779651d`
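The architecture list above matches the format of CMake's CUDA architecture setting. A hedged sketch of how a llama.cpp CUDA build with that list might be configured (flag names per standard CMake and llama.cpp's `GGML_CUDA` option; the build directory is illustrative):

```shell
# Configure a CUDA build targeting the architectures from the 0.4.21 entry.
cmake -B build -DGGML_CUDA=ON \
  -DCMAKE_CUDA_ARCHITECTURES="61;70;75;80;86;89;90;100;101;120"
cmake --build build
```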

0.4.20

Changed:
- New repo at https://github.com/ggml-org/llama.cpp.git
- `llama.cpp` revision `0f2bbe656473177538956d22b6842bcaa0449fab`

0.4.19

Changed:
- `llama.cpp` revision `d774ab3acc4fee41fbed6dbfc192b57d5f79f34b`
- Build process `manylinux_2_28` -> `manylinux_2_34`
- Build for the `aarch64` platform (work in progress)
