Llama-cpp-cffi

Latest version: v0.4.40


0.1.16

Changed:
- Updated `llama.cpp`.

0.1.15

Added:
- `SmolLM-1.7B-Instruct-v0.2` examples.

Changed:
- Updated `llama.cpp`.

0.1.14

Fixed:
- Vulkan detection.

0.1.13

Fixed:
- CUDA and Vulkan detection.

0.1.12

Added:
- Build `vulkan_1_x` for general GPU.
- Build `cuda 12.4.1` as default.

Changed:
- Renamed examples for TinyLlama (chat, tool calling) and OpenAI.
- Updated demo model definitions.
- Updated examples (chat, tool calling).
- `get_special_tokens` now supports the parameter `force_standard_special_tokens: bool = False`, which replaces the tokenizer's special tokens with standard/common ones.
- Build `cuda 12.5.1` as an additional build target, but not packaged on PyPI.
- Build `cuda 12.6` as an additional build target, but not packaged on PyPI.
- Build `openblas` as an additional build target, but not packaged on PyPI.
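A minimal sketch of the `force_standard_special_tokens` behavior described above. The function body, the token names, and the signature details here are illustrative assumptions, not the library's actual implementation; only the function name and the parameter come from the changelog entry.

```python
# Common/standard special tokens used as a fallback set (illustrative values).
STANDARD_SPECIAL_TOKENS = {
    'bos': '<s>',
    'eos': '</s>',
    'unk': '<unk>',
}


def get_special_tokens(tokenizer_special_tokens: dict,
                       force_standard_special_tokens: bool = False) -> dict:
    """Return the tokenizer's own special tokens, or the standard/common
    set when `force_standard_special_tokens=True` (hypothetical sketch)."""
    if force_standard_special_tokens:
        # Bypass whatever the tokenizer declares and use the common set.
        return dict(STANDARD_SPECIAL_TOKENS)
    return dict(tokenizer_special_tokens)


# Default: the tokenizer's tokens pass through unchanged.
tokens = get_special_tokens({'bos': '<|im_start|>', 'eos': '<|im_end|>'})

# Forced: the standard set is returned instead.
forced = get_special_tokens({'bos': '<|im_start|>'},
                            force_standard_special_tokens=True)
```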

Fixed:
- Handle `Options.no_display_prompt` on Python side.
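A hedged sketch of what handling an option like `Options.no_display_prompt` on the Python side (rather than in native code) could look like: when the flag is set, the echoed prompt is stripped from the raw model output before it reaches the caller. All names and logic below are assumptions for illustration, not the library's actual API.

```python
from dataclasses import dataclass


@dataclass
class Options:
    # Hypothetical stand-in for the library's Options.no_display_prompt flag.
    no_display_prompt: bool = False


def postprocess(raw_output: str, prompt: str, options: Options) -> str:
    """Strip the echoed prompt from the output when no_display_prompt is set."""
    if options.no_display_prompt and raw_output.startswith(prompt):
        return raw_output[len(prompt):]
    return raw_output


# With the flag set, only the completion survives.
completion = postprocess('Q: hi\nA: hello', 'Q: hi\n',
                         Options(no_display_prompt=True))
```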

0.1.11

Changed:
- OpenAI: allow import of `routes` and `v1_chat_completions` handler.
- Updated `examples/demo_0.py` (tool calling example).
