Llama-cpp-cffi

Latest version: v0.1.21

0.1.15

Added:
- `SmolLM-1.7B-Instruct-v0.2` examples.

Changed:
- Updated `llama.cpp`.

0.1.14

Fixed:
- Vulkan detection.

0.1.13

Fixed:
- CUDA and Vulkan detection.

0.1.12

Added:
- Build `vulkan_1_x` for general GPU.
- Build `cuda 12.4.1` as default.

Changed:
- Renamed examples for TinyLlama (chat, tool calling) and OpenAI.
- Updated demo models definitions.
- Updated examples (chat, tool calling).
- `get_special_tokens` now supports the parameter `force_standard_special_tokens: bool = False`, which replaces the tokenizer's special tokens with standard/common ones.
- Build `cuda 12.5.1` as an additional build target, but not packaged on PyPI.
- Build `cuda 12.6` as an additional build target, but not packaged on PyPI.
- Build `openblas` as an additional build target, but not packaged on PyPI.
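The `force_standard_special_tokens` behavior can be illustrated with a small sketch. The token values and the function body below are hypothetical, meant only to show the documented semantics, not the library's actual implementation:

```python
# Hypothetical sketch of the force_standard_special_tokens semantics:
# when the flag is set, the tokenizer's own special tokens are replaced
# with standard/common defaults.

STANDARD_SPECIAL_TOKENS = {"bos": "<s>", "eos": "</s>", "unk": "<unk>"}  # common defaults

def get_special_tokens(tokenizer_tokens: dict, force_standard_special_tokens: bool = False) -> dict:
    """Return the tokenizer's special tokens, or the standard set if forced."""
    if force_standard_special_tokens:
        return dict(STANDARD_SPECIAL_TOKENS)
    return dict(tokenizer_tokens)

# Example: a tokenizer with model-specific special tokens (ChatML-style)
model_tokens = {"bos": "<|im_start|>", "eos": "<|im_end|>", "unk": "<unk>"}
print(get_special_tokens(model_tokens))                                      # model's own tokens
print(get_special_tokens(model_tokens, force_standard_special_tokens=True))  # standard tokens
```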

Fixed:
- Handle `Options.no_display_prompt` on Python side.
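`llama-cli`'s `no_display_prompt` option suppresses the echoed prompt in the output; handling it on the Python side presumably amounts to stripping the prompt prefix from the generated text. A minimal sketch of that idea (the function name is illustrative, not the library's API):

```python
def strip_prompt(output: str, prompt: str, no_display_prompt: bool) -> str:
    """Drop the echoed prompt from model output when no_display_prompt is set."""
    if no_display_prompt and output.startswith(prompt):
        return output[len(prompt):].lstrip()
    return output

raw = "What is 2+2? The answer is 4."
print(strip_prompt(raw, "What is 2+2?", no_display_prompt=True))   # prompt removed
print(strip_prompt(raw, "What is 2+2?", no_display_prompt=False))  # output unchanged
```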

0.1.11

Changed:
- OpenAI: allow import of `routes` and `v1_chat_completions` handler.
- Updated `examples/demo_0.py`, the tool calling example.

0.1.10

Added:
- In `openai`, support for `prompt` and `extra_body`. Reference: https://github.com/openai/openai-python/blob/195c05a64d39c87b2dfdf1eca2d339597f1fce03/src/openai/resources/completions.py#L41
- Pass `llama-cli` options to `openai`.
- `util` module with `is_cuda_available` function.
- `openai` supports both `prompt` and `messages`. Reference: https://github.com/openai/openai-python/blob/195c05a64d39c87b2dfdf1eca2d339597f1fce03/src/openai/resources/completions.py#L45
