Llama-cpp-http

Latest version: v0.3.3


0.2.5

Changed:
- Force immediate kill of the subprocess; in that case `info`/`stderr` is not returned.
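A minimal sketch of what an immediate subprocess kill looks like with `asyncio` (the function name and command here are hypothetical, not taken from the package's code). Because `kill()` sends SIGKILL, the child has no chance to flush output, which is why any `info`/`stderr` content is lost:

```python
import asyncio

async def run_and_kill():
    # Hypothetical sketch: spawn a worker process with piped output,
    # then kill it immediately (SIGKILL) instead of terminating gracefully.
    # Output buffered by the child is discarded, matching the
    # "info/stderr is not returned" behavior described above.
    proc = await asyncio.create_subprocess_exec(
        "sleep", "30",
        stdout=asyncio.subprocess.PIPE,
        stderr=asyncio.subprocess.PIPE,
    )
    proc.kill()        # immediate, no cleanup in the child
    await proc.wait()  # reap the process to avoid a zombie
    return proc.returncode

print(asyncio.run(run_and_kill()))
```

On POSIX systems the return code is negative (the signal number negated), signalling the process did not exit normally.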

0.2.4

Added:
- `llama.cpp` timing info.

Changed:
- Default `--platforms-devices "0:0"`.

Fixed:
- Token unicode string decoding.
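The token-decoding fix likely concerns streamed tokens splitting a multi-byte UTF-8 character across chunks; the sketch below (an assumption, not the package's actual code) shows how an incremental decoder handles such partial byte sequences without raising `UnicodeDecodeError`:

```python
import codecs

# Hypothetical sketch: a two-byte character ("é") arrives split
# across two streamed chunks, as can happen with token-by-token
# output from llama.cpp.
raw = "é".encode("utf-8")
chunks = [raw[:1], raw[1:]]

# An incremental decoder buffers incomplete sequences instead of
# failing on them, emitting the character once it is complete.
decoder = codecs.getincrementaldecoder("utf-8")()
out = "".join(decoder.decode(c) for c in chunks)
print(out)
```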

0.2.3

Changed:
- Do not eagerly load models.
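Deferring model loading usually means registering model paths at startup but performing the expensive load only on first request. A minimal sketch of that pattern, with hypothetical names (the real loading call into `llama.cpp` is stubbed out):

```python
class ModelRegistry:
    """Hypothetical sketch of lazy model loading: paths are known up
    front, but the expensive load happens on first access, not at
    server startup."""

    def __init__(self):
        self._paths = {}   # name -> model file path
        self._models = {}  # name -> loaded model (cache)

    def register(self, name, path):
        self._paths[name] = path

    def get(self, name):
        if name not in self._models:
            # Placeholder for the real (slow) llama.cpp load call.
            self._models[name] = f"loaded:{self._paths[name]}"
        return self._models[name]

registry = ModelRegistry()
registry.register("7b", "/models/7b.gguf")
print(registry.get("7b"))
```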

0.2.2

Fixed:
- Do not import server on package import.

0.2.1

Changed:
- Eager/optimistic model loading.
- Disable llama.cpp log traces.

0.2.0

Added:
- Using `uvloop` speeds up interaction with `llama.cpp` by ~2x.
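Adopting `uvloop` is typically a one-line swap of the asyncio event-loop policy; a minimal sketch (with a fallback in case `uvloop` is not installed, which the package may or may not include):

```python
import asyncio

try:
    import uvloop  # drop-in, faster event loop built on libuv
    asyncio.set_event_loop_policy(uvloop.EventLoopPolicy())
except ImportError:
    # Fall back to the default asyncio loop if uvloop is unavailable.
    pass

async def main():
    # Any existing asyncio code runs unchanged on the uvloop policy.
    await asyncio.sleep(0)
    return "ok"

print(asyncio.run(main()))
```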

Changed:
- Responses no longer include the `info` field (the `stderr` output from `llama.cpp`).

Removed:
- `pyopencl` usage in code.
- Caching using `PonyORM` and `sqlite3`.

