Llama-cpp-http

Latest version: v0.3.3

0.3.3

Fixed:
- First character was being cropped from LLM output.

0.3.2

Added:
- server: `DEVICE_SHUTDOWN_TIMEOUT`, the timeout allowed for a device to shut down

Changed:
- server: wait `DEVICE_SHUTDOWN_TIMEOUT` seconds after each llama.cpp subprocess invocation
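The change above can be sketched as follows. This is an illustrative sketch only, not the server's actual code: `run_llama` and the default timeout are hypothetical, and it assumes `DEVICE_SHUTDOWN_TIMEOUT` is read from the environment (as the variable's name suggests) and that the pause exists to let the GPU release memory between invocations:

```python
import os
import subprocess
import time

# Assumption: the timeout is configured via an environment variable.
DEVICE_SHUTDOWN_TIMEOUT = float(os.environ.get("DEVICE_SHUTDOWN_TIMEOUT", "0.5"))

def run_llama(cmd: list[str]) -> str:
    """Run a llama.cpp subprocess, then wait for the device to shut down."""
    out = subprocess.run(cmd, capture_output=True, text=True).stdout
    # Give the device time to free its memory before the next invocation.
    time.sleep(DEVICE_SHUTDOWN_TIMEOUT)
    return out
```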

0.3.1

Changed:
- `get_app()` is now sync instead of async
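For callers this means the app factory is invoked directly rather than awaited. A hypothetical sketch (the body of `get_app()` here is a stand-in, not the real implementation):

```python
def get_app():
    """Build and return the application object (no longer a coroutine)."""
    app = {"routes": []}  # stand-in for the real application object
    return app

# Before 0.3.1 callers had to await it:
#   app = await get_app()
# Since 0.3.1 it is a plain call:
app = get_app()
```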

Fixed:
- server: removed `devices_wss` and `devices_procs`, which could cause a GPU memory leak

0.3.0

Added:
- client/langchain_llama_cpp_client.py
- client/langchain_llama_cpp_embeddings_client.py
- misc/example_client_langchain_embedding.py

Changed:
- llama.cpp instructions
- `client.py` is now a `client` package
- renamed misc/example_client_call_2.py to example_client_call_react.py

Fixed:
- misc/example_client_langchain_stream.py

Removed:
- misc/example_client_stream_codellama.py

0.2.13

0.2.12
