Onprem

Latest version: v0.0.36

0.0.36

new:
- Support for OpenAI models (#55)

changed:
- `LLM.prompt`, `LLM.ask`, and `LLM.chat` now accept extra `**kwargs` that are passed directly to the model (#54); see the example below

fixed:
- N/A
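
Because extra keyword arguments to `LLM.prompt`, `LLM.ask`, and `LLM.chat` are now forwarded to the underlying model (#54), per-call generation options can be supplied without rebuilding the instance. A minimal sketch, assuming the default model backend accepts a `temperature` keyword:

```python
from onprem import LLM

# Instantiate with defaults; the default model and its download location
# depend on the installed onprem configuration.
llm = LLM()

# Extra **kwargs are forwarded directly to the model, so per-call options
# such as temperature (an assumption about the backend's accepted keywords)
# can be varied on each call.
answer = llm.prompt(
    "Summarize the idea behind retrieval-augmented generation.",
    temperature=0.2,
)
print(answer)
```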

0.0.35

new:
- N/A

changed:
- Updates for `langchain>=0.1.0`, which is now the minimum required version

fixed:
- N/A

0.0.34

new:
- Use [Zephyr-7B](https://huggingface.co/TheBloke/zephyr-7B-beta-GGUF) as default model in `webapp.yml`. (#52)

changed:
- Added `stop` parameter to `LLM.prompt` (overrides the `stop` parameter supplied to the constructor) (#53); see the example below

fixed:
- N/A
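
The new per-call `stop` parameter (#53) overrides the stop sequences given to the constructor for a single call. A minimal sketch; the particular stop strings are illustrative assumptions:

```python
from onprem import LLM

# Stop sequences passed to the constructor apply to every prompt by default.
llm = LLM(stop=["\n\n"])

# The stop parameter on prompt() overrides the constructor value for this
# call only, e.g. to cut generation off after the third numbered item.
answer = llm.prompt(
    "List three benefits of running LLMs on-premises:\n1.",
    stop=["4."],
)
print(answer)
```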

0.0.33

new:
- N/A

changed:
- Added `prompt_template` parameter to `LLM` constructor (#51)
- Added `update_max_tokens` and `update_stop` methods to `LLM` for dynamic adjustments during prompt experiments (see the example below)

fixed:
- Explicitly set `offload_kqv` to ensure GPUs are fully utilized (#50)
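
The `prompt_template` constructor parameter (#51) together with `update_max_tokens` and `update_stop` makes it possible to iterate on prompt settings without re-creating the model. A minimal sketch; the template string, the `{prompt}` placeholder, and the argument forms of the `update_*` calls are assumptions based on the changelog entries:

```python
from onprem import LLM

# Supply a prompt template at construction time (#51). The Zephyr-style
# template below and the {prompt} placeholder are illustrative assumptions.
llm = LLM(prompt_template="<|system|></s>\n<|user|>\n{prompt}</s>\n<|assistant|>")

# Adjust generation settings on the fly during prompt experiments instead of
# constructing a new LLM instance each time (argument forms are assumptions).
llm.update_max_tokens(512)
llm.update_stop(["</s>"])

answer = llm.prompt("Explain why offloading KV cache computation to the GPU helps throughput.")
print(answer)
```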
