Onprem

Latest version: v0.12.1


0.0.17

new:
- The `LLM.chat` method supports question-answering with conversational memory (20); see the sketch below.

changed:
- `LLM` now accepts a `callbacks` parameter for custom callbacks. (21)
- Added additional examples

fixed:
- N/A
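
A minimal sketch of the 0.0.17 additions, assuming the `onprem` import path and a LangChain-style callback handler; treat the exact signatures as assumptions for this early release:

```python
from onprem import LLM
from langchain.callbacks.base import BaseCallbackHandler

class TokenPrinter(BaseCallbackHandler):
    """Hypothetical callback that streams each newly generated token."""
    def on_llm_new_token(self, token, **kwargs):
        print(token, end="", flush=True)

# `callbacks` parameter added in 0.0.17
llm = LLM(callbacks=[TokenPrinter()])
llm.ingest("./sample_data")  # index documents for retrieval

# `chat` keeps conversational memory, so the follow-up question can
# refer back to the first answer.
llm.chat("What topics do these documents cover?")
llm.chat("Summarize the second topic you mentioned.")
```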

0.0.16

new:
- Support for prompt templates in `ask` (17); see the sketch below

changed:
- Added `LLM.load_qa` method

fixed:
- Batchify input to `Chroma` (18)
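
A minimal sketch of a custom prompt template with `ask`; the `prompt_template` keyword name and the return format are assumptions, so check the 0.0.16 docs for the exact signature:

```python
from onprem import LLM

# Template with the placeholders a retrieval QA chain typically fills in.
template = """Answer the question using only the context below.
Context: {context}
Question: {question}
Helpful answer:"""

llm = LLM()
llm.ingest("./sample_data")

# The keyword name `prompt_template` is an assumption based on the entry above.
answer = llm.ask("How is the ingestion pipeline configured?", prompt_template=template)
print(answer)
```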

0.0.15

new:
- N/A

changed:
- N/A

fixed:
- Pass `embedding_model_kwargs` and `embedding_encode_kwargs` to `HuggingFaceEmbeddings` (16); see the sketch below
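
A minimal sketch of the 0.0.15 fix, assuming the two dictionaries are `LLM` constructor parameters that are forwarded to `HuggingFaceEmbeddings` as `model_kwargs` and `encode_kwargs`:

```python
from onprem import LLM

llm = LLM(
    # forwarded to HuggingFaceEmbeddings(model_kwargs=...)
    embedding_model_kwargs={"device": "cpu"},
    # forwarded to HuggingFaceEmbeddings(encode_kwargs=...)
    embedding_encode_kwargs={"normalize_embeddings": True},
)
```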

0.0.14

new:
- N/A

changed:
- Added `Ingester.get_embeddings` method to access instance of `HuggingFaceEmbeddings`
- Added `chunk_size` and `chunk_overlap` parameters to `Ingester.ingest` and `LLM.ingest` (13); see the sketch below

fixed:
- Check to ensure `source_directory` is a folder in `LLM.ingest` (15)
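
A minimal sketch of the 0.0.14 additions; the `onprem.ingest.Ingester` import path is an assumption:

```python
from onprem.ingest import Ingester

ingester = Ingester()
# Chunking parameters added in 0.0.14.
ingester.ingest("./sample_data", chunk_size=500, chunk_overlap=50)

# Access the underlying HuggingFaceEmbeddings instance.
embeddings = ingester.get_embeddings()
vector = embeddings.embed_query("example query")
print(len(vector))
```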

0.0.13

new:
- N/A

changed:
- Accept extra `kwargs` and supply them to `langchain.llms.LlamaCpp` (12); see the sketch below
- Add optional argument to specify custom path to vector DB (11)

fixed:
- N/A
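
A minimal sketch of the 0.0.13 changes; `n_gpu_layers` and `temperature` are example `LlamaCpp` keyword arguments, and `vectordb_path` is an assumed name for the custom vector DB path argument:

```python
from onprem import LLM

llm = LLM(
    # Extra kwargs are passed through to langchain.llms.LlamaCpp.
    n_gpu_layers=35,
    temperature=0.2,
    # Assumed name for the optional custom vector DB location added in 0.0.13.
    vectordb_path="/data/onprem_vectordb",
)
```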

0.0.12

new:
- N/A

changed:
- Add optional argument to specify a custom download path for the LLM (5), thanks to rabilrbl

fixed:
- Fixed capitalization in download confirmation (9), thanks to rabilrbl
- Insert a [dummy replacement](https://stackoverflow.com/questions/74918614/error-importing-seaborn-module-attributeerror/76760670#76760670) of a decorator into numpy; see the sketch below
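
A minimal sketch of the monkey-patch pattern behind the 0.0.12 numpy fix: attach a no-op stand-in for a decorator that a downstream import expects, before that import runs. The attribute name below is a placeholder; the changelog does not name the real one:

```python
import numpy as np

def _noop_decorator(*args, **kwargs):
    """Dummy replacement that returns the wrapped function unchanged."""
    if len(args) == 1 and callable(args[0]) and not kwargs:
        return args[0]              # used as @decorator
    return lambda func: func        # used as @decorator("message")

# Placeholder attribute name; the real fix targets whichever decorator
# was missing from numpy.
if not hasattr(np, "some_removed_decorator"):
    np.some_removed_decorator = _noop_decorator
```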
