Onprem

Latest version: v0.12.1

0.1.4

new:
- OCR support (75)

changed:
- Added `Ingester.store_documents` method (36, 77); see the sketch below
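
A minimal usage sketch, assuming `Ingester` lives in `onprem.ingest` and that `store_documents` accepts a list of LangChain `Document` objects; neither assumption is confirmed by this entry:

```python
# Hypothetical sketch: store pre-built LangChain documents directly instead
# of ingesting raw files from a folder. The import path, the no-argument
# constructor, and the store_documents signature are all assumptions.
from langchain_core.documents import Document
from onprem.ingest import Ingester

ingester = Ingester()  # assumed: defaults choose an embedding model and store
docs = [
    Document(
        page_content="OnPrem.LLM runs local LLMs against private documents.",
        metadata={"source": "notes.txt"},
    ),
]
ingester.store_documents(docs)  # assumed: takes a list of Document objects
```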

fixed:
- Switched to `langchain_huggingface` and `langchain_chroma` (78)

0.1.3

new:
- N/A

changed:
- N/A

fixed:
- Added `preproc_fn` to `Extractor.apply` (74); see the sketch below
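
A minimal sketch, assuming the `Extractor` pipeline API shown here; the import paths, the `apply` signature, and the expectation that `preproc_fn` maps a string to a string are assumptions based on this entry:

```python
# Hypothetical sketch: pass a preprocessing callable to Extractor.apply so
# each unit of text is transformed before it reaches the extraction prompt.
from onprem import LLM
from onprem.pipelines import Extractor

llm = LLM()
extractor = Extractor(llm)

def drop_headers(text):
    """Assumed preproc_fn contract: receive a text unit, return a string."""
    return "\n".join(
        line for line in text.splitlines() if not line.startswith("CONFIDENTIAL")
    )

prompt = "Extract any email addresses from the following text: {text}"
results = extractor.apply(prompt, fpath="report.pdf", preproc_fn=drop_headers)
```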

0.1.2

new:
- N/A

changed:
- N/A

fixed:
- Segment now accepts arguments in the extractor pipeline (70)

0.1.1

new:
- N/A

changed:
- Add clean function to `Extractor.apply` (69); see the sketch below
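
A minimal sketch along the same lines; the parameter name `clean_fn` is an assumption (the entry only says a clean function was added), as is the `Extractor` API:

```python
# Hypothetical sketch: a cleaning callable applied to each text unit before
# extraction. The clean_fn parameter name and apply signature are assumptions.
import re

from onprem import LLM
from onprem.pipelines import Extractor

extractor = Extractor(LLM())
results = extractor.apply(
    "List the organizations mentioned in the following text: {text}",
    fpath="paper.pdf",
    clean_fn=lambda text: re.sub(r"\s+", " ", text),  # collapse whitespace
)
```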

fixed:
- Remove BOS token from default prompt (67)
- Remove call to `db.persist` (68)

0.1.0

new:
- Use OnPrem.LLM with OpenAI-compatible REST APIs (61); see the first sketch after this list
- Information extraction pipeline (64)
- Experimental support for Azure OpenAI (63)
- Docker support
- Few-shot classification pipeline (66); see the second sketch after this list
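
Two minimal sketches for the items above. First, using an OpenAI-compatible REST endpoint; the `model_url` value and endpoint format shown are assumptions based on this entry, not confirmed syntax:

```python
# Hypothetical sketch: point LLM at an OpenAI-compatible server such as a
# local vLLM or llama.cpp instance. The URL format shown is an assumption.
from onprem import LLM

llm = LLM(model_url="http://localhost:8000/v1")  # assumed endpoint format
print(llm.prompt("Summarize the advantages of running LLMs on-premises."))
```

Second, the few-shot classification pipeline; the class name `FewShotClassifier`, its import path, and the `train`/`predict` method names are assumptions:

```python
# Hypothetical sketch: train a classifier from a handful of labeled examples.
from onprem.pipelines import FewShotClassifier

clf = FewShotClassifier()
X = ["I loved this movie.", "Terrible acting and a dull plot."]
y = ["positive", "negative"]
clf.train(X, y)  # assumed: fit on a few labeled examples per class
print(clf.predict(["An absolute delight to watch."]))
```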

changed:
- Changed default model to Mistral (65)
- Allow installation of onprem without llama-cpp-python for easier use with LLMs served through REST APIs (62)
- Added `ignore_fn` argument to `LLM.ingest` to allow more control over ignoring certain files (58); see the sketch below
- Added `Ingester.get_ingested_files` to show files ingested into the vector database (59), also shown in the sketch below
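
A minimal sketch covering both entries; `ignore_fn` is assumed to receive a file path and return `True` for files to skip, and `load_ingester` is assumed to be the accessor for the underlying `Ingester`:

```python
# Hypothetical sketch: skip unwanted files during ingestion, then list what
# actually landed in the vector database. Accessor names are assumptions.
from onprem import LLM

llm = LLM()
llm.ingest("./sample_data", ignore_fn=lambda fpath: fpath.endswith(".log"))
ingester = llm.load_ingester()        # assumed accessor
print(ingester.get_ingested_files())  # files now in the vector database
```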

fixed:
- If encountering a loading error when processing a file, skip and continue instead of halting (60)
- Add check for partially downloaded files (49)

0.0.36

new:
- Support for OpenAI models (55)

changed:
- `LLM.prompt`, `LLM.ask`, and `LLM.chat` now accept extra `**kwargs` that are sent directly to the model (54); see the sketch below
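
A minimal sketch; because the extra keyword arguments are forwarded to the underlying model, which names are valid depends on the backend (llama-cpp-python, OpenAI, etc.), so `temperature` and `stop` here are common examples rather than a documented guarantee:

```python
# Hypothetical sketch: extra kwargs pass through to the model backend.
from onprem import LLM

llm = LLM()
answer = llm.prompt(
    "List three use cases for retrieval-augmented generation.",
    temperature=0.2,  # forwarded to the model
    stop=["\n\n"],    # forwarded to the model
)
```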

fixed:
- N/A
