Cappr

Latest version: v0.9.6

0.6.5

Breaking changes

None

New features

None

Bug fixes

* Previously, an HF or Llama CPP function call could leave the passed-in model/tokenizer modified if an exception was raised. (Technically, the cause is that my context managers didn't wrap the context in a try-finally block. Now they do.)
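
  For context, the pattern in question looks like the sketch below: a context manager that temporarily changes model state restores that state in a `finally` block, so the restoration runs even when the wrapped code raises. This is a generic PyTorch-style illustration of the pattern, not cappr's actual code.

  ```python
  from contextlib import contextmanager


  @contextmanager
  def eval_mode(model):
      """Temporarily put a (PyTorch-style) model in eval mode, restoring it on exit."""
      was_training = model.training
      model.eval()
      try:
          yield model
      finally:
          # Runs even if the body of the `with` block raises, so the caller's
          # model is never left in a modified state.
          model.train(was_training)
  ```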

0.6.4

Breaking changes

* The default `batch_size` in `cappr.huggingface` is now `2`, not `32`
* The implementation for `cappr.huggingface.classify_no_batch` is now in `cappr.huggingface.classify_no_batch_no_cache`

New features

* `cappr.huggingface.classify_no_batch` now caches the prompt, which makes it much faster. It can also cache shared instructions or exemplars for prompts using the new [context manager](https://cappr.readthedocs.io/en/latest/cappr.huggingface.classify_no_batch.html#cappr.huggingface.classify_no_batch.cache). See this functionality in action in the [Banking 77 demo](https://github.com/kddubey/cappr/blob/main/demos/huggingface/banking_77_classes.ipynb)
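
  A rough usage sketch, assuming a small Hub model and that `cache` takes the `(model, tokenizer)` pair plus the shared prefix and yields a cached pair to pass to `predict`; check the linked docs for the authoritative signature.

  ```python
  from transformers import AutoModelForCausalLM, AutoTokenizer
  from cappr.huggingface import classify_no_batch

  # Any causal LM works; "gpt2" is just a small example model.
  model = AutoModelForCausalLM.from_pretrained("gpt2")
  tokenizer = AutoTokenizer.from_pretrained("gpt2")

  # Instructions shared by every prompt: process them once, reuse the cache.
  instructions = "Assign each customer query to one of the given banking topics.\n\n"
  prompts = ["How do I reset my card PIN?", "Why was my transfer declined?"]
  completions = ("card issue", "transfer issue", "something else")

  # Assumed signature: cache((model, tokenizer), prefix) -> cached model/tokenizer pair
  with classify_no_batch.cache((model, tokenizer), instructions) as cached:
      preds = classify_no_batch.predict(prompts, completions, cached)
  print(preds)
  ```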

Bug fixes

None

0.6.3

Breaking changes

None

New features

* For `cappr.llama_cpp.classify`, cache shared instructions or exemplars for many prompts using the new [context manager](https://cappr.readthedocs.io/en/latest/cappr.llama_cpp.classify.html#cappr.llama_cpp.classify.cache).
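
  An analogous sketch for a llama-cpp-python model. The GGUF path is a placeholder and the `cache(model, prefix)` signature is an assumption; `logits_all=True` exposes the token-level log-probabilities that cappr relies on.

  ```python
  from llama_cpp import Llama
  from cappr.llama_cpp import classify

  # Placeholder path to a local GGUF model.
  model = Llama("/path/to/model.gguf", logits_all=True, verbose=False)

  # Shared instructions are processed once and reused across prompts.
  instructions = "Label the sentiment of each review.\n\n"
  prompts = ["The movie was a waste of two hours.", "I could not put the book down."]
  completions = ("negative", "positive")

  # Assumed signature: cache(model, prefix) -> model with the prefix cached
  with classify.cache(model, instructions) as cached_model:
      preds = classify.predict(prompts, completions, cached_model)
  print(preds)
  ```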

Bug fixes

None

0.6.2

Breaking changes

None

New features

* Install all extra dependencies to run any model format using:

  ```bash
  pip install "cappr[all]"
  ```

Bug fixes

None

0.6.1

Breaking changes

* `cappr.openai.token_logprobs` now prepends a space to each text by default. Set `end_of_prompt=""` if you don't want that
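
  A hedged sketch of the new default and the opt-out. Only `end_of_prompt` is named in this entry; the import path is taken as written here, and the `texts`/`model` parameters and the model name are assumptions for illustration.

  ```python
  from cappr.openai import token_logprobs  # path as written in this entry

  texts = ["hello world", "goodbye world"]

  # As of 0.6.1, each text has a space prepended before its tokens are scored.
  default_logprobs = token_logprobs(texts, model="text-ada-001")

  # Opt out of the prepended space.
  raw_logprobs = token_logprobs(texts, model="text-ada-001", end_of_prompt="")
  ```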

New features

None

Bug fixes

* `cappr.openai`'s (still highly experimental) discount feature works for a wider range of `completions`

0.6.0

Breaking changes

None

New features

* To minimize memory usage, use `cappr.huggingface.classify_no_batch`. See [this section](https://cappr.readthedocs.io/en/latest/select_a_language_model.html#which-cappr-huggingface-module-should-i-use) of the docs. I ended up needing this feature to [demo](https://github.com/kddubey/cappr/blob/main/demos/huggingface/autoawq.ipynb) Mistral 7B on a T4 GPU

Bug fixes

* `show_progress_bar=False` now works, my bad
