Kani


0.1.0

BREAKING CHANGES

*These should hopefully be the last breaking changes until v1.0; we're finalizing some attribute names for clarity ahead of publication.*

- Renamed `Kani.always_include_messages` to `Kani.always_included_messages`

Features & Improvements

- `ai_function`s with synchronous signatures now run in a thread pool so they don't block the asyncio event loop
- OpenAI: Added the ability to specify the API base and additional headers (e.g. for proxy APIs); see the sketch after this list
- Various documentation improvements
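
For illustration, a minimal sketch combining the first two items. The changelog confirms that the API base and extra headers are configurable, but the `api_base` and `headers` keyword names below are assumptions, as are the placeholder key and model:

```python
from kani import Kani, ai_function, chat_in_terminal
from kani.engines.openai import OpenAIEngine

# The API base and extra headers can now be configured, e.g. for a proxy API.
# NOTE: the `api_base` and `headers` keyword names are assumptions for illustration.
engine = OpenAIEngine(
    "sk-placeholder",
    model="gpt-3.5-turbo",
    api_base="https://my-llm-proxy.example.com/v1",
    headers={"X-Proxy-Token": "placeholder"},
)


class MyKani(Kani):
    # A plain (non-async) ai_function: as of 0.1.0, kani runs it in a thread pool
    # so the synchronous body does not block the asyncio event loop.
    @ai_function()
    def get_time(self):
        """Get the current local time."""
        import time

        return time.ctime()


chat_in_terminal(MyKani(engine))
```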

0.0.3

BREAKING CHANGES
- Renamed `Kani.get_truncated_chat_history` to `Kani.get_prompt`

Additions & Improvements
- Added `CTransformersEngine` and `LlamaCTransformersEngine` (thanks Maknee!)
- Added a lower-level `Kani.get_model_completion` to make a prediction at the current chat state without modifying the chat history (see the sketch after this list)
- Added the `auto_truncate` param to `ai_function` to opt in to kani trimming long responses from a function (i.e., responses that do not fit in a model's context)
- Improved the internal handling of tokens when the chat history is directly modified
- `ChatMessage.[role]()` classmethods now pass kwargs to the constructor
- LLaMA: Improved the fidelity of non-strict-mode LLaMA prompting
- OpenAI: Added support for specifying an OpenAI organization and configuring retry
- Many documentation improvements
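
A rough sketch of how several of these additions fit together. The `auto_truncate` token budget, the `name` kwarg passed through `ChatMessage.user`, and the `.message.content` shape of the returned completion are illustrative assumptions, not confirmed signatures:

```python
import asyncio

from kani import ChatMessage, Kani, ai_function
from kani.engines.openai import OpenAIEngine


class MyKani(Kani):
    # auto_truncate opts this function in to kani trimming its response when it
    # would not fit in the model's context; the token budget value is hypothetical.
    @ai_function(auto_truncate=2048)
    def read_notes(self):
        """Return the user's (potentially very long) notes."""
        return "lorem ipsum " * 5000


async def main():
    engine = OpenAIEngine("sk-placeholder", model="gpt-3.5-turbo")
    ai = MyKani(engine)
    # ChatMessage.[role]() classmethods forward extra kwargs to the constructor;
    # the `name` field used here is an assumption for illustration.
    ai.chat_history.append(ChatMessage.user("Summarize my notes.", name="alice"))
    # Lower-level prediction at the current chat state: unlike chat_round/full_round,
    # get_model_completion does not append anything to ai.chat_history.
    completion = await ai.get_model_completion()
    print(completion.message.content)


asyncio.run(main())
```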

Fixes
- OpenAI: message token length could be computed too short for messages with no content
- Other minor fixes and improvements

0.0.2

- Add `chat_in_terminal_async` for async environments (e.g. Google Colab); see the sketch below
- Add quickstart Colab notebook
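
For example, in a Colab or Jupyter cell where an event loop is already running, the awaitable variant can be awaited directly (the key and model below are placeholders):

```python
from kani import Kani, chat_in_terminal_async
from kani.engines.openai import OpenAIEngine

engine = OpenAIEngine("sk-placeholder", model="gpt-3.5-turbo")
ai = Kani(engine)

# A Colab/Jupyter cell already runs an event loop, so top-level await works here:
await chat_in_terminal_async(ai)
```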

0.0.1

Initial release!
