Kani

Latest version: v1.2.4

0.6.0

As of Nov 6, 2023, OpenAI added the ability for a single assistant message to request calling multiple functions in
parallel, and wrapped all function calls in a `ToolCall` wrapper. To support this in kani while maintaining
backwards compatibility with OSS function calling models, a `ChatMessage` now uses the following internal
representation:

`ChatMessage.function_call` is actually an alias for `ChatMessage.tool_calls[0].function`. If there is more
than one tool call in the message, when trying to access this property, kani will raise an exception.

To translate kani's FUNCTION message type to OpenAI's TOOL message type, the OpenAIEngine now deterministically binds each free tool call ID to the FUNCTION message that follows it.

Breaking Changes

To the kani end user, there should be no change to how functions are defined and called. One breaking change was necessary:

- `Kani.do_function_call` and `Kani.handle_function_call_exception` now take an additional `tool_call_id` parameter, which may break overriding functions. The documentation has been updated to encourage overriders to handle `*args, **kwargs` to prevent this happening again.
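A forward-compatible override following that recommendation might look like the sketch below. The classes here are standalone stand-ins (not kani code) that illustrate how accepting `*args, **kwargs` lets new parameters such as `tool_call_id` pass through without breaking the subclass.

```python
import asyncio


class BaseKani:
    """Stand-in for the base class whose signature gained a new parameter."""

    async def do_function_call(self, call, *args, **kwargs):
        return f"base handled {call}"


class MyKani(BaseKani):
    async def do_function_call(self, call, *args, **kwargs):
        # do any custom work here, then delegate -- parameters the subclass
        # doesn't know about (e.g. tool_call_id) pass through untouched
        return await super().do_function_call(call, *args, **kwargs)


# the new keyword argument does not break the override
result = asyncio.run(MyKani().do_function_call("get_weather", tool_call_id="call_1"))
```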

New Features

kani can now handle making multiple function calls in parallel if the model requests it. Rather than returning an ASSISTANT message with a single `function_call`, an engine can now return a list of `tool_calls`. kani will resolve these tool calls in parallel using asyncio, and add their results to the chat history in the order of the list provided.

Returning a single `function_call` will continue to work for backwards compatibility.
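The parallel-resolution idea can be sketched with plain asyncio. This is a simplified illustration, not kani internals: `call_tool` is a hypothetical stand-in for executing one requested tool call, and `asyncio.gather` both runs the calls concurrently and preserves the order of the input list in its results.

```python
import asyncio


async def call_tool(name: str, delay: float) -> str:
    """Stand-in for executing a single requested tool call."""
    await asyncio.sleep(delay)
    return f"{name} done"


async def resolve_tool_calls(tool_calls):
    # gather runs the coroutines concurrently but returns results in the
    # order of the input list, regardless of which finishes first
    results = await asyncio.gather(*(call_tool(n, d) for n, d in tool_calls))
    return list(results)


# the second call finishes first, but results keep the requested order
history = asyncio.run(resolve_tool_calls([("get_weather", 0.02), ("get_time", 0.01)]))
```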

0.5.1

- OpenAI: The OpenAIClient (internal class used by OpenAIEngine) now expects `OpenAIChatMessage`s as input rather than `kani.ChatMessage` in order to better type-validate API requests
- OpenAI: Updated token estimation to better reflect current token counts returned by the API

0.5.0

New Feature: Message Parts API

The Message Parts API is intended to provide a foundation for future multimodal LLMs and other engines that require engine-specific input without compromising kani's model-agnostic design. This is accomplished by allowing `ChatMessage.content` to be a list of `MessagePart` objects, in addition to a string.

*This change is fully backwards-compatible and will not affect existing code.*

When writing code with compatibility in mind, the `ChatMessage` class exposes `ChatMessage.text` (always a string or None) and `ChatMessage.parts` (always a list of message parts), which we recommend using instead of `ChatMessage.content`. These properties are dynamically generated based on the underlying content, and it is safe to mix messages with different content types in a single Kani.
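The behavior of those two properties can be sketched as follows. This mirrors the documented semantics but is not kani's implementation; `TextPart` is a hypothetical part class for illustration.

```python
class MessagePart:
    """Base class for engine-specific message parts."""


class TextPart(MessagePart):
    def __init__(self, text: str):
        self.text = text

    def __str__(self) -> str:
        return self.text


class ChatMessage:
    def __init__(self, content):
        self.content = content  # str, None, or list[MessagePart]

    @property
    def parts(self) -> list:
        """Always a list of parts, whatever the underlying content type."""
        if self.content is None:
            return []
        if isinstance(self.content, str):
            return [TextPart(self.content)]
        return self.content

    @property
    def text(self):
        """Always a string (or None), flattening any parts to text."""
        if self.content is None:
            return None
        if isinstance(self.content, str):
            return self.content
        return "".join(str(p) for p in self.content)


# string-content and part-content messages can be mixed freely
m1 = ChatMessage("hello")
m2 = ChatMessage([TextPart("a picture of "), TextPart("a cat")])
```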

Generally, message part classes are defined by an engine, and consumed by the developer. Message parts can be used in any role’s message - for example, you might use a message part in an assistant message to separate out a chain of thought from a user reply, or in a user message to supply an image to a multimodal model.

For more information, see the [Message Parts documentation](https://kani.readthedocs.io/en/latest/advanced.html#message-parts).

*Up next: we're adding support for multimodal vision-language models like LLaVA and GPT-Vision through a kani extension!*

Improvements
- LLaMA 2: Improved the prompting in non-strict mode to group consecutive user/system messages into a single `[INST]` wrapper. See [the tests](https://github.com/zhudotexe/kani/blob/main/tests/test_llama2_prompt.py) for how kani translates consecutive message types into the LLaMA prompt.
- Other documentation and minor improvements

0.4.0

BREAKING CHANGES
- `Kani.full_round` now emits *every* message generated during the round, not just assistant messages
  - This means that you will need to handle `FUNCTION` messages, and potentially `SYSTEM` messages from a function exception handler.
  - `Kani.full_round_str`'s default behaviour is unchanged.
- `Kani.full_round_str` now takes a `message_formatter` rather than a `function_call_formatter`
  - By default, this formatter returns only the contents of `ASSISTANT` messages.
- `Kani.do_function_call` now returns a `FunctionCallResult` rather than a `bool`
  - To migrate any overriding functions:
    - Rather than calling `Kani.add_to_history` in the override, save the `ChatMessage` to a variable
    - Update the return value from a boolean to `FunctionCallResult(is_model_turn=<old return value>, message=<message from above>)`
- `Kani.handle_function_call_exception` now returns an `ExceptionHandleResult` rather than a `bool`
  - To migrate any overriding functions:
    - Rather than calling `Kani.add_to_history` in the override, save the `ChatMessage` to a variable
    - Update the return value from a boolean to `ExceptionHandleResult(should_retry=<old return value>, message=<message from above>)`
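A before/after sketch of migrating an overridden `do_function_call` is shown below. `FunctionCallResult` here is a stand-in dataclass with the fields the changelog describes, and the tool-running logic is hypothetical; only the shape of the change is the point.

```python
import asyncio
from dataclasses import dataclass


@dataclass
class FunctionCallResult:
    is_model_turn: bool
    message: str  # in kani this is a ChatMessage; a str keeps the sketch small


# 0.3.x style (returned a bool and added to history itself):
#
#     async def do_function_call(self, call):
#         result_msg = await run_tool(call)
#         self.add_to_history(result_msg)
#         return True
#
# 0.4.0 style: save the message and return it inside the result object.
async def do_function_call(call):
    result_msg = f"result of {call}"  # stand-in for running the tool
    return FunctionCallResult(is_model_turn=True, message=result_msg)


res = asyncio.run(do_function_call("get_weather"))
```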

Improvements
- Added `kani.utils.message_formatters`
- Added `kani.ExceptionHandleResult` and `kani.FunctionCallResult`
- Documentation improvements

Fixes
- Fixed an issue where `ChatMessage.copy_with` could cause unset values to appear in JSON serializations

0.3.4

Improvements
- Updated dependencies to allow more recent versions
- The documentation now shows fully-qualified class names in reference sections
- Added `.copy_with` method to ChatMessage and FunctionCall to make updating chat history easier
- Various documentation updates
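The `.copy_with` pattern can be sketched with `dataclasses.replace`: produce an updated copy of an immutable message instead of mutating chat history in place. This mirrors the idea only; kani's actual implementation may differ.

```python
from dataclasses import dataclass, replace


@dataclass(frozen=True)
class ChatMessage:
    role: str
    content: str

    def copy_with(self, **kwargs):
        """Return a copy of this message with the given fields replaced."""
        return replace(self, **kwargs)


orig = ChatMessage(role="assistant", content="draft answer")
edited = orig.copy_with(content="final answer")
```

Because the original is untouched, entries already in the chat history stay consistent while the edited copy is swapped in where needed.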

0.3.3

Improvements
- Added a warning in `Kani.chat_round` to use `Kani.full_round` when AI functions are defined
- Added examples in Google Colab
- Other documentation improvements

Fixes
- Fixed an issue where the ctransformers engine could overrun its context length (e.g. see https://github.com/zhudotexe/kani/actions/runs/6152842183/job/16695721588)
