openai-messages-token-helper

Latest version: v0.1.5


0.1.5

- Remove spurious `print` call when counting tokens for function calling.

0.1.4

- Add support and tests for gpt-4o, which has a different tokenizer.
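The new tokenizer matters because token counts differ between encodings: gpt-4o uses `o200k_base`, while earlier GPT-4 and GPT-3.5 models use `cl100k_base` (per tiktoken). A minimal sketch of that mapping; the `encoding_for_model` helper here is illustrative, not the package's API:

```python
# Encoding names follow tiktoken's conventions; the mapping below covers
# only a few models for illustration.
MODEL_ENCODINGS = {
    "gpt-4o": "o200k_base",
    "gpt-4": "cl100k_base",
    "gpt-3.5-turbo": "cl100k_base",
}


def encoding_for_model(model: str) -> str:
    """Return the encoding name for a model (hypothetical helper)."""
    try:
        return MODEL_ENCODINGS[model]
    except KeyError:
        raise ValueError(f"Unknown model: {model}") from None
```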

0.1.3

- Use openai type annotations for more precise type hints, and add a typing test.

0.1.2

- Add `py.typed` file so that mypy can find the type hints in this package.

0.1.0

- Add `count_tokens_for_system_and_tools` to count tokens for the system message and tools. You should count the tokens for both together, since the token count for tools varies based on whether a system message is provided.
- Updated `build_messages` to allow for `tools` and `tool_choice` to be passed in.
- Breaking change: Changed `new_user_message` to `new_user_content` in `build_messages` for clarity.
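Why system message and tools must be counted together can be shown with a toy counter; this is an invented illustration of the rationale (one "token" per word, made-up overhead values), not the package's actual implementation:

```python
def count_tokens_for_system_and_tools(system_message=None, tools=None):
    """Toy counter: 1 'token' per whitespace-separated word."""
    words = lambda text: len(text.split())
    total = 0
    if system_message:
        total += words(system_message)
    if tools:
        # Tools are serialized into the prompt, and the serialization
        # overhead differs depending on whether a system message is also
        # present -- which is why the two must be counted together rather
        # than summed independently. (Overhead values here are invented.)
        total += sum(words(t["description"]) for t in tools)
        total += 2 if system_message else 5
    return total
```

With this toy model, counting the system message and the tools separately and adding the results gives a different (wrong) answer than counting them together, which mirrors the behavior this function guards against.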

0.0.6

- Add keyword argument `fallback_to_default` to `build_messages` function to allow for defaulting to the CL100k token encoder and minimum GPT token limit if the model is not found.
- Fixed handling of the `past_messages` argument of `build_messages` so that the last past message is no longer skipped. (The new user message should *not* be included in `past_messages`.)
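The `fallback_to_default` behavior can be sketched roughly like this; the model names, limits, and `get_token_limit` helper are illustrative (the package's real table covers more models), but the shape of the logic matches the changelog entry: known models get their own limit, unknown models either fall back to the minimum GPT limit or raise:

```python
# Illustrative per-model context-window limits (not the package's full table).
MODEL_TOKEN_LIMITS = {
    "gpt-3.5-turbo": 4096,
    "gpt-4": 8192,
}
DEFAULT_TOKEN_LIMIT = min(MODEL_TOKEN_LIMITS.values())  # minimum GPT limit


def get_token_limit(model: str, fallback_to_default: bool = False) -> int:
    """Return the token limit for a model, optionally defaulting."""
    if model in MODEL_TOKEN_LIMITS:
        return MODEL_TOKEN_LIMITS[model]
    if fallback_to_default:
        # Unknown model (e.g. a custom deployment name): fall back to the
        # smallest known limit rather than raising.
        return DEFAULT_TOKEN_LIMIT
    raise ValueError(f"Unknown model: {model}")
```

The fallback is useful with Azure OpenAI, where the deployment name passed as `model` may not match any known OpenAI model name.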
