Archytas

Latest version: v1.3.11


1.0.5

Added auto-context functionality: an auto-context function can be provided to the agent and is run automatically to update the agent's context before each query. This is useful in dynamic situations where state may change outside the normal conversational flow.
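The auto-context flow can be sketched roughly as follows. The `Agent` class and `auto_context` parameter here are a simplified stand-in for illustration, not the actual archytas API:

```python
from typing import Callable, Optional

class Agent:
    """Toy agent that refreshes its context before every query.
    (Simplified stand-in, not the real archytas Agent.)"""

    def __init__(self, auto_context: Optional[Callable[[], str]] = None):
        self.auto_context = auto_context
        self.context = ""

    def query(self, prompt: str) -> str:
        # Run the auto-context callback before answering, so state that
        # changed outside the conversation is picked up automatically.
        if self.auto_context is not None:
            self.context = self.auto_context()
        return f"[context: {self.context}] {prompt}"

# External state changes between queries are reflected automatically:
state = {"cwd": "/tmp"}
agent = Agent(auto_context=lambda: f"cwd={state['cwd']}")
print(agent.query("list files"))  # -> [context: cwd=/tmp] list files
state["cwd"] = "/home"
print(agent.query("list files"))  # -> [context: cwd=/home] list files
```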

1.0.4

1.0.2

1.0.1

Changes
- Made rich printing optional via the `rich_print` flag or by setting the `DISABLE_RICH_PRINT` environment variable
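How such a toggle typically resolves can be sketched like this. Only the `rich_print` flag and `DISABLE_RICH_PRINT` variable names come from the changelog; the helper name and the set of accepted env-var values are illustrative assumptions:

```python
import os

def rich_print_enabled(rich_print: bool = True) -> bool:
    """Rich output only when the flag is on and DISABLE_RICH_PRINT is not
    set to a truthy value. (The accepted values here are an assumption.)"""
    if os.environ.get("DISABLE_RICH_PRINT", "").lower() in ("1", "true", "yes"):
        return False
    return rich_print
```

The environment variable acts as a global kill switch that overrides the per-call flag.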

1.0.0

Breaking Changes:
- reworked how context messages are created and handled; all context messages are now created with `add_context`, which returns an `id` for the message
- messages and context messages are now `Message` and `ContextMessage` classes, rather than plain dictionaries
- timed messages can be created by setting the `lifetime` argument when creating a context message
- messages can be deleted with the `clear_context` function
- renamed loop controller values to be clearer:
  - `STOP` -> `STOP_SUCCESS`
  - `ERROR` -> `STOP_FATAL`
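The reworked context API can be illustrated with a toy model. The class and method shapes below are assumptions based on the names above, not the actual archytas implementation:

```python
import itertools
from dataclasses import dataclass
from enum import Enum
from typing import Optional

class LoopControl(Enum):
    STOP_SUCCESS = "stop_success"  # formerly STOP
    STOP_FATAL = "stop_fatal"      # formerly ERROR

@dataclass
class Message:
    role: str
    content: str

@dataclass
class ContextMessage(Message):
    id: int = 0
    lifetime: Optional[int] = None  # None means the message is permanent

class AgentContext:
    """Holds context messages keyed by the id that add_context returns."""

    def __init__(self):
        self._ids = itertools.count()
        self.messages = {}

    def add_context(self, content: str, lifetime: Optional[int] = None) -> int:
        # Every context message gets an id, which is returned to the caller
        # so the message can later be deleted with clear_context.
        msg_id = next(self._ids)
        self.messages[msg_id] = ContextMessage("system", content, msg_id, lifetime)
        return msg_id

    def clear_context(self, msg_id: int) -> None:
        self.messages.pop(msg_id, None)

    def tick(self) -> None:
        # Timed messages expire after `lifetime` timesteps.
        for msg_id, msg in list(self.messages.items()):
            if msg.lifetime is not None:
                msg.lifetime -= 1
                if msg.lifetime <= 0:
                    del self.messages[msg_id]
```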

Non-breaking changes:
- created dedicated functions for determining which type of tool a given object is (i.e. a function tool, method tool, class tool, or class tool instance)
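One way such type checks could look is sketched below. The `_is_tool` marker attribute is a hypothetical stand-in for however archytas tags decorated tools:

```python
import inspect

def is_func_tool(obj) -> bool:
    # Plain function decorated as a tool (marker attribute is assumed).
    return inspect.isfunction(obj) and getattr(obj, "_is_tool", False)

def is_method_tool(obj) -> bool:
    # Bound method decorated as a tool.
    return inspect.ismethod(obj) and getattr(obj, "_is_tool", False)

def is_class_tool(obj) -> bool:
    # Class decorated as a tool.
    return inspect.isclass(obj) and getattr(obj, "_is_tool", False)

def is_class_tool_instance(obj) -> bool:
    # Instance of a class tool (not the class itself).
    return not inspect.isclass(obj) and is_class_tool(type(obj))
```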

0.3.0

- added dependency injection:
  - reference to the underlying agent
  - reference to the underlying tool/tool name
  - managing ReAct loop flow
- refactored how decorators modify functions/classes: objects now retain their original functionality, and the properties for the LLM's use are attached
- Added the ability to create custom spinners/functionality while the LLM is "thinking", by passing in a custom Python context manager
- made the LLM accept the API key as either a function argument or an environment variable; the check is done at agent instantiation rather than at import time
- Added context management to chat agent message history:
  - permanent context: messages that permanently stay in the chat
  - timed context: messages that stay in the chat for some number of timesteps
  - managed context: messages that stay in the chat until a deleter callback is called
- Added a `oneshot` method on the agent for accessing the LLM without including any chat history or context
- fixed an issue where the prompt did not update to reflect `ask_user` being set to `False`
- added a python tool for letting the LLM write and run code
- Made `ReActAgent` a subclass of `Agent`
- fixed a bug in prompting so that tools taking a single dict/list argument have it forwarded correctly from the LLM to the function/method
- added a demo video of using archytas with the python tool
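The idea behind the python tool above can be sketched as a capture-and-exec helper. This is illustrative only; the real tool will differ, and any production version needs sandboxing and timeouts:

```python
import contextlib
import io

def python_tool(code: str) -> str:
    """Execute LLM-written Python and return its captured stdout.
    (Illustrative sketch; not the actual archytas tool, and unsafe
    for untrusted code without sandboxing.)"""
    buf = io.StringIO()
    with contextlib.redirect_stdout(buf):
        exec(code, {})  # fresh globals per call
    return buf.getvalue()
```

The captured stdout is what gets handed back to the LLM as the tool's observation, e.g. `python_tool("print(6 * 7)")` returns `"42\n"`.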
