Dandy

Latest version: v0.14.2

0.13.0

Breaking

- Removed the `assistant_str_prompt_to_str` method and its sub-methods from `LlmService`.
    - Everything should be processed through the `LlmBot` going forward.

Features

- Added a new processor called `LlmMap` that uses the `Map` object to easily create contextual relationships to Python objects.
- Debug recorder now shows the JSON schema that was sent to the llm service.
- New decorator called `debug_recorder_to_html(debug_name: str)` that allows you to easily wrap a function or method.
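As a rough sketch of what a decorator like `debug_recorder_to_html(debug_name: str)` can do — this is an illustrative pattern, not Dandy's actual implementation; the HTML layout and output location are assumptions:

```python
import functools
import html
import tempfile
from pathlib import Path


def debug_recorder_to_html(debug_name: str):
    """Illustrative sketch: record a call's inputs and output to an HTML file.

    Mirrors the changelog's decorator name only; the recording logic and the
    temp-dir output path are assumptions for demonstration purposes.
    """
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            result = func(*args, **kwargs)
            report = (
                f"<h1>{html.escape(debug_name)}</h1>"
                f"<pre>args={html.escape(repr(args))}\n"
                f"kwargs={html.escape(repr(kwargs))}\n"
                f"result={html.escape(repr(result))}</pre>"
            )
            out_path = Path(tempfile.gettempdir()) / f"{debug_name}.html"
            out_path.write_text(report, encoding="utf-8")
            return result
        return wrapper
    return decorator


@debug_recorder_to_html("adder_debug")
def add(a, b):
    return a + b


print(add(2, 3))  # 5; also writes adder_debug.html to the temp dir
```

The decorated function behaves exactly as before; the recording is a side effect, which is what makes this pattern convenient to "easily wrap a function or method".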

Changes

- Removed the Contrib Selection llm bot.

Fixes

- Fixed caching to be more robust and to recognize changes within Dandy processors, for more accurate caching.
- Drastically improved the testing of the OpenAI llm service.

0.12.0

Features

- The `BaseLlmBot` now supports `images` and `image_files` in the `process_prompt_to_intel` method.
    - Make sure your `config` uses a compatible llm model that supports vision.
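Vision-capable llm services typically expect images as base64-encoded strings. A minimal, generic sketch of preparing an image file for such a request (the encoding helper and the commented call shape are assumptions, not Dandy's documented API — only the argument names `images`, `image_files`, and `intel_class` come from these release notes):

```python
import base64
import tempfile
from pathlib import Path


def encode_image_file(path) -> str:
    """Base64-encode an image so it can be embedded in a vision request."""
    return base64.b64encode(Path(path).read_bytes()).decode("ascii")


# Write a tiny stand-in "image" so the example runs anywhere.
demo = Path(tempfile.gettempdir()) / "demo_pixel.png"
demo.write_bytes(b"\x89PNG\r\n")

encoded = encode_image_file(demo)

# Hypothetical call shapes (illustrative only):
# bot.process_prompt_to_intel(prompt, intel_class=PhotoIntel, images=[encoded])
# bot.process_prompt_to_intel(prompt, intel_class=PhotoIntel, image_files=[demo])
```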

Fixes

- Fixed the hash key generation process to work with more data types and maintain better consistency.
- Improved testing of the OpenAI llm service.
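One common way to generate hash keys that "work with more data types and maintain better consistency" is to serialize the inputs deterministically before hashing. A minimal sketch of that technique — an assumption about the approach, not Dandy's actual code:

```python
import hashlib
import json


def generate_hash_key(*args, **kwargs) -> str:
    """Illustrative sketch of a consistent hash key over mixed data types.

    Serializing with sort_keys=True makes dict ordering irrelevant, and
    default=repr lets non-JSON types participate without raising.
    """
    payload = json.dumps(
        {"args": args, "kwargs": kwargs},
        sort_keys=True,
        default=repr,
    )
    return hashlib.sha256(payload.encode("utf-8")).hexdigest()


print(generate_hash_key(1, "a", flag={"b": 2, "a": 1}))
```

Because the key depends only on the serialized content, equal inputs always hash to the same key across runs, regardless of dict insertion order.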

0.11.3

Changes

- Refactored the internal project structure to prepare for upcoming AI features (TTS, STT, vision, etc.).

Fixes

- Fixed a bug with clearing non-existent or empty caches.

0.11.2

Notes

- This release was combined with v0.11.1 into a single release.

Features

- Updated example with better use of `Prompt` objects.
- Added `to_markdown_file` method for the `Book` class in the example.
- Updated caching objects to be easier to clear.
    - `dandy.cache.MemoryCache` and `dandy.cache.SqliteCache` now have the class methods `clear` and `destroy`.
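A sketch of the `clear` / `destroy` distinction as class methods — the class names come from the changelog, but everything else (the storage layout, the temp-dir database path) is an assumption for illustration:

```python
import os
import sqlite3
import tempfile


class MemoryCache:
    _store = {}

    @classmethod
    def clear(cls):
        cls._store.clear()  # drop all cached entries

    @classmethod
    def destroy(cls):
        cls._store = {}  # replace the backing dict entirely


class SqliteCache:
    db_path = os.path.join(tempfile.gettempdir(), "dandy_cache_demo.sqlite")

    @classmethod
    def clear(cls):
        # Empty the cache table but keep the database file.
        conn = sqlite3.connect(cls.db_path)
        with conn:
            conn.execute(
                "CREATE TABLE IF NOT EXISTS cache (k TEXT PRIMARY KEY, v TEXT)"
            )
            conn.execute("DELETE FROM cache")
        conn.close()

    @classmethod
    def destroy(cls):
        # Remove the backing database file entirely.
        if os.path.exists(cls.db_path):
            os.remove(cls.db_path)


MemoryCache._store["answer"] = 42
MemoryCache.clear()
print(MemoryCache._store)  # {}
```

Making these class methods means callers can reset a cache without ever constructing an instance, which is convenient in test teardown.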

Fixes

- Added text to global service prompt to improve response quality.
- Fixed a bug with updating non-flat intel objects.

0.11.0

Breaking

- All uses of the `process_prompt_to_intel` method now require you to specify either an `intel_class` or an `intel_object` argument.

Features

- A new example has been created that is much easier to follow and showcases the features of Dandy.
- Added a new `Intel` class called `BaseListIntel` that is used to create an iterable intel object that behaves like a `list`.
- When using `process_prompt_to_intel` you can now submit an `intel_class` or an `intel_object`.
    - Submitting a class will return a new instance of the class.
    - Submitting an object will return a modified copy of the object.
- The method `process_prompt_to_intel` now supports `include_fields` and `exclude_fields`, which allow you to include or exclude specific fields from the intel object or class.
- Caching is now supported through the `cache_to_memory` and `cache_to_sqlite` decorators.
    - Check out the `dandy/default_settings.py` file to see how to configure caching beyond the defaults.
    - The decorator argument `cache_name` can be used to separate the cache objects / files; the default is `dandy`.
    - The decorator argument `limit` sets an upper limit on the number of items that can be cached; the default is in `settings`.

Changes

- Removed the old examples (Cookie Recipe and Pirate Story).
- Exceptions are now divided into two categories: `DandyCriticalException` and `DandyRecoverableException`.
    - Both of these inherit from `DandyException`.
    - `DandyRecoverableException` allows developers to attempt to recover from exceptions safely.
    - `DandyCriticalException` is for exceptions that are unrecoverable and must be handled.
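The split described above can be sketched as a small exception hierarchy; the three class names come from the changelog, while the retry helper is a hypothetical illustration of how a recoverable exception might be used:

```python
class DandyException(Exception):
    """Base class for all Dandy exceptions (per the changelog)."""


class DandyCriticalException(DandyException):
    """Unrecoverable; must be handled by the caller."""


class DandyRecoverableException(DandyException):
    """Safe for developers to catch and recover from."""


def run_with_retry(task, attempts=3):
    """Hypothetical helper: retry only on recoverable exceptions."""
    for attempt in range(attempts):
        try:
            return task()
        except DandyRecoverableException:
            if attempt == attempts - 1:
                raise
    # DandyCriticalException is deliberately not caught: it propagates
    # immediately to the caller.
```

Catching `DandyRecoverableException` narrowly, rather than `DandyException`, keeps critical failures loud while transient ones get retried.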

Fixes

- Updated the `process_to_intel` method used throughout the project to properly reflect the `postfix_system_prompt` argument.
- Added missing return to the `__str__` method in the `Url` class (Thanks Pyright).

0.10.0

Documentation

- We have an initial working documentation website that can be viewed at https://dandysoftware.com
- Our new website is powered by mkdocs, mkdocstrings, mkdocs-include-markdown-plugin and mkdocs-material.

Features

- In the `LLM_CONFIGS` in your settings, the `TYPE`, `HOST`, `PORT`, and `API_KEY` from the `DEFAULT` config will now flow to the other configs if they are not specified.
- Added a `--version` flag to the CLI interface for dandy.
- The OpenAI llm service now uses `json_schema` for the response format.
- The Ollama llm service now uses `json_schema` for the response format.
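The `DEFAULT` fall-through described above amounts to a key-wise merge. A sketch of that behavior — the key names `TYPE`/`HOST`/`PORT`/`API_KEY` come from these release notes, but the merge function and the example config values are assumptions:

```python
def resolve_llm_configs(llm_configs: dict) -> dict:
    """Illustrative sketch: DEFAULT values flow into configs that omit them."""
    default = llm_configs.get("DEFAULT", {})
    inherited_keys = ("TYPE", "HOST", "PORT", "API_KEY")
    resolved = {}
    for name, config in llm_configs.items():
        merged = dict(config)
        if name != "DEFAULT":
            for key in inherited_keys:
                # Only fill in keys the config did not specify itself.
                merged.setdefault(key, default.get(key))
        resolved[name] = merged
    return resolved


configs = resolve_llm_configs({
    "DEFAULT": {"TYPE": "ollama", "HOST": "localhost", "PORT": 11434, "API_KEY": None},
    "FAST": {"MODEL": "llama3.1:8b"},  # inherits TYPE/HOST/PORT/API_KEY
})
print(configs["FAST"]["HOST"])  # localhost
```

Using `setdefault` rather than overwriting means a config can still override any inherited value explicitly.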

Breaking

- Renamed `Bot` to `BaseBot`
- Renamed `Workflow` to `BaseWorkflow`
- The LLM API for Ollama now only works with 0.5.0 or greater.
- The LLM API for OpenAI now only works with gpt-4o-mini or greater.

Changes

- Rebuilt the document structure.

Fixes

- The Dandy CLI now properly creates default settings based on your environment variables.
- Fixed the way the settings are handled so they properly show error messages for missing settings.
