- Debug recorder correctly renders OpenAI service request messages.
- Added more tests covering complex JSON Schemas.
- Fixed a bug where the `Prompt.prompt` method would not accept strings properly.
0.13.2
Fixes
- Fixed an issue where `BaseIntel` required fields were handled incorrectly when using nested includes and excludes.
0.13.1
Fixes
- Fixed a bug where using include and exclude on intel objects did not properly validate filled and empty fields.
0.13.0
Breaking
- Removed the `assistant_str_prompt_to_str` method and its sub-methods from `LlmService`; everything should be processed through the `LlmBot` going forward.
Features
- Added a new processor called `LlmMap` that uses the `Map` object to easily create contextual relationships to Python objects.
- Debug recorder now shows the JSON schema that was sent to the LLM service.
- Added a new decorator, `debug_recorder_to_html(debug_name: str)`, that lets you easily wrap a function or method.
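The decorator's behavior might look roughly like the sketch below. Only the signature `debug_recorder_to_html(debug_name: str)` comes from the changelog; the recording helper `record_to_html` and the `last_recording` attribute are hypothetical stand-ins for dandy's actual internals:

```python
import functools

def record_to_html(debug_name: str, result) -> str:
    # Hypothetical stand-in for the real HTML debug recorder output.
    return f"<html><body><h1>{debug_name}</h1><pre>{result}</pre></body></html>"

def debug_recorder_to_html(debug_name: str):
    """Decorator factory matching the documented signature: wraps a
    function so each call is recorded under `debug_name`."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            result = func(*args, **kwargs)
            # Store the most recent recording on the wrapper itself.
            wrapper.last_recording = record_to_html(debug_name, result)
            return result
        return wrapper
    return decorator

@debug_recorder_to_html("summarize")
def summarize(text: str) -> str:
    return text[:10]
```

The decorator-factory shape (`debug_recorder_to_html(...)` returning a decorator) follows directly from the documented usage of passing `debug_name` as an argument.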
Changes
- Removed the Contrib Selection llm bot.
Fixes
- Made caching more robust: it now detects changes within dandy processors for more accurate cache invalidation.
- Drastically improved the testing of the OpenAI LLM service.
0.12.0
Features
- The `BaseLlmBot` now supports `images` and `image_files` in the `process_prompt_to_intel` method. Make sure your `config` uses an LLM model that supports vision.
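Vision-capable chat APIs generally expect inline images as base64-encoded data URLs, so image file support usually involves an encoding step like the stdlib sketch below. The helper names are assumptions for illustration, not dandy's internals:

```python
import base64
from pathlib import Path

def encode_image_bytes(data: bytes, fmt: str = "png") -> str:
    """Encode raw image bytes as a base64 data URL, the inline-image
    format most vision-capable chat APIs accept."""
    encoded = base64.b64encode(data).decode("ascii")
    return f"data:image/{fmt};base64,{encoded}"

def encode_image_file(path: str) -> str:
    """Read an image file and encode it, inferring the format from
    the file extension (falling back to png)."""
    p = Path(path)
    return encode_image_bytes(p.read_bytes(), p.suffix.lstrip(".") or "png")
```

This also suggests why a vision-capable model is required: the encoded image is sent alongside the prompt, and text-only models will reject or ignore it.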
Fixes
- Fixed the hash key generation process to work with more data types and maintain better consistency.
- Improved testing of the OpenAI LLM service.
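Consistent hash keys over mixed data types are typically built from a canonical serialization. The sketch below illustrates the general idea and is not dandy's actual implementation:

```python
import hashlib
import json

def generate_hash_key(value) -> str:
    """Produce a stable hash key for mixed data by serializing to
    canonical JSON first: sorted keys and fixed separators mean the
    same logical value always yields the same digest, and default=str
    lets non-JSON types (e.g. datetimes, paths) participate."""
    canonical = json.dumps(
        value, sort_keys=True, separators=(",", ":"), default=str
    )
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()
```

Sorting keys is what gives consistency across dict insertion orders, while the `default=str` fallback is one way to "work with more data types" without failing on unserializable values.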
0.11.3
Changes
- Refactored the internal project structure in preparation for upcoming AI features (TTS, STT, vision, etc.).
Fixes
- Fixed a bug with clearing non-existent or empty caches.