- fixed typing error in caching (caused problems with Python 3.10+)
- fixed nested context problem
- disabled exception when getting a cache that has not been initialized (reinitializing instead)
0.0.7
- one more fix for langchain deepcopying and context sharing
- fixed not propagating some additional info
0.0.6
- fixed potential problems caused by deepcopying of the handler in langchain, which could break the singleton principle
- fixed cached LLM
- better prompt logging for chat
0.0.5
- fixed dropped llm_output in LlmPrompt metadata
- fixed ignoring the memory parameter for chat models
0.0.4
- fixed typo in CachedChatLLM which caused problems
0.0.3
- support for langchain 0.0.155 (langchain contains breaking changes; use this version of promptwatch to work with langchain>=0.0.155)
- support for semantic caching