Langchain-decorators

Latest version: v0.6.0


0.0.11

- fixed streaming
- multiple little bugfixes
- option to set the expected generated token count as a hint for LLM selector
- add argument schema option for llm_function
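The "expected generated token count" hint feeds into the LlmSelector introduced in 0.0.10, which picks a model whose context window can fit the prompt plus the expected output. The sketch below illustrates the selection idea only; the model list, names, and `select_model` function are hypothetical, not the library's actual API.

```python
# Hypothetical sketch of context-window-based model selection.
# Models are assumed to be ordered from smallest (cheapest) to
# largest context window; none of these names come from the library.
MODELS = [
    {"name": "small-4k", "context_window": 4096},
    {"name": "medium-16k", "context_window": 16384},
    {"name": "large-32k", "context_window": 32768},
]

def select_model(prompt_tokens: int, expected_generation_tokens: int = 256) -> str:
    """Return the first (smallest) model whose window fits the request."""
    needed = prompt_tokens + expected_generation_tokens
    for model in MODELS:
        if model["context_window"] >= needed:
            return model["name"]
    raise ValueError("Prompt too long for every available model")

print(select_model(3000))        # default hint fits the smallest model
print(select_model(3000, 2000))  # larger generation hint forces a bigger model
```

Passing a larger expected-generation hint can push the selection to a bigger model even when the prompt alone would fit a smaller one, which is why the hint option is useful.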

0.0.10

- async streaming callback support
- LlmSelector for automatic selection of LLM based on the model context window and prompt length

0.0.9

- fix some LLM response scenarios that raised errors
- save AIMessage with the function call in the output wrapper
- fix spurious "out of stream context" logging when streaming is not enabled

0.0.8

- support for parsing via OpenAI functions 🚀
- support for controlling function_call
- add BIG_CONTEXT prompt type
- a ton of bugfixes

0.0.7

- fixed streaming capture
- better handling of missing docstrings for llm_function

0.0.6

- fix some issues with async prompts


© 2024 Safety CLI Cybersecurity Inc. All Rights Reserved.