New Features
- Expose a system prompt/query wrapper prompt in the service context for open-source LLMs (6647)
- Changed default MyScale index format to `MSTG` (7288)
- Added tracing to chat engines/agents (7304)
- Moved LLM and embeddings to pydantic (7289)
Bug Fixes / Nits
- Improved SQL query parsing (7283)
- Fixed loading `embed_model` from the global service context (7284)
- Limited the langchain version until we migrate to pydantic v2 (7297)
0.8.3
New Features
- Added Knowledge Graph RAG Retriever (7204)
Bug Fixes / Nits
- Accept an `api_key` kwarg in the OpenAI LLM class constructor (7263)
- Fixed `StreamingAgentChatResponse` so that separate instances use separate queue instances (7264)
0.8.2.post1
New Features
- Added support for Rockset as a vector store (7111)
Bug Fixes
- Fixed bug in service context definition that could disable LLM (7261)
0.8.2
New Features
- Enabled the LLM or embedding model to be disabled by setting it to `None` in the service context (7255)
- Resolve nearly any Hugging Face embedding model using the `embed_model="local:<model_name>"` syntax (7255)
- Added async tool-calling support (7239)
Bug Fixes / Nits
- Updated Supabase kwargs for add and query (7103)
- Small tweak to default prompts to allow for more general-purpose queries (7254)
- Made the callback manager optional for `CustomLLM` + docs update (7257)
0.8.1
New Features
- Added `node_postprocessors` to `ContextChatEngine` (7232)
- Added an ensemble query engine tutorial (7247)
Smaller Features
- Allow `EMPTY` API keys for FastChat/local OpenAI API endpoints (7224)