Features
- Added JSON mode support for Gemini provider (see the usage sketch below)
- Fixed optional dependencies in pyproject.toml
- Updated documentation to reflect JSON mode support
- Added provider support matrix with JSON mode column
Documentation
- Added Gemini example with JSON mode support
- Updated provider support matrix to include JSON mode support
- Updated both README files with new features
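As a rough illustration of the new JSON mode, the sketch below shows how it might be requested through the factory. The `AIFactory.create_language_model` call, the `structured={"type": "json"}` option, and the OpenAI-style response shape are assumptions about the current API, not text copied from the docs.

```python
# Hedged sketch of JSON mode with the Gemini provider; the factory call,
# the "structured" option, and the response shape are assumed, not verbatim.
from esperanto import AIFactory

model = AIFactory.create_language_model(
    "gemini",
    "gemini-1.5-flash",
    structured={"type": "json"},  # assumed flag that asks the provider for JSON output
)

messages = [{"role": "user", "content": "List three colors as a JSON array."}]
response = model.chat_complete(messages)
print(response.choices[0].message.content)  # expected to be valid JSON
```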
0.4.1
What's Changed
Bug Fixes
- Standardized LLM provider initialization across all providers
- Fixed max_tokens handling in Anthropic provider
- Converted __init__ to __post_init__ in OpenAI, Groq, and Ollama providers
- Added super().__post_init__() call to ensure proper config initialization
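The initialization change follows the usual dataclass pattern: per-provider setup moves into `__post_init__`, which first calls `super().__post_init__()` so the base class can finish its config handling. A minimal sketch of that pattern, with illustrative class and field names rather than esperanto's actual definitions:

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class LanguageModel:
    """Illustrative base class, not esperanto's actual definition."""
    api_key: Optional[str] = None
    config: dict = field(default_factory=dict)

    def __post_init__(self):
        # Base-class config normalization lives here.
        self.config.setdefault("max_tokens", 850)

@dataclass
class OpenAILanguageModel(LanguageModel):
    model_name: str = "gpt-4o-mini"

    def __post_init__(self):
        # Skipping this call would skip the base config setup entirely.
        super().__post_init__()
        # Provider-specific client setup would follow here.
        self.client = None
```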
0.4.0
Features:
- Add to_langchain() method to LanguageModel base class
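The point of `to_langchain()` is interoperability: a model configured through esperanto can be dropped into existing LangChain code. A hedged usage sketch, where the factory entry point and model arguments are assumptions:

```python
from esperanto import AIFactory  # assumed factory entry point

model = AIFactory.create_language_model("openai", "gpt-4o-mini")

# to_langchain() hands back the equivalent LangChain chat model, so the same
# provider configuration can be reused inside existing LangChain chains.
lc_model = model.to_langchain()
print(lc_model.invoke("Say hello."))
```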
Bug Fixes:
- Properly handle unsupported parameters in Anthropic client
- Fix optional Groq imports
- Fix unexpected options for Anthropic
- Fix import issue for Ollama
Misc:
- Bump version to 0.4.0
0.3.6
What's Changed
* Fix handling of unsupported parameters in Anthropic client
* Add explicit parameter filtering to prevent a TypeError from the proxies argument
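The fix amounts to whitelisting constructor arguments before creating the client, so stray keys such as `proxies` never reach `anthropic.Anthropic`. A rough sketch of the idea, in which the helper name and the allowed-key set are illustrative rather than the package's actual code:

```python
from anthropic import Anthropic

# Illustrative whitelist of constructor kwargs the SDK is known to accept.
_ALLOWED_CLIENT_KWARGS = {"api_key", "base_url", "timeout", "max_retries"}

def make_anthropic_client(**kwargs) -> Anthropic:
    # Drop anything the client does not understand (e.g. a stray "proxies" key)
    # instead of letting Anthropic.__init__ raise TypeError.
    filtered = {k: v for k, v in kwargs.items() if k in _ALLOWED_CLIENT_KWARGS}
    return Anthropic(**filtered)

client = make_anthropic_client(api_key="sk-...", proxies={"https": "http://proxy:8080"})
```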
0.3.5
What's Changed
* Fix Anthropic provider configuration handling
* Make Groq provider tests conditional to support optional dependencies
* Update test configuration for better dependency management
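Conditional tests of this kind typically lean on pytest's `importorskip`, so the Groq suite is skipped rather than failing when the optional dependency is absent. A minimal sketch; the module path, constructor arguments, and test name are assumptions, not the project's actual test code:

```python
import pytest

# Skip the whole module when the optional groq dependency is not installed.
pytest.importorskip("groq")

from esperanto.providers.llm.groq import GroqLanguageModel  # assumed module path

def test_model_name_roundtrip():
    model = GroqLanguageModel(model_name="llama-3.1-8b-instant")  # assumed kwargs
    assert model.model_name == "llama-3.1-8b-instant"
```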
0.3.4
What's Changed
Bug Fixes
* Fixed import issue with Groq provider
  - Removed direct import of GroqLanguageModel from __init__.py
  - Now using lazy loading through the factory pattern, like the other providers
  - Users can now use esperanto without installing the groq package unless they specifically need it
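Lazy loading here means the provider module is only imported when the factory is actually asked for a Groq model, so `import esperanto` works even when the groq package is missing. A simplified sketch of the pattern, with illustrative registry and function names rather than the package's actual code:

```python
import importlib

# Illustrative registry: provider name -> (module path, class name).
_PROVIDERS = {
    "groq": ("esperanto.providers.llm.groq", "GroqLanguageModel"),
    "openai": ("esperanto.providers.llm.openai", "OpenAILanguageModel"),
}

def create_language_model(provider: str, **kwargs):
    # The provider module is imported only when it is actually requested, so a
    # missing optional dependency (like groq) never breaks importing the package.
    module_path, class_name = _PROVIDERS[provider]
    module = importlib.import_module(module_path)
    return getattr(module, class_name)(**kwargs)
```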