Gemini support via litellm. See docs https://langroid.github.io/langroid/tutorials/non-openai-llms/
Essentially, you need to:
- install `langroid` with the `litellm` extra, e.g. `pip install "langroid[litellm]"`
- set up your `GEMINI_API_KEY` in the `.env` file or shell env
- specify `chat_model="litellm/gemini/gemini-1.5-pro-latest"`
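The steps above can be sketched as a minimal config fragment. This follows the `OpenAIGPTConfig` pattern described in the linked docs; verify the exact class and field names against your installed langroid version:

```python
import os
import langroid.language_models as lm

# assumes GEMINI_API_KEY is already set in .env or the shell environment
assert os.getenv("GEMINI_API_KEY"), "set GEMINI_API_KEY first"

# route the chat model through litellm to Gemini
llm_config = lm.OpenAIGPTConfig(
    chat_model="litellm/gemini/gemini-1.5-pro-latest",
)
llm = lm.OpenAIGPT(llm_config)
```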
0.1.240
QdrantDB, QdrantDBConfig: support for sparse embeddings
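A hedged sketch of what enabling sparse embeddings might look like. The field names below (`use_sparse_embeddings`, `sparse_embedding_model`) and the model name are assumptions based only on this release note, so check the `QdrantDBConfig` source in 0.1.240+ for the actual fields:

```python
from langroid.vector_store.qdrantdb import QdrantDB, QdrantDBConfig

# field names are illustrative, not confirmed; verify against QdrantDBConfig
config = QdrantDBConfig(
    collection_name="my-docs",
    use_sparse_embeddings=True,        # assumed flag for the new sparse support
    sparse_embedding_model="naver/splade-v3",  # hypothetical model identifier
)
vecdb = QdrantDB(config)
```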
0.1.239
Change the supported Python range to `>=3.10, <3.12` to avoid Colab errors (previously `>=3.11`)
0.1.238
Minor type-signature change
0.1.237
Upgrade to latest DuckDuckGoSearch
0.1.236
Groq support. Any model <m> hosted on `groq` can be used similarly to how you'd use `ollama`, e.g. `chat_model = "groq/llama3-8b-8192"`. See the groq section here: https://langroid.github.io/langroid/tutorials/local-llm-setup/
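By analogy with the Gemini setup above, a minimal sketch of pointing langroid at a Groq-hosted model. It assumes a `GROQ_API_KEY` environment variable and the `OpenAIGPTConfig` pattern from the linked tutorial; check the docs for your version:

```python
import os
import langroid.language_models as lm

# assumed: Groq credentials live in GROQ_API_KEY, as with other providers
assert os.getenv("GROQ_API_KEY"), "set GROQ_API_KEY first"

llm_config = lm.OpenAIGPTConfig(
    chat_model="groq/llama3-8b-8192",  # any model hosted on groq
)
llm = lm.OpenAIGPT(llm_config)
```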