Upgrade LiteLLM -> 1.30.1 for Claude3 Support
Use it like this (a one-shot example; generalize from here):
First install the `litellm` extra, e.g. `pip install "langroid[litellm]"`.
Ensure you have `ANTHROPIC_API_KEY` set in your environment (e.g. in the `.env` file or via an env var setting).
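For example, the key can be set in your shell session or persisted in a `.env` file at the project root (the key value below is a placeholder):

```shell
# set the key for the current shell session (placeholder value)
export ANTHROPIC_API_KEY=your-anthropic-key

# or persist it in a .env file at the project root
echo 'ANTHROPIC_API_KEY=your-anthropic-key' >> .env
```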
Then specify `chat_model` as `litellm/<model_name>`:
```python
import langroid as lr
import langroid.language_models as lm

llm_config = lm.OpenAIGPTConfig(
    chat_model="litellm/claude-3-sonnet-20240229",
    chat_context_length=16_000,  # adjust based on model
)

# use the LLM directly
llm = lm.OpenAIGPT(llm_config)
llm.chat("What is 4+5?")

# or wrap the LLM config in a ChatAgent, which keeps conversation state
agent_config = lr.ChatAgentConfig(llm=llm_config)
agent = lr.ChatAgent(agent_config)
agent.llm_response("When was Beethoven born?")
agent.llm_response("And Chopin?")

# or wrap the agent in a Task and run an interactive loop
task = lr.Task(agent, interactive=True)
task.run()
```
See the LiteLLM docs for the model-name syntax for other Claude models:
https://docs.litellm.ai/docs/providers/anthropic
Note that many of the example scripts accept a `-m <model_name>` option to override the default `GPT4Turbo`. For Claude models you can pass `-m litellm/claude-3-sonnet-20240229`.
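For instance, an invocation might look like this (the script path is illustrative; substitute whichever example script you want to run):

```shell
# run an example script against Claude 3 Sonnet via LiteLLM
# (examples/basic/chat.py is a hypothetical path)
python examples/basic/chat.py -m litellm/claude-3-sonnet-20240229
```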