What's Changed
* Make backend configurable by jackmpcollins in https://github.com/jackmpcollins/magentic/pull/46
* Bump urllib3 from 2.0.6 to 2.0.7 by dependabot in https://github.com/jackmpcollins/magentic/pull/47
* Replace black with ruff formatter by jackmpcollins in https://github.com/jackmpcollins/magentic/pull/48
* Handle pydantic generic BaseModel in name_type and function schema by jackmpcollins in https://github.com/jackmpcollins/magentic/pull/52
* Allow ChatModel to be set with context manager by jackmpcollins in https://github.com/jackmpcollins/magentic/pull/53
**Full Changelog**: https://github.com/jackmpcollins/magentic/compare/v0.7.2...v0.8.0
---
Allow the chat_model/LLM to be set using a context manager. This allows the same prompt-function to be easily reused with different models, and makes it neater to set the model dynamically.
```python
from magentic import OpenaiChatModel, prompt


@prompt("Say hello")
def say_hello() -> str:
    ...


@prompt(
    "Say hello",
    model=OpenaiChatModel("gpt-4", temperature=1),
)
def say_hello_gpt4() -> str:
    ...


say_hello()  # Uses env vars or default settings

with OpenaiChatModel("gpt-3.5-turbo"):
    say_hello()  # Uses gpt-3.5-turbo due to context manager
    say_hello_gpt4()  # Uses gpt-4 with temperature=1 because explicitly configured
```