Magentic


0.37.0

What's Changed

The `prompt_chain` decorator can now accept a sequence of `Message` as input, like `chatprompt`.

```python
from magentic import prompt_chain, UserMessage


def get_current_weather(location, unit="fahrenheit"):
    """Get the current weather in a given location"""
    return {"temperature": "72", "forecast": ["sunny", "windy"]}


@prompt_chain(
    template=[UserMessage("What's the weather like in {city}?")],
    functions=[get_current_weather],
)
def describe_weather(city: str) -> str: ...


describe_weather("Boston")
# 'The weather in Boston is currently 72°F with sunny and windy conditions.'
```


PRs
* Allow Messages as input to prompt_chain by jackmpcollins in https://github.com/jackmpcollins/magentic/pull/403


**Full Changelog**: https://github.com/jackmpcollins/magentic/compare/v0.36.0...v0.37.0

0.36.0

What's Changed

Document the `Chat` class and make it importable from the top level.
docs: https://magentic.dev/chat/

```python
from magentic import Chat, OpenaiChatModel, UserMessage

# Create a new Chat instance
chat = Chat(
    messages=[UserMessage("Say hello")],
    model=OpenaiChatModel("gpt-4o"),
)

# Append a new user message
chat = chat.add_user_message("Actually, say goodbye!")
print(chat.messages)
# [UserMessage('Say hello'), UserMessage('Actually, say goodbye!')]

# Submit the chat to the LLM to get a response
chat = chat.submit()
print(chat.last_message.content)
# 'Hello! Just kidding—goodbye!'
```


PRs
* Use public import for ChatCompletionStreamState by jackmpcollins in https://github.com/jackmpcollins/magentic/pull/398
* Make Chat class public and add docs by jackmpcollins in https://github.com/jackmpcollins/magentic/pull/401
* Remove unused content None from openai messages by jackmpcollins in https://github.com/jackmpcollins/magentic/pull/402


**Full Changelog**: https://github.com/jackmpcollins/magentic/compare/v0.35.0...v0.36.0

0.35.0

What's Changed

`UserMessage` now accepts image urls, image bytes, and document bytes directly using the `ImageUrl`, `ImageBytes`, and `DocumentBytes` types.

Example of new `UserMessage` syntax and `DocumentBytes`

```python
from pathlib import Path

from magentic import chatprompt, DocumentBytes, Placeholder, UserMessage
from magentic.chat_model.anthropic_chat_model import AnthropicChatModel


@chatprompt(
    UserMessage(
        [
            "Repeat the contents of this document.",
            Placeholder(DocumentBytes, "document_bytes"),
        ]
    ),
    model=AnthropicChatModel("claude-3-5-sonnet-20241022"),
)
def read_document(document_bytes: bytes) -> str: ...


document_bytes = Path("...").read_bytes()
read_document(document_bytes)
# 'This is a test PDF.'
```

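The example above uses `DocumentBytes` with `Placeholder`; the new direct syntax also lets these types be placed straight into a `UserMessage`. A minimal sketch, assuming `ImageUrl` is importable from the top level like `DocumentBytes`, with an illustrative image URL:

```python
from magentic import chatprompt, ImageUrl, UserMessage


@chatprompt(
    UserMessage(
        [
            "Describe this image in one sentence.",
            # ImageUrl passed directly in the message content (illustrative URL)
            ImageUrl("https://example.com/image.jpg"),
        ]
    ),
)
def describe_image() -> str: ...


describe_image()
```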

PRs
* Accept Sequence[Message] instead of list for Chat by alexchandel in https://github.com/jackmpcollins/magentic/pull/390
* Bump astral-sh/setup-uv from 4 to 5 by dependabot in https://github.com/jackmpcollins/magentic/pull/393
* Support images directly in UserMessage by jackmpcollins in https://github.com/jackmpcollins/magentic/pull/387
* Add DocumentBytes for submitting PDF documents by jackmpcollins in https://github.com/jackmpcollins/magentic/pull/395

New Contributors
* alexchandel made their first contribution in https://github.com/jackmpcollins/magentic/pull/390

**Full Changelog**: https://github.com/jackmpcollins/magentic/compare/v0.34.1...v0.35.0

0.34.1

What's Changed
* Consume LLM output stream via returned objects to allow caching by jackmpcollins in https://github.com/jackmpcollins/magentic/pull/384
* Improve ruff format/lint rules by jackmpcollins in https://github.com/jackmpcollins/magentic/pull/385
* Update overview and configuration docs by jackmpcollins in https://github.com/jackmpcollins/magentic/pull/386


**Full Changelog**: https://github.com/jackmpcollins/magentic/compare/v0.34.0...v0.34.1

0.34.0

What's Changed

Add `StreamedResponse` and `AsyncStreamedResponse` to enable parsing responses that contain both text _and_ tool calls. See PR https://github.com/jackmpcollins/magentic/pull/383 or the new docs (copied below) https://magentic.dev/streaming/#StreamedResponse for more details.

⚡ StreamedResponse

Some LLMs have the ability to generate text output and make tool calls in the same response. This allows them to perform chain-of-thought reasoning or provide additional context to the user. In magentic, the `StreamedResponse` (or `AsyncStreamedResponse`) class can be used to request this type of output. This object is an iterable of `StreamedStr` (or `AsyncStreamedStr`) and `FunctionCall` instances.

!!! warning "Consuming StreamedStr"

    The StreamedStr object must be iterated over before the next item in the `StreamedResponse` is processed, otherwise the string output will be lost. This is because the `StreamedResponse` and `StreamedStr` share the same underlying generator, so advancing the `StreamedResponse` iterator skips over the `StreamedStr` items. The `StreamedStr` object has internal caching so after iterating over it once the chunks will remain available.

In the example below, we request that the LLM generate a greeting and then call a function to get the weather for two cities. The `StreamedResponse` object is then iterated over to print the output, with the `StreamedStr` and `FunctionCall` items processed separately.

```python
from magentic import prompt, FunctionCall, StreamedResponse, StreamedStr


def get_weather(city: str) -> str:
    return f"The weather in {city} is 20°C."


@prompt(
    "Say hello, then get the weather for: {cities}",
    functions=[get_weather],
)
def describe_weather(cities: list[str]) -> StreamedResponse: ...


response = describe_weather(["Cape Town", "San Francisco"])
for item in response:
    if isinstance(item, StreamedStr):
        for chunk in item:
            # print the chunks as they are received
            print(chunk, sep="", end="")
        print()
    if isinstance(item, FunctionCall):
        # print the function call, then call it and print the result
        print(item)
        print(item())

# Hello! I'll get the weather for Cape Town and San Francisco for you.
# FunctionCall(<function get_weather at 0x1109825c0>, 'Cape Town')
# The weather in Cape Town is 20°C.
# FunctionCall(<function get_weather at 0x1109825c0>, 'San Francisco')
# The weather in San Francisco is 20°C.
```

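For async usage, the same pattern applies with `AsyncStreamedResponse` and `AsyncStreamedStr`. A minimal sketch, assuming both types are importable from the top level and that the async prompt-function is awaited before iterating:

```python
import asyncio

from magentic import prompt, AsyncStreamedResponse, AsyncStreamedStr, FunctionCall


def get_weather(city: str) -> str:
    return f"The weather in {city} is 20°C."


@prompt(
    "Say hello, then get the weather for: {cities}",
    functions=[get_weather],
)
async def describe_weather(cities: list[str]) -> AsyncStreamedResponse: ...


async def main() -> None:
    response = await describe_weather(["Cape Town", "San Francisco"])
    async for item in response:
        if isinstance(item, AsyncStreamedStr):
            # Consume the streamed text before advancing to the next item
            async for chunk in item:
                print(chunk, end="")
            print()
        if isinstance(item, FunctionCall):
            # Print the function call, then call it and print the result
            print(item)
            print(item())


asyncio.run(main())
```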


PRs

* Test Ollama via `OpenaiChatModel` by jackmpcollins in https://github.com/jackmpcollins/magentic/pull/281
* Rename test to test_openai_chat_model_acomplete_ollama by jackmpcollins in https://github.com/jackmpcollins/magentic/pull/381
* Add `(Async)StreamedResponse` for multi-part responses by jackmpcollins in https://github.com/jackmpcollins/magentic/pull/383


**Full Changelog**: https://github.com/jackmpcollins/magentic/compare/v0.33.0...v0.34.0

0.33.0

What's Changed

> [!WARNING]
> Breaking change: The prompt-function return type and the `output_types` argument to `ChatModel` must now contain `FunctionCall` or `(Async)ParallelFunctionCall` if these return types are desired. Previously, instances of these types could be returned even if they were not indicated in the output types.

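As an illustration of the new requirement, a prompt-function that may return a tool call must now include `FunctionCall` in its return annotation. A minimal sketch (the `activate_oven` function and prompt text are illustrative, not from this release):

```python
from magentic import prompt, FunctionCall


def activate_oven(temperature: int, mode: str) -> str:
    """Turn the oven on with the provided settings."""
    return f"Oven set to {temperature}F with mode '{mode}'."


# The return annotation must now list FunctionCall explicitly for the
# LLM's tool call to be returned as a FunctionCall object.
@prompt(
    "Prepare the oven so I can make {food}",
    functions=[activate_oven],
)
def configure_oven(food: str) -> FunctionCall[str] | str: ...
```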
- Dependency updates
- Improve development workflows
- Big internal refactor to prepare for future features. See PR https://github.com/jackmpcollins/magentic/pull/380 for details.

PRs

* Bump logfire-api from 0.49.0 to 0.52.0 by dependabot in https://github.com/jackmpcollins/magentic/pull/327
* Bump litellm from 1.41.21 to 1.44.27 by dependabot in https://github.com/jackmpcollins/magentic/pull/330
* Bump jupyterlab from 4.2.3 to 4.2.5 by dependabot in https://github.com/jackmpcollins/magentic/pull/322
* Bump anthropic from 0.31.0 to 0.34.2 by dependabot in https://github.com/jackmpcollins/magentic/pull/328
* Bump pydantic-settings from 2.3.4 to 2.5.2 by dependabot in https://github.com/jackmpcollins/magentic/pull/332
* Bump notebook from 7.2.1 to 7.2.2 by dependabot in https://github.com/jackmpcollins/magentic/pull/333
* Bump ruff from 0.5.2 to 0.6.5 by dependabot in https://github.com/jackmpcollins/magentic/pull/331
* Bump jupyter from 1.0.0 to 1.1.1 by dependabot in https://github.com/jackmpcollins/magentic/pull/335
* Bump logfire-api from 0.52.0 to 0.53.0 by dependabot in https://github.com/jackmpcollins/magentic/pull/336
* Bump mkdocs-jupyter from 0.24.8 to 0.25.0 by dependabot in https://github.com/jackmpcollins/magentic/pull/338
* Bump pytest-asyncio from 0.23.7 to 0.24.0 by dependabot in https://github.com/jackmpcollins/magentic/pull/337
* Update precommit hooks by jackmpcollins in https://github.com/jackmpcollins/magentic/pull/339
* Switch to uv from poetry by jackmpcollins in https://github.com/jackmpcollins/magentic/pull/373
* Bump astral-sh/setup-uv from 2 to 3 by dependabot in https://github.com/jackmpcollins/magentic/pull/374
* Use VCR for tests by jackmpcollins in https://github.com/jackmpcollins/magentic/pull/375
* Add CONTRIBUTING.md by jackmpcollins in https://github.com/jackmpcollins/magentic/pull/376
* Make VCR match on request body in tests by jackmpcollins in https://github.com/jackmpcollins/magentic/pull/377
* Add make help command by jackmpcollins in https://github.com/jackmpcollins/magentic/pull/378
* Bump astral-sh/setup-uv from 3 to 4 by dependabot in https://github.com/jackmpcollins/magentic/pull/379
* Refactor to reuse stream parsing across ChatModels by jackmpcollins in https://github.com/jackmpcollins/magentic/pull/380


**Full Changelog**: https://github.com/jackmpcollins/magentic/compare/v0.32.0...v0.33.0
