Magentic

Latest version: v0.32.0


0.11.0

What's Changed
* Add support for Azure via OpenaiChatModel by jackmpcollins in https://github.com/jackmpcollins/magentic/pull/65


**Full Changelog**: https://github.com/jackmpcollins/magentic/compare/v0.10.0...v0.11.0
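
---

Example of using Azure via the openai backend (a minimal sketch: the `api_type` parameter follows the README's backend configuration docs, and the deployment name and Azure environment variables shown here are assumptions):

```python
from magentic import prompt
from magentic.chat_model.openai_chat_model import OpenaiChatModel


# Assumes the environment is already configured for Azure OpenAI,
# e.g. AZURE_OPENAI_API_KEY, AZURE_OPENAI_ENDPOINT, OPENAI_API_VERSION.
@prompt(
    "Say hello",
    model=OpenaiChatModel("gpt-35-turbo", api_type="azure"),
)
def say_hello() -> str:
    ...


say_hello()
```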

0.10.0

What's Changed
* Update openai to v1 by jackmpcollins in https://github.com/jackmpcollins/magentic/pull/59


**Full Changelog**: https://github.com/jackmpcollins/magentic/compare/v0.9.1...v0.10.0

0.9.1

[Bound openai version <1.0](https://github.com/jackmpcollins/magentic/commit/601ed483586b4112c2fcc5b4f1ec8bdd4301fc24)

**Full Changelog**: https://github.com/jackmpcollins/magentic/compare/v0.9.0...v0.9.1

0.9.0

What's Changed
* Add LiteLLM backend by jackmpcollins in https://github.com/jackmpcollins/magentic/pull/54


**Full Changelog**: https://github.com/jackmpcollins/magentic/compare/v0.8.0...v0.9.0

---

Example of using the LiteLLM backend:

```python
from magentic import prompt
from magentic.chat_model.litellm_chat_model import LitellmChatModel


@prompt(
    "Talk to me! ",
    model=LitellmChatModel("ollama/llama2"),
)
def say_hello() -> str:
    ...


say_hello()
```


See the [Backend/LLM Configuration section of the README](https://github.com/jackmpcollins/magentic/tree/5b4b1012e630bd01baa0ef8bf3caf86fb68ae993#backendllm-configuration) for how to set the LiteLLM backend as the default.
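For example, a sketch of making LiteLLM the default via environment variables (variable names per that README section; set them before magentic is imported so its settings pick them up):

```python
import os

# Variable names assumed from the README's Backend/LLM Configuration section.
os.environ["MAGENTIC_BACKEND"] = "litellm"
os.environ["MAGENTIC_LITELLM_MODEL"] = "ollama/llama2"

from magentic import prompt


# No model argument needed now: the LiteLLM backend is the default.
@prompt("Talk to me! ")
def say_hello() -> str:
    ...


say_hello()
```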

0.8.0

What's Changed
* Make backend configurable by jackmpcollins in https://github.com/jackmpcollins/magentic/pull/46
* Bump urllib3 from 2.0.6 to 2.0.7 by dependabot in https://github.com/jackmpcollins/magentic/pull/47
* Replace black with ruff formatter by jackmpcollins in https://github.com/jackmpcollins/magentic/pull/48
* Handle pydantic generic BaseModel in name_type and function schema by jackmpcollins in https://github.com/jackmpcollins/magentic/pull/52
* Allow ChatModel to be set with context manager by jackmpcollins in https://github.com/jackmpcollins/magentic/pull/53


**Full Changelog**: https://github.com/jackmpcollins/magentic/compare/v0.7.2...v0.8.0
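
---

PR #52 makes generic pydantic models usable as prompt-function return types. A minimal sketch of what this enables (the `Pair` model and prompt text here are hypothetical, for illustration only):

```python
from typing import Generic, TypeVar

from pydantic import BaseModel

from magentic import prompt

T = TypeVar("T")


class Pair(BaseModel, Generic[T]):
    first: T
    second: T


# The LLM output is parsed into the parametrized model Pair[int].
@prompt("Give me two odd numbers between 1 and {n}.")
def pick_odd_numbers(n: int) -> Pair[int]:
    ...
```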

---

Allow the chat model/LLM to be set using a context manager. This makes it easy to reuse the same prompt-function with different models, and neater to set the model dynamically.


```python
from magentic import OpenaiChatModel, prompt


@prompt("Say hello")
def say_hello() -> str:
    ...


@prompt(
    "Say hello",
    model=OpenaiChatModel("gpt-4", temperature=1),
)
def say_hello_gpt4() -> str:
    ...


say_hello()  # Uses env vars or default settings

with OpenaiChatModel("gpt-3.5-turbo"):
    say_hello()  # Uses gpt-3.5-turbo due to context manager
    say_hello_gpt4()  # Uses gpt-4 with temperature=1 because explicitly configured
```

0.7.2

What's Changed
* Allow setting max_tokens param by jackmpcollins in https://github.com/jackmpcollins/magentic/pull/45


**Full Changelog**: https://github.com/jackmpcollins/magentic/compare/v0.7.1...v0.7.2

---

Allow setting the `max_tokens` param on `OpenaiChatModel`. The default value can also be set via the `MAGENTIC_OPENAI_MAX_TOKENS` environment variable.

Example:

```python
from magentic import prompt
from magentic.chat_model.openai_chat_model import OpenaiChatModel


@prompt("Hello, how are you?", model=OpenaiChatModel(max_tokens=3))
def test() -> str:
    ...


test()
# 'Hello! I'
```
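
As mentioned above, the default can instead be set with the `MAGENTIC_OPENAI_MAX_TOKENS` environment variable (a sketch; set it before magentic is imported so its settings pick it up):

```python
import os

os.environ["MAGENTIC_OPENAI_MAX_TOKENS"] = "3"  # default max_tokens for the openai backend

from magentic import prompt


@prompt("Hello, how are you?")
def test() -> str:
    ...
```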
