This release supports schema input/output, courtesy of OpenAI's new function calling feature!
## Schema
```py3
from pydantic import BaseModel, Field

from simpleaichat import AIChat

ai = AIChat(
    console=False,
    save_messages=False,  # with schema I/O, messages are never saved
    model="gpt-3.5-turbo-0613",
    params={"temperature": 0.0},
)

class get_event_metadata(BaseModel):
    """Event information"""

    description: str = Field(description="Description of event")
    city: str = Field(description="City where event occurred")
    year: int = Field(description="Year when event occurred")
    month: str = Field(description="Month when event occurred")

# returns a dict, with keys ordered as in the schema
ai("First iPhone announcement", output_schema=get_event_metadata)
```
```txt
{'description': 'The first iPhone was announced by Apple Inc.',
 'city': 'San Francisco',
 'year': 2007,
 'month': 'January'}
```
There's a lot you can do with it, and this implementation is just the beginning.
Notes:
- In all cases, no messages are saved when using a schema, to prevent unintended behavior. For the time being, if you want to chain inputs, you will have to manage the intermediate output yourself.
- Input and output schemas work for normal generation, both sync and async.
- For streaming, only an input schema will work, as streaming structured output would not play nicely.
- Neither is currently supported when working with tools; that still needs further testing.
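Since schema mode never saves messages, chaining means carrying the returned dict into your next prompt yourself. A minimal sketch of one way to do that (the `event` dict below is hard-coded to mirror the example output above; in practice it would come from an `ai(..., output_schema=...)` call):

```python
import json

# Stand-in for the dict returned by ai(..., output_schema=get_event_metadata);
# hard-coded here since a real schema call requires a live API session.
event = {
    "description": "The first iPhone was announced by Apple Inc.",
    "city": "San Francisco",
    "year": 2007,
    "month": "January",
}

# Serialize the intermediate output and embed it in the next prompt,
# since the schema call leaves no saved messages to build on.
followup_prompt = (
    "Given this event metadata:\n"
    + json.dumps(event, indent=2)
    + "\n\nWrite a one-sentence summary of the event."
)

# ai(followup_prompt)  # a normal (non-schema) generation continues from here
```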
Other changes:
- A session context manager was added (#7) for cases where you want to script a temporary conversation, in both sync and async flavors.
- The `finish_reason` is saved from ChatGPT outputs (#14).
- You can now pass an `output_path` to `save_session()`.