- Support for easy and typed few-shot prompting. Just implement the `few_shot_examples` function inside your prompt:
  ```python
  from typegpt import BaseLLMResponse, FewShotExample, PromptTemplate

  class ExamplePrompt(PromptTemplate):

      class Output(BaseLLMResponse):
          class Ingredient(BaseLLMResponse):
              name: str
              quantity: int

          ingredients: list[Ingredient]

      def system_prompt(self) -> str:
          return "Given a recipe, extract the ingredients."

      def few_shot_examples(self) -> list[FewShotExample[Output]]:
          return [
              FewShotExample(
                  input="Let's take two apples, three bananas, and four oranges.",
                  output=self.Output(ingredients=[
                      self.Output.Ingredient(name="apple", quantity=2),
                      self.Output.Ingredient(name="banana", quantity=3),
                      self.Output.Ingredient(name="orange", quantity=4),
                  ])
              ),
              FewShotExample(
                  input="My recipe requires five eggs and two cups of flour.",
                  output=self.Output(ingredients=[
                      self.Output.Ingredient(name="egg", quantity=5),
                      self.Output.Ingredient(name="flour cups", quantity=2),
                  ])
              ),
          ]

      def user_prompt(self) -> str:
          ...
  ```
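For intuition: few-shot examples of this kind are typically serialized into alternating user/assistant chat messages placed between the system prompt and the real user prompt. Below is a minimal, self-contained sketch of that idea; the `FewShotExample` dataclass and `build_messages` helper here are plain stand-ins for illustration, not TypeGPT's actual serialization code.

```python
# Hypothetical sketch: how a list of few-shot examples is commonly flattened
# into chat messages. Not TypeGPT's real implementation.
from dataclasses import dataclass

@dataclass
class FewShotExample:
    input: str
    output: str  # TypeGPT uses a typed response object; a plain string suffices here

def build_messages(system_prompt: str, examples: list[FewShotExample], user_prompt: str) -> list[dict]:
    """Interleave few-shot examples as user/assistant turns after the system message."""
    messages = [{"role": "system", "content": system_prompt}]
    for example in examples:
        messages.append({"role": "user", "content": example.input})
        messages.append({"role": "assistant", "content": example.output})
    messages.append({"role": "user", "content": user_prompt})
    return messages

msgs = build_messages(
    "Given a recipe, extract the ingredients.",
    [FewShotExample(input="Two apples.", output='{"ingredients": [{"name": "apple", "quantity": 2}]}')],
    "One egg and two cups of flour.",
)
# msgs contains: system, user (example input), assistant (example output), user (real prompt)
```

Because the assistant turns demonstrate the exact output shape, the model sees concrete examples of the format it is expected to produce.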
- You can disable the automatically added system instruction that tells the LLM how to format its output. Only do this if you use few-shot prompting, a fine-tuned model, or instruct the model yourself (not recommended). Use it like this:
  ```python
  class ExamplePrompt(PromptTemplate):
      settings = PromptSettings(disable_formatting_instructions=True)
      ...
  ```
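To see why this instruction becomes redundant with few-shot prompting, it helps to picture what such an auto-generated instruction does: it describes the expected output structure to the model in prose. The `formatting_instruction` helper below is a rough, hypothetical sketch of that concept, built from class annotations; TypeGPT's real wording and mechanism differ.

```python
# Hypothetical sketch of the kind of formatting instruction that
# disable_formatting_instructions=True suppresses. Illustrative only;
# not TypeGPT's actual generated text.

def formatting_instruction(output_cls: type) -> str:
    # One line per annotated field, e.g. "- name: str"
    lines = [
        f"- {name}: {getattr(tp, '__name__', str(tp))}"
        for name, tp in output_cls.__annotations__.items()
    ]
    return "Always return your answer with exactly these fields:\n" + "\n".join(lines)

class Output:
    name: str
    quantity: int

instruction = formatting_instruction(Output)
# instruction describes the "name" and "quantity" fields in prose
```

When your few-shot examples already demonstrate the output format turn by turn, an instruction like this adds tokens without adding information, which is why disabling it is safe in that case.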
- Support for the new `gpt-4-turbo-2024-04-09` / `gpt-4-turbo` model