Typegpt

Latest version: v0.3.1



0.3.1

0.3

- Support for easy and typed few-shot prompting. Just implement the `few_shot_examples` function inside your prompt:
```python
class ExamplePrompt(PromptTemplate):

    class Output(BaseLLMResponse):
        class Ingredient(BaseLLMResponse):
            name: str
            quantity: int

        ingredients: list[Ingredient]

    def system_prompt(self) -> str:
        return "Given a recipe, extract the ingredients."

    def few_shot_examples(self) -> list[FewShotExample[Output]]:
        return [
            FewShotExample(
                input="Let's take two apples, three bananas, and four oranges.",
                output=self.Output(ingredients=[
                    self.Output.Ingredient(name="apple", quantity=2),
                    self.Output.Ingredient(name="banana", quantity=3),
                    self.Output.Ingredient(name="orange", quantity=4),
                ])
            ),
            FewShotExample(
                input="My recipe requires five eggs and two cups of flour.",
                output=self.Output(ingredients=[
                    self.Output.Ingredient(name="egg", quantity=5),
                    self.Output.Ingredient(name="flour cups", quantity=2),
                ])
            )
        ]

    def user_prompt(self) -> str:
        ...
```


- You can disable the automatically added system instruction that tells the LLM how to format its output. Only do this if you use few-shot prompting, a fine-tuned model, or instruct the model yourself (not recommended). Use it like this:
```python
class ExamplePrompt(PromptTemplate):

    settings = PromptSettings(disable_formatting_instructions=True)

    ...
```

- Support for the new `gpt-4-turbo-2024-04-09` / `gpt-4-turbo` model
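
For context, a minimal sketch of calling one of the newly supported models with a typed prompt. The `TypeOpenAI` client, its `chat.completions.generate_output` method, and the `max_output_tokens` argument are assumed here from the library's usual usage pattern; they are not part of this changelog entry.

```python
from typegpt.openai import TypeOpenAI  # assumed import path

# `ExamplePrompt` is the few-shot recipe prompt defined in the 0.3 notes above.
prompt = ExamplePrompt()

client = TypeOpenAI(api_key="<your OpenAI API key>")

# Select the newly supported model by name; parsing the reply into
# `ExamplePrompt.Output` works the same as with previously supported models.
output = client.chat.completions.generate_output(
    model="gpt-4-turbo",
    prompt=prompt,
    max_output_tokens=500,
)
print(output.ingredients)
```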

0.2.2

Support for the new `gpt-3.5-turbo-0125` model version.

0.2.1

Fully supports the new `gpt-4-0125-preview` model and the new `gpt-4-turbo-preview` model alias.

0.2

You can now nest response types. Note that you need to use `BaseLLMArrayElement` for classes that you want to nest inside a list. To add instructions inside an element of `BaseLLMArrayElement`, you must use `LLMArrayElementOutput` instead of `LLMOutput`.

```python
class Output(BaseLLMResponse):

    class Item(BaseLLMArrayElement):

        class Description(BaseLLMResponse):
            short: str | None
            long: str

        title: str
        description: Description
        price: float = LLMArrayElementOutput(instruction=lambda pos: f"The price of the {pos.ordinal} item")

    items: list[Item]
    count: int
```
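
For comparison with `LLMArrayElementOutput` above, per-field instructions on a plain `BaseLLMResponse` keep using `LLMOutput`. A minimal sketch, assuming `LLMOutput` takes a fixed `instruction` string and that the import path below is correct:

```python
from typegpt import BaseLLMResponse, LLMOutput  # assumed import path

class Output(BaseLLMResponse):
    # Fields of a plain BaseLLMResponse use LLMOutput; unlike
    # LLMArrayElementOutput, the instruction is a fixed string rather than
    # a lambda that receives the element position.
    count: int = LLMOutput(instruction="The total number of items")
```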
