Support for new and cheaper gpt-4o-2024-08-06 model
Published by alexeichhorn 3 months ago
Fixed decoding issue
Published by alexeichhorn 3 months ago
Full support for gpt-4o-mini
Published by alexeichhorn 5 months ago
gpt-3.5-turbo model (now correctly 16K)
LLMException errors that are thrown contain valuable information inside the object, such as the compiled system and user prompt.
Published by alexeichhorn 6 months ago
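The error-inspection pattern described above can be sketched with a local stand-in class; the attribute names below are assumptions for illustration, not TypeGPT's documented API:

```python
# Stand-in sketch: an exception that carries the compiled prompts so callers
# can inspect them on failure. Class and field names are illustrative only,
# not TypeGPT's actual implementation.
class LLMException(Exception):
    def __init__(self, message: str, system_prompt: str, user_prompt: str):
        super().__init__(message)
        self.system_prompt = system_prompt
        self.user_prompt = user_prompt

try:
    raise LLMException(
        "decoding failed",
        system_prompt="Given a recipe, extract the ingredients.",
        user_prompt="Two apples",
    )
except LLMException as e:
    # The compiled prompts are available on the exception for debugging
    print(e.system_prompt)
    print(e.user_prompt)
```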
Support for openai library version >= 1.20.0
Published by alexeichhorn 6 months ago
You can add few-shot examples by implementing the few_shot_examples function inside your prompt:

```python
class ExamplePrompt(PromptTemplate):

    class Output(BaseLLMResponse):
        class Ingredient(BaseLLMResponse):
            name: str
            quantity: int

        ingredients: list[Ingredient]

    def system_prompt(self) -> str:
        return "Given a recipe, extract the ingredients."

    def few_shot_examples(self) -> list[FewShotExample[Output]]:
        return [
            FewShotExample(
                input="Let's take two apples, three bananas, and four oranges.",
                output=self.Output(ingredients=[
                    self.Output.Ingredient(name="apple", quantity=2),
                    self.Output.Ingredient(name="banana", quantity=3),
                    self.Output.Ingredient(name="orange", quantity=4),
                ])
            ),
            FewShotExample(
                input="My recipe requires five eggs and two cups of flour.",
                output=self.Output(ingredients=[
                    self.Output.Ingredient(name="egg", quantity=5),
                    self.Output.Ingredient(name="flour cups", quantity=2),
                ])
            )
        ]

    def user_prompt(self) -> str:
        ...
```
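As a minimal sketch of what few-shot examples amount to under the hood (a local stand-in, not TypeGPT's internals): each example is typically expanded into a user turn followed by an assistant turn containing the serialized structured output, so the model sees the desired input/output mapping before the real prompt.

```python
# Illustrative sketch only: shows how few-shot examples like the ones above
# are commonly turned into alternating user/assistant chat messages.
# `FewShotExample` here is a local stand-in, not TypeGPT's class.
import json
from dataclasses import dataclass

@dataclass
class FewShotExample:
    input: str
    output: dict  # structured output, serialized for the assistant turn

def examples_to_messages(examples: list[FewShotExample]) -> list[dict]:
    """Expand each example into a user turn and a matching assistant turn."""
    messages = []
    for ex in examples:
        messages.append({"role": "user", "content": ex.input})
        messages.append({"role": "assistant", "content": json.dumps(ex.output)})
    return messages

examples = [
    FewShotExample(
        input="Let's take two apples, three bananas, and four oranges.",
        output={"ingredients": [
            {"name": "apple", "quantity": 2},
            {"name": "banana", "quantity": 3},
            {"name": "orange", "quantity": 4},
        ]},
    )
]

messages = examples_to_messages(examples)
print(len(messages))  # → 2
print(messages[0]["role"], messages[1]["role"])  # → user assistant
```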
The automatic formatting instructions can be disabled via PromptSettings:

```python
class ExamplePrompt(PromptTemplate):
    settings = PromptSettings(disable_formatting_instructions=True)
    ...
```
gpt-4-turbo-2024-04-09 / gpt-4-turbo model
Published by alexeichhorn 9 months ago
Support for new gpt-3.5-turbo-0125 model version
Published by alexeichhorn 9 months ago
Fully supports the new gpt-4-0125-preview model and the new gpt-4-turbo-preview model alias.
Published by alexeichhorn 9 months ago
You can now nest response types. Note that you need to use BaseLLMArrayElement for classes that you want to nest inside a list. To add instructions inside an element of BaseLLMArrayElement, you must use LLMArrayElementOutput instead of LLMOutput.
```python
class Output(BaseLLMResponse):

    class Item(BaseLLMArrayElement):

        class Description(BaseLLMResponse):
            short: str | None
            long: str

        title: str
        description: Description
        price: float = LLMArrayElementOutput(instruction=lambda pos: f"The price of the {pos.ordinal} item")

    items: list[Item]
    count: int
```
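To illustrate the per-element instruction lambda above, here is a hypothetical sketch of how such an instruction can be expanded for each list position; `Position` and its `ordinal` property are local stand-ins, not TypeGPT internals:

```python
# Hypothetical sketch: expanding a per-element instruction such as
# `lambda pos: f"The price of the {pos.ordinal} item"` for each list position.
from dataclasses import dataclass

@dataclass
class Position:
    index: int  # 1-based position in the list

    @property
    def ordinal(self) -> str:
        # Minimal English ordinals, enough for illustration
        # (11-13 fall through to "th", 21 -> "21st", etc.)
        suffixes = {1: "st", 2: "nd", 3: "rd"}
        key = self.index if self.index < 20 else self.index % 10
        return f"{self.index}{suffixes.get(key, 'th')}"

instruction = lambda pos: f"The price of the {pos.ordinal} item"

for i in (1, 2, 3):
    print(instruction(Position(i)))  # → "The price of the 1st item", ...
```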
Published by alexeichhorn 11 months ago
Fixed issue where TypeOpenAI was not importable
Published by alexeichhorn 12 months ago
Release for legacy support for openai package < 1.0
Published by alexeichhorn 12 months ago