typegpt

Make GPT safe for production

MIT License

Downloads
465
Stars
14
Committers
1


typegpt - 0.3.5 Latest Release

Published by alexeichhorn 3 months ago

Support for the new, cheaper gpt-4o-2024-08-06 model

typegpt - 0.3.4

Published by alexeichhorn 3 months ago

Fixed a decoding issue

typegpt - 0.3.3

Published by alexeichhorn 3 months ago

Full support for gpt-4o-mini

typegpt - 0.3.2

Published by alexeichhorn 5 months ago

  • Support for gpt-4o (including its new tokenizer)
  • Fixed the maximum allowed tokens of the standard gpt-3.5-turbo model (now correctly 16K)
  • Thrown LLMExceptions now carry valuable debugging information, such as the compiled system and user prompts
typegpt - 0.3.1

Published by alexeichhorn 6 months ago

Support for openai library version >= 1.20.0

typegpt - 0.3

Published by alexeichhorn 6 months ago

  • Support for easy and typed few-shot prompting. Just implement the few_shot_examples function inside your prompt:
class ExamplePrompt(PromptTemplate):

    class Output(BaseLLMResponse):
        class Ingredient(BaseLLMResponse):
            name: str
            quantity: int

        ingredients: list[Ingredient]

    def system_prompt(self) -> str:
        return "Given a recipe, extract the ingredients."

    def few_shot_examples(self) -> list[FewShotExample[Output]]:
        return [
            FewShotExample(
                input="Let's take two apples, three bananas, and four oranges.",
                output=self.Output(ingredients=[
                    self.Output.Ingredient(name="apple", quantity=2),
                    self.Output.Ingredient(name="banana", quantity=3),
                    self.Output.Ingredient(name="orange", quantity=4),
                ])
            ),
            FewShotExample(
                input="My recipe requires five eggs and two cups of flour.",
                output=self.Output(ingredients=[
                    self.Output.Ingredient(name="egg", quantity=5),
                    self.Output.Ingredient(name="flour cups", quantity=2),
                ])
            )
        ]

    def user_prompt(self) -> str:
        ...
  • You can disable the automatically added system instruction that tells the LLM how to format its output. Only do this if you use few-shot prompting, a fine-tuned model, or instruct the model yourself (not recommended). Use it like this:
class ExamplePrompt(PromptTemplate):

    settings = PromptSettings(disable_formatting_instructions=True)

    ...
  • Support for new gpt-4-turbo-2024-04-09 / gpt-4-turbo model
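Conceptually, few-shot examples like the ones above are presented to the model as alternating user/assistant chat messages, so the model sees the desired output before answering the real prompt. A minimal sketch of that message-building pattern in plain Python (a hypothetical helper for illustration, not the library's internals):

```python
from dataclasses import dataclass


@dataclass
class FewShotExample:
    input: str
    output: str  # serialized expected response for this input


def build_messages(
    system_prompt: str,
    examples: list[FewShotExample],
    user_prompt: str,
) -> list[dict]:
    """Compile a system prompt, few-shot examples, and the actual user
    prompt into an OpenAI-style chat message list. Each example becomes
    a user/assistant pair demonstrating the expected output format."""
    messages = [{"role": "system", "content": system_prompt}]
    for ex in examples:
        messages.append({"role": "user", "content": ex.input})
        messages.append({"role": "assistant", "content": ex.output})
    messages.append({"role": "user", "content": user_prompt})
    return messages


msgs = build_messages(
    "Given a recipe, extract the ingredients.",
    [FewShotExample("Let's take two apples.", "INGREDIENTS:\n- apple (2)")],
    "My recipe requires five eggs.",
)
```

The actual serialization of the typed `Output` objects is handled by the library; this sketch only shows why each example contributes two messages.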
typegpt - 0.2.2

Published by alexeichhorn 9 months ago

Support for the new gpt-3.5-turbo-0125 model version

typegpt - 0.2.1

Published by alexeichhorn 9 months ago

Fully supports the new gpt-4-0125-preview model and the new gpt-4-turbo-preview model alias.

typegpt - 0.2

Published by alexeichhorn 9 months ago

You can now nest response types. Note that you need to use BaseLLMArrayElement for classes that you want to nest inside a list. To add instructions inside an element of BaseLLMArrayElement, you must use LLMArrayElementOutput instead of LLMOutput.

class Output(BaseLLMResponse):

    class Item(BaseLLMArrayElement):

        class Description(BaseLLMResponse):
            short: str | None
            long: str

        title: str
        description: Description
        price: float = LLMArrayElementOutput(instruction=lambda pos: f"The price of the {pos.ordinal} item")

    items: list[Item]
    count: int
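The nesting above maps naturally onto nested typed objects: a parsed response is an `Output` whose `items` list holds `Item` objects, each with its own `Description`. A plain-dataclass mirror of that schema (illustration only; the library's `BaseLLMResponse`/`BaseLLMArrayElement` classes add parsing and prompt-instruction logic on top):

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class Description:
    long: str
    short: Optional[str] = None  # optional fields may be absent in the response


@dataclass
class Item:
    title: str
    description: Description
    price: float


@dataclass
class Output:
    items: list[Item]
    count: int


# Constructing and accessing a nested result works like any nested object:
out = Output(
    items=[
        Item(
            title="Widget",
            description=Description(long="A sturdy widget"),
            price=9.99,
        )
    ],
    count=1,
)
```

Field access then follows the nesting, e.g. `out.items[0].description.long`.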
typegpt - Bug fix

Published by alexeichhorn 11 months ago

Fixed an issue where TypeOpenAI was not importable

typegpt - Initial release

Published by alexeichhorn 12 months ago

typegpt - 0.0.1

Published by alexeichhorn 12 months ago

Release with legacy support for openai package versions < 1.0

Package Rankings
Top 34.89% on Pypi.org