kani

kani (カニ) is a highly hackable microframework for chat-based language models with tool use/function calling. (NLP-OSS @ EMNLP 2023)

MIT License

Downloads: 1.3K
Stars: 556
Committers: 6

kani - v0.3.3

Published by zhudotexe about 1 year ago

Improvements

  • Added a warning in Kani.chat_round recommending Kani.full_round when AI functions are defined (see the sketch after this list)
  • Added examples in Google Colab
  • Other documentation improvements
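
A minimal sketch of the pattern the warning points at: a kani that defines an @ai_function should be driven with full_round so the model can actually call it. The engine, model name, and get_weather function below are placeholder assumptions, and OPENAI_API_KEY is assumed to be set.

```python
import asyncio
from typing import Annotated

from kani import AIParam, Kani, ai_function
from kani.engines.openai import OpenAIEngine


class WeatherKani(Kani):
    @ai_function()
    def get_weather(self, city: Annotated[str, AIParam(desc="The city to look up.")]):
        """Get the current weather in a city."""
        return f"It is sunny in {city}."  # stand-in for a real API call


async def main():
    ai = WeatherKani(OpenAIEngine(model="gpt-3.5-turbo"))
    # full_round yields every message generated this round, including function-calling turns
    async for msg in ai.full_round("What's the weather in Tokyo?"):
        print(msg.role, ":", msg.content)


asyncio.run(main())
```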

kani - v0.3.2

Published by zhudotexe about 1 year ago

Improvements

  • Made chat_in_terminal work in Google Colab, rather than requiring await chat_in_terminal_async
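
A minimal sketch of what this enables in a notebook cell (the engine and model name are placeholder assumptions; OPENAI_API_KEY is assumed to be set):

```python
from kani import Kani, chat_in_terminal
from kani.engines.openai import OpenAIEngine

ai = Kani(OpenAIEngine(model="gpt-3.5-turbo"))
# As of v0.3.2 this also works inside Colab's running event loop;
# previously, notebooks had to `await chat_in_terminal_async(ai)`.
chat_in_terminal(ai)
```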

kani - v0.3.1

Published by zhudotexe about 1 year ago

  • HuggingFace Engine: Fixed an issue where completion message lengths were overreported by an amount equal to the prompt length.
  • Other documentation improvements

kani - v0.3.0

Published by zhudotexe about 1 year ago

Improvements

  • Added Kani.add_to_history, a method that is called whenever kani adds a new message to the chat context (see the sketch after this list)
  • httpclient.BaseClient.request now returns a Response to aid low-level implementations
    • .get() and .post() are unchanged
  • Add additional documentation about GPU support for local models
  • Other documentation improvements
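
A minimal sketch of the new hook, assuming the override receives the newly added ChatMessage as shown (the logging destination is an arbitrary example):

```python
import logging

from kani import ChatMessage, Kani

log = logging.getLogger("chat")


class LoggingKani(Kani):
    async def add_to_history(self, message: ChatMessage):
        # called whenever kani appends a message to the chat context
        await super().add_to_history(message)  # keep the default bookkeeping
        log.info("%s: %s", message.role, message.content)
```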

kani - v0.2.0

Published by zhudotexe about 1 year ago

Improvements

  • Engines: Added Engine.function_token_reserve() to dynamically reserve a number of tokens for a function list
  • OpenAI: The OpenAIEngine now reads the OPENAI_API_KEY environment variable by default if no API key or client is specified (see the sketch after this list)
  • Documentation improvements (polymorphism, mixins, extension packages)
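
A minimal sketch of the new default, assuming OPENAI_API_KEY is exported in the environment (the model name is a placeholder):

```python
# export OPENAI_API_KEY="sk-..."  # read automatically; no api_key argument needed
from kani import Kani, chat_in_terminal
from kani.engines.openai import OpenAIEngine

engine = OpenAIEngine(model="gpt-3.5-turbo")
chat_in_terminal(Kani(engine))
```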

kani - v0.1.0

Published by zhudotexe about 1 year ago

BREAKING CHANGES

This should hopefully be the last set of breaking changes before v1.0. We're finalizing some of the attribute names for clarity ahead of publication.

  • Renamed Kani.always_include_messages to Kani.always_included_messages

Features & Improvements

  • @ai_function methods with synchronous signatures now run in a thread pool to avoid blocking the asyncio event loop (see the sketch after this list)
  • OpenAI: Added the ability to specify the API base and additional headers (e.g. for proxy APIs).
  • Various documentation improvements
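
A minimal sketch of a synchronous AI function that benefits from this change (the class, function, and its body are arbitrary examples):

```python
import time
from typing import Annotated

from kani import AIParam, Kani, ai_function


class SlowKani(Kani):
    @ai_function()
    def slow_lookup(self, query: Annotated[str, AIParam(desc="What to look up.")]):
        """Look something up in a slow external system."""
        time.sleep(5)  # blocking call; kani now offloads it to a thread pool
        return f"No results found for {query!r}."
```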

kani - v0.0.3

Published by zhudotexe about 1 year ago

BREAKING CHANGES

  • Renamed Kani.get_truncated_chat_history to Kani.get_prompt

Additions & Improvements

  • Added CTransformersEngine and LlamaCTransformersEngine (thanks @Maknee!)
  • Added a lower-level Kani.get_model_completion to make a prediction at the current chat state (without modifying the chat history)
  • Added the auto_truncate param to @ai_function to opt in to kani trimming function responses that are too long to fit in the model's context (see the sketch after this list)
  • Improved the internal handling of tokens when the chat history is directly modified
  • ChatMessage.[role]() classmethods now pass kwargs to the constructor
  • LLaMA: Improved the fidelity of non-strict-mode LLaMA prompting
  • OpenAI: Added support for specifying an OpenAI organization and configuring retry behavior
  • Many documentation improvements
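
A minimal sketch of the auto_truncate opt-in, assuming the parameter takes a token budget (the value and the read_article function are arbitrary examples):

```python
from typing import Annotated

from kani import AIParam, Kani, ai_function


class WikiKani(Kani):
    # opt in to kani trimming this function's response if it doesn't fit in context
    @ai_function(auto_truncate=1024)
    def read_article(self, title: Annotated[str, AIParam(desc="The article title.")]):
        """Return the full text of an article (potentially very long)."""
        return f"{title}: lorem ipsum... " * 2000  # stand-in for a long document
```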

Fixes

  • Fixed an issue where the OpenAI engine could underreport the length of messages with no content
  • Other minor fixes and improvements

kani - v0.0.2

Published by zhudotexe over 1 year ago

  • Added chat_in_terminal_async for async environments (e.g. Google Colab) (see the sketch after this list)
  • Added a quickstart Colab notebook
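
A minimal sketch of the async helper in a notebook cell (the engine, model name, and API key are placeholder assumptions):

```python
from kani import Kani, chat_in_terminal_async
from kani.engines.openai import OpenAIEngine

ai = Kani(OpenAIEngine("your-openai-api-key", model="gpt-3.5-turbo"))
await chat_in_terminal_async(ai)  # top-level await works in a Colab/Jupyter cell
```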

kani - v0.0.1

Published by zhudotexe over 1 year ago

Initial release!

Package Rankings: Top 9.2% on PyPI