The fastest JavaScript BPE tokenizer (encoder/decoder) for OpenAI's GPT-2 / GPT-3 / GPT-4 / GPT-4o. A port of OpenAI's tiktoken with additional features.
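To illustrate what a BPE tokenizer does under the hood, here is a minimal sketch of greedy byte-pair merging. The tiny merge table and function are purely illustrative; they are not the library's actual API or OpenAI's real merge data (real models ship tens of thousands of learned merges).

```javascript
// Hypothetical merge table: 'a b' -> rank; lower rank merges first.
const ranks = new Map([
  ['l l', 0],
  ['ll o', 1],
  ['h e', 2],
  ['he llo', 3],
]);

// Greedy BPE: start from single characters, repeatedly merge the
// adjacent pair with the best (lowest) rank until no merge applies.
function bpe(word) {
  let parts = [...word];
  while (parts.length > 1) {
    let best = -1, bestRank = Infinity;
    for (let i = 0; i < parts.length - 1; i++) {
      const r = ranks.get(parts[i] + ' ' + parts[i + 1]);
      if (r !== undefined && r < bestRank) { bestRank = r; best = i; }
    }
    if (best === -1) break; // no applicable merges left
    parts.splice(best, 2, parts[best] + parts[best + 1]);
  }
  return parts;
}

console.log(bpe('hello')); // with the toy table above: ['hello']
```

Each resulting part would then be mapped to an integer token id via a vocabulary lookup; decoding is the reverse mapping.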
📐 GPT token estimation and context size utilities without a full tokenizer
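Estimation utilities like this trade accuracy for size and speed by using character statistics instead of running full BPE. A minimal sketch of the idea, assuming the common ~4 characters-per-token rule of thumb for English text (the constant and function names are illustrative, not the library's API):

```javascript
// Rough GPT token count without a tokenizer (heuristic, not exact).
// ~4 characters per token is a common rule of thumb for English text.
function estimateTokens(text) {
  return Math.ceil(text.length / 4);
}

// Check whether a prompt fits a model's context window,
// optionally reserving room for the model's reply.
function fitsContext(text, contextSize, reservedForOutput = 0) {
  return estimateTokens(text) + reservedForOutput <= contextSize;
}

console.log(estimateTokens('Hello, world!')); // 13 chars → 4
```

Estimates like this can be off by a large margin for code, non-English text, or unusual whitespace, so they suit pre-flight checks rather than exact billing.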
A test suite comparing Node.js BPE tokenizers for use with AI models.
@msgpack/msgpack - MessagePack for JavaScript / msgpack.org (JavaScript/TypeScript/ECMA-262)
Fast Uint8Array to UTF-8 code point iterator for streams and array buffers, by @okikio & @jonathant...
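The core of such an iterator can be sketched with a generator that decodes UTF-8 sequences byte by byte. This is a simplified illustration (it assumes valid UTF-8 and has no error handling for truncated or malformed sequences), not the linked library's implementation:

```javascript
// Yield Unicode code points directly from UTF-8 bytes in a Uint8Array,
// without building an intermediate string (simplified: assumes valid UTF-8).
function* codePoints(bytes) {
  for (let i = 0; i < bytes.length; ) {
    const b = bytes[i];
    let cp, len;
    if (b < 0x80)      { cp = b;        len = 1; } // ASCII
    else if (b < 0xe0) { cp = b & 0x1f; len = 2; } // 2-byte sequence
    else if (b < 0xf0) { cp = b & 0x0f; len = 3; } // 3-byte sequence
    else               { cp = b & 0x07; len = 4; } // 4-byte sequence
    // Fold in the 6 payload bits of each continuation byte.
    for (let j = 1; j < len; j++) cp = (cp << 6) | (bytes[i + j] & 0x3f);
    yield cp;
    i += len;
  }
}

const bytes = new TextEncoder().encode('aé€🙂');
console.log([...codePoints(bytes)]); // [97, 233, 8364, 128578]
```

Operating on raw bytes this way avoids materializing a full string, which matters when decoding large buffers or network streams chunk by chunk.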