State-of-the-art Machine Learning for the web. Run 🤗 Transformers directly in your browser, with no need for a server!
APACHE-2.0 License
Published by xenova over 1 year ago
You can now perform feature extraction on models other than sentence-transformers! All you need to do is target a repo (and/or revision) that was exported with --task default. Also be sure to use the correct quantization for your use case!
Example: Run feature extraction with bert-base-uncased (without pooling/normalization).
let extractor = await pipeline('feature-extraction', 'Xenova/bert-base-uncased', { revision: 'default' });
let result = await extractor('This is a simple test.');
console.log(result);
// Tensor {
// type: 'float32',
// data: Float32Array [0.05939924716949463, 0.021655935794115067, ...],
// dims: [1, 8, 768]
// }
Example: Run feature extraction with bert-base-uncased (with pooling/normalization).
let extractor = await pipeline('feature-extraction', 'Xenova/bert-base-uncased', { revision: 'default' });
let result = await extractor('This is a simple test.', { pooling: 'mean', normalize: true });
console.log(result);
// Tensor {
// type: 'float32',
// data: Float32Array [0.03373778983950615, -0.010106077417731285, ...],
// dims: [1, 768]
// }
Example: Calculating embeddings with sentence-transformers models.
let extractor = await pipeline('feature-extraction', 'Xenova/all-MiniLM-L6-v2');
let result = await extractor('This is a simple test.', { pooling: 'mean', normalize: true });
console.log(result);
// Tensor {
// type: 'float32',
// data: Float32Array [0.09094982594251633, -0.014774246141314507, ...],
// dims: [1, 384]
// }
This also means you can do things like semantic search directly in JavaScript/TypeScript! Check out the Pinecone docs for an example app that uses Transformers.js!
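Once you have embeddings like the ones above, semantic search reduces to comparing vectors. A minimal sketch in plain JavaScript (no library needed; the function names here are illustrative, not part of the Transformers.js API). Note that with { pooling: 'mean', normalize: true } the vectors are unit-length, so cosine similarity is just the dot product:

```javascript
// Cosine similarity between two embedding vectors (plain arrays or Float32Arrays).
function cosineSimilarity(a, b) {
  let dot = 0, normA = 0, normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// Rank candidate embeddings against a query embedding, highest score first.
function rankBySimilarity(queryEmbedding, candidateEmbeddings) {
  return candidateEmbeddings
    .map((emb, index) => ({ index, score: cosineSimilarity(queryEmbedding, emb) }))
    .sort((x, y) => y.score - x.score);
}
```

To use this with the extractor above, run each sentence through the pipeline, keep the result.data arrays, and pass them to rankBySimilarity.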
We now have 109 models to choose from! Check them out at https://huggingface.co/models?other=transformers.js! If you'd like to contribute models (exported with Optimum), you can tag them with library_name: "transformers.js"! Let's make ML more web-friendly!
Full Changelog: https://github.com/xenova/transformers.js/compare/2.0.2...2.1.0
Published by xenova over 1 year ago
Fixes issues stemming from ORT's recent release of a buggy version 1.15.0 (https://www.npmjs.com/package/onnxruntime-web).
Also freezes examples and updates links to use the latest stable wasm files.
Published by xenova over 1 year ago
It's finally here! 🔥
Run Hugging Face transformers directly in your browser, with no need for a server!
GitHub: https://github.com/xenova/transformers.js
Demo site: https://xenova.github.io/transformers.js/
Documentation: https://huggingface.co/docs/transformers.js
🛠️ Complete ES6 rewrite
📚 Documentation and examples
🤗 Improved Hugging Face Hub integration
🖥️ Server-side model caching (in Node.js)
🧪 Improved testing framework w/ Jest
⚙️ CI/CD with GitHub Actions
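The server-side cache can be pointed at a directory of your choosing through the library's env settings. A minimal configuration sketch (the cache path './.model-cache' is an arbitrary example, not a default):

```javascript
import { env, pipeline } from '@xenova/transformers';

// Cache downloaded model files on disk in Node.js, so subsequent runs
// load from the local cache instead of re-downloading from the Hub.
env.cacheDir = './.model-cache';

const classifier = await pipeline('sentiment-analysis');
const output = await classifier('Transformers.js makes ML web-friendly!');
console.log(output);
```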
Published by xenova over 1 year ago
Same as https://github.com/xenova/transformers.js/releases/tag/2.0.0-alpha.3, with various improvements.
Published by xenova over 1 year ago
Same as https://github.com/xenova/transformers.js/releases/tag/2.0.0-alpha.2, but with an added allowLocalModels setting and improved handling of errors (e.g., CORS errors).
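The allowLocalModels setting is toggled through the same env object. A brief configuration sketch showing how you might disable local model lookups so files are always fetched from the Hugging Face Hub (useful for avoiding spurious 404/CORS errors in the browser when no local copies exist):

```javascript
import { env, pipeline } from '@xenova/transformers';

// Skip checking for locally-hosted model files; always fetch from the Hub.
env.allowLocalModels = false;

const extractor = await pipeline('feature-extraction', 'Xenova/all-MiniLM-L6-v2');
```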
Published by xenova over 1 year ago
Same as https://github.com/xenova/transformers.js/releases/tag/2.0.0-alpha.1 but with updated jsdelivr entry point in package.json
Published by xenova over 1 year ago
Same as https://github.com/xenova/transformers.js/releases/tag/2.0.0-alpha.0 but with CDN-specific entry points in package.json
Published by xenova over 1 year ago
pipeline, AutoModel, AutoTokenizer, and AutoProcessor.