πŸ’« πŸ€– spaCy Curated Transformers

This package provides spaCy components and architectures for using a curated set of transformer models in spaCy via the curated-transformers library.

Features

  • Use pretrained models based on one of the following architectures to
    power your spaCy pipeline (see the usage sketch after this list):
    • ALBERT
    • BERT
    • CamemBERT
    • RoBERTa
    • XLM-RoBERTa
  • All the features supported by spacy-transformers, such as Hugging Face
    Hub support, multi-task learning, the extensible config system, and
    out-of-the-box serialization
  • Deep integration into spaCy, which lays the groundwork for deployment-focused features
    such as distillation and quantization
  • Minimal dependencies
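
For a rough idea of how the component fits into a pipeline, the sketch below adds it to a blank English pipeline in Python. It assumes the curated_transformer factory name registered by this package and a usable default model configuration; in practice the architecture, piece encoder, and pretrained weights are configured in the training config, as in the example project.

import spacy

# Minimal sketch: add the curated transformer component to a blank pipeline.
# Assumes the "curated_transformer" factory registered by this package; the
# model architecture and pretrained weights are normally set in the training
# config rather than here.
nlp = spacy.blank("en")
nlp.add_pipe("curated_transformer")
print(nlp.pipe_names)  # ['curated_transformer']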

⏳ Installation

Installing the package with pip will automatically install all of its dependencies.

pip install spacy-curated-transformers

πŸš€ Quickstart

An example project is provided in the project directory.
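
As a rough sketch of what using a trained pipeline looks like: the path below is a placeholder for a pipeline produced by the example project, and doc._.trf_data is assumed to be the extension attribute where the component stores the transformer output.

import spacy

# Sketch only: "training/model-best" is a placeholder for a pipeline trained
# with the example project; it is not shipped with this package.
nlp = spacy.load("training/model-best")
doc = nlp("spaCy Curated Transformers supports ALBERT, BERT and XLM-RoBERTa.")

# The transformer component stores its output on the Doc; downstream
# components such as the tagger or NER consume it via listener layers.
print(doc._.trf_data)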

πŸ“– Documentation

Bug reports and other issues

Please use spaCy's issue tracker to report a bug, or open a new thread on the discussion board for any other issue.
