Fast-Transformer

An implementation of Fastformer: Additive Attention Can Be All You Need, a Transformer variant, in TensorFlow.
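The core idea of Fastformer is to replace pairwise self-attention with additive attention: the queries are condensed into a single global query vector, which then interacts element-wise with the keys and values. The sketch below is an illustrative single-head NumPy version of that mechanism, not this package's API; the names `additive_attention`, `w_q`, and `w_k` are placeholders for the learned parameters, and the final per-position linear transform from the paper is omitted for brevity.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def additive_attention(Q, K, V, w_q, w_k):
    """Single-head Fastformer-style additive attention (illustrative sketch).

    Q, K, V: (n, d) arrays of queries, keys, values.
    w_q, w_k: (d,) learned projection vectors (hypothetical names).
    """
    d = Q.shape[-1]
    # Condense all queries into one global query via additive attention.
    alpha = softmax(Q @ w_q / np.sqrt(d))   # (n,) position weights
    q_global = alpha @ Q                    # (d,) global query
    # Mix the global query into each key element-wise, then condense.
    P = q_global * K                        # (n, d)
    beta = softmax(P @ w_k / np.sqrt(d))    # (n,) position weights
    k_global = beta @ P                     # (d,) global key
    # Mix the global key into each value; residual connection to queries.
    return k_global * V + Q                 # (n, d)

# Example: sequence of 4 tokens, model dimension 8.
rng = np.random.default_rng(0)
Q, K, V = (rng.standard_normal((4, 8)) for _ in range(3))
w_q, w_k = rng.standard_normal(8), rng.standard_normal(8)
out = additive_attention(Q, K, V, w_q, w_k)
print(out.shape)  # (4, 8)
```

Because each token interacts only with the two global vectors, the cost is linear in sequence length rather than quadratic as in standard self-attention.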

Apache-2.0 License

Downloads: 65
Stars: 149
Committers: 1

Commit Statistics

Total Commits: 57
Total Committers: 1
Avg. Commits Per Committer: 57.0
Bot Commits: 0

Issue Statistics

Total Pull Requests: 0 (past year), 7 (all time)
Merged Pull Requests: 0 (past year), 7 (all time)
Total Issues: 0 (past year), 4 (all time)
Time to Close Issues: N/A (past year), about 14 hours (all time)
Package Rankings: Top 15.18% on PyPI.org