TUPE

Transformer with Untied Positional Encoding (TUPE). Code for the paper "Rethinking Positional Encoding in Language Pre-training". TUPE can improve existing models such as BERT.
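In TUPE, word and positional embeddings are not summed at the input; instead, the attention score is the sum of a word-to-word correlation and a position-to-position correlation, each computed with its own projection matrices. Below is a minimal single-head PyTorch sketch of this untied attention score; the class and parameter names (`UntiedAttentionScores`, `u_q`, `u_k`, `pos_emb`) are illustrative, not the repository's actual API:

```python
import math
import torch
import torch.nn as nn

class UntiedAttentionScores(nn.Module):
    """Minimal sketch of TUPE-style untied attention scores.

    Word-to-word and position-to-position correlations are computed
    with separate projections and summed; names and shapes here are
    illustrative, not the repository's actual implementation.
    """

    def __init__(self, d_model: int, max_len: int = 512):
        super().__init__()
        self.d_model = d_model
        # Projections for the word (contextual) correlation.
        self.w_q = nn.Linear(d_model, d_model)
        self.w_k = nn.Linear(d_model, d_model)
        # Separate projections for the positional correlation (untied).
        self.u_q = nn.Linear(d_model, d_model)
        self.u_k = nn.Linear(d_model, d_model)
        # Absolute positional embeddings, kept out of the input sum.
        self.pos_emb = nn.Embedding(max_len, d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, d_model) word embeddings, no positions added.
        batch, seq_len, _ = x.shape
        positions = torch.arange(seq_len, device=x.device)
        p = self.pos_emb(positions)  # (seq_len, d_model)

        # Word-to-word correlation: (batch, seq_len, seq_len).
        word_scores = self.w_q(x) @ self.w_k(x).transpose(-2, -1)
        # Position-to-position correlation, shared across the batch.
        pos_scores = self.u_q(p) @ self.u_k(p).transpose(-2, -1)

        # Each term is scaled by sqrt(2 * d) so their sum keeps unit variance.
        scale = math.sqrt(2 * self.d_model)
        return (word_scores + pos_scores.unsqueeze(0)) / scale
```

The 1/sqrt(2d) scaling (rather than the usual 1/sqrt(d)) follows the paper's normalization for the sum of the two correlation terms; the full method also unties the [CLS] token from positions, which this sketch omits.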

MIT License

Stars: 249

Issue Statistics

                       Past Year   All Time
Total Pull Requests    0           3
Merged Pull Requests   0           2
Total Issues           0           19
Time to Close Issues   N/A         5 days