compositional-attention-pytorch

Implementation of Compositional Attention from MILA, a multi-head attention variant that is reframed as a two-step attention process with disentangled search and retrieval heads, in PyTorch.
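
The description above summarizes the two-step mechanism from the paper: "search" heads decide where to attend, while "retrieval" heads decide what to read out, and each search softly selects which retrieval to pair with. Below is a minimal PyTorch sketch of that search-then-retrieve scheme, for illustration only; it is not the package's actual API, and the class name and parameters (CompositionalAttentionSketch, num_searches, num_retrievals) are assumptions.

```python
# Illustrative sketch of compositional attention (search + retrieval),
# NOT the package's real API. Names and hyperparameters are assumptions.
import torch
import torch.nn as nn


class CompositionalAttentionSketch(nn.Module):
    def __init__(self, dim, dim_head=64, num_searches=8, num_retrievals=2):
        super().__init__()
        self.num_searches = num_searches
        self.num_retrievals = num_retrievals
        self.dim_head = dim_head
        self.scale = dim_head ** -0.5

        # step 1: "search" heads produce attention patterns (queries and keys)
        self.to_q = nn.Linear(dim, dim_head * num_searches, bias=False)
        self.to_k = nn.Linear(dim, dim_head * num_searches, bias=False)
        # "retrieval" heads produce values, shared across all searches
        self.to_v = nn.Linear(dim, dim_head * num_retrievals, bias=False)

        # step 2: each search softly selects which retrieval to use
        self.to_retrieval_q = nn.Linear(dim, dim_head * num_searches, bias=False)
        self.to_retrieval_k = nn.Linear(dim_head, dim_head, bias=False)

        self.to_out = nn.Linear(dim_head * num_searches, dim)

    def forward(self, x):
        b, n, _ = x.shape
        s, r, d = self.num_searches, self.num_retrievals, self.dim_head

        q = self.to_q(x).view(b, n, s, d).transpose(1, 2)   # (b, s, n, d)
        k = self.to_k(x).view(b, n, s, d).transpose(1, 2)   # (b, s, n, d)
        v = self.to_v(x).view(b, n, r, d).transpose(1, 2)   # (b, r, n, d)

        # step 1: search - one attention matrix per search head
        attn = (q @ k.transpose(-1, -2) * self.scale).softmax(dim=-1)  # (b, s, n, n)

        # apply every retrieval head to every search pattern: (b, s, r, n, d)
        retrieved = torch.einsum('bsij,brjd->bsrid', attn, v)

        # step 2: retrieval selection - each search attends over the r retrievals
        rq = self.to_retrieval_q(x).view(b, n, s, d).transpose(1, 2)  # (b, s, n, d)
        rk = self.to_retrieval_k(retrieved)                           # (b, s, r, n, d)
        sel = torch.einsum('bsid,bsrid->bsri', rq, rk) * self.scale
        sel = sel.softmax(dim=2)                                      # softmax over retrievals

        out = torch.einsum('bsri,bsrid->bsid', sel, retrieved)        # (b, s, n, d)
        out = out.transpose(1, 2).reshape(b, n, s * d)
        return self.to_out(out)


if __name__ == '__main__':
    attn = CompositionalAttentionSketch(dim=512, num_searches=8, num_retrievals=2)
    tokens = torch.randn(1, 128, 512)
    print(attn(tokens).shape)  # torch.Size([1, 128, 512])
```

The key design point, as framed in the paper, is that the number of attention patterns (searches) and the number of value transformations (retrievals) are decoupled, so they can be varied independently rather than being tied together as in standard multi-head attention.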

MIT License

Downloads: 35
Stars: 50
Committers: 1

Commit Statistics

                              Past Year   All Time
Total Commits                 0           8
Total Committers              0           1
Avg. Commits Per Committer    0.0         8.0
Bot Commits                   0           0

Issue Statistics

                              Past Year   All Time
Total Pull Requests           0           0
Merged Pull Requests          0           0
Total Issues                  0           0
Time to Close Issues          N/A         N/A
Package Rankings
Top 23.64% on PyPI