torchdistill

A coding-free framework built on PyTorch for reproducible deep learning studies. 🏆 25 knowledge distillation methods presented at CVPR, ICLR, ECCV, NeurIPS, ICCV, etc. are implemented so far. 🎁 Trained models, training logs, and configurations are available to ensure reproducibility and enable benchmarking.
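
Knowledge distillation, the technique the listed methods implement, trains a compact student model to match both the ground-truth labels and the softened output distribution of a larger teacher model. The sketch below is a minimal, plain-PyTorch illustration of the classic soft-target loss (Hinton et al.); it does not use torchdistill's own API or configuration files, and the models and data are placeholders chosen purely for illustration.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, targets, temperature=4.0, alpha=0.5):
    """Weighted sum of the distillation term (KL divergence between
    temperature-softened teacher and student distributions) and the
    standard cross-entropy against the hard labels."""
    soft_loss = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=1),
        F.softmax(teacher_logits / temperature, dim=1),
        reduction="batchmean",
    ) * (temperature ** 2)
    hard_loss = F.cross_entropy(student_logits, targets)
    return alpha * soft_loss + (1.0 - alpha) * hard_loss

# Placeholder teacher/student models and synthetic data, for illustration only.
teacher = nn.Linear(32, 10)
student = nn.Linear(32, 10)
optimizer = torch.optim.SGD(student.parameters(), lr=0.1)

inputs = torch.randn(8, 32)
targets = torch.randint(0, 10, (8,))

teacher.eval()
with torch.no_grad():
    teacher_logits = teacher(inputs)  # teacher is frozen during distillation

student_logits = student(inputs)
loss = kd_loss(student_logits, teacher_logits, targets)
optimizer.zero_grad()
loss.backward()
optimizer.step()
```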

MIT License

Downloads: 1.6K
Stars: 1.4K
Committers: 3

Commit Statistics

                             Past Year   All Time
Total Commits                250         1,124
Total Committers             1           5
Avg. Commits Per Committer   250.0       224.8
Bot Commits                  0           0

Issue Statistics

                             Past Year   All Time
Total Pull Requests          63          175
Merged Pull Requests         62          173
Total Issues                 10          21
Time to Close Issues         2 days      about 24 hours