Alpaca-CoT

We unified the interfaces of instruction-tuning data (e.g., CoT data), multiple LLMs, and parameter-efficient fine-tuning methods (e.g., LoRA, P-Tuning) for easy use, building a fine-tuning platform that makes it easy for researchers to get started with and use large models. We welcome open-source enthusiasts to open any meaningful PR on this repo and to integrate as many LLM-related technologies as possible.
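The parameter-efficient methods mentioned above all avoid updating the full weight matrices of a large model. As a rough illustration of the idea behind one of them, LoRA, here is a minimal NumPy sketch (not this repo's actual implementation): a frozen weight `W` is augmented with a trainable low-rank product `B @ A`, so only `r * (d_out + d_in)` parameters are trained instead of `d_out * d_in`.

```python
import numpy as np

# LoRA in a nutshell: instead of fine-tuning a full weight matrix
# W (d_out x d_in), train two small matrices B (d_out x r) and
# A (r x d_in) with r << min(d_out, d_in), and compute
# y = x @ (W + (alpha / r) * B @ A)^T.
rng = np.random.default_rng(0)
d_out, d_in, r, alpha = 64, 64, 4, 8

W = rng.standard_normal((d_out, d_in))      # frozen pretrained weight
A = rng.standard_normal((r, d_in)) * 0.01   # trainable down-projection
B = np.zeros((d_out, r))                    # trainable up-projection, zero-initialized

def lora_forward(x):
    # x: (batch, d_in) -> (batch, d_out); the low-rank term is the only trained part
    return x @ (W + (alpha / r) * (B @ A)).T

x = rng.standard_normal((2, d_in))
base = x @ W.T
# With B initialized to zero, the adapted model starts out identical to the base model.
assert np.allclose(lora_forward(x), base)

# Trainable parameter count drops from d_out*d_in to r*(d_out + d_in).
full_params, lora_params = d_out * d_in, r * (d_out + d_in)
print(full_params, lora_params)  # → 4096 512
```

In a real setup the adapter matrices are trained with gradient descent while `W` stays frozen; libraries wrapped by this repo handle that wiring for each supported LLM.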

Apache-2.0 License

Stars: 2.6K
Committers: 31

Commit Statistics

|                            | Past Year | All Time |
|----------------------------|-----------|----------|
| Total Commits              | 20        | 353      |
| Total Committers           | 13        | 34       |
| Avg. Commits Per Committer | 1.54      | 10.38    |
| Bot Commits                | 0         | 0        |

Issue Statistics

|                      | Past Year | All Time |
|----------------------|-----------|----------|
| Total Pull Requests  | 13        | 65       |
| Merged Pull Requests | 13        | 64       |
| Total Issues         | 7         | 43       |
| Time to Close Issues | 7 days    | 10 days  |