RWKV is an RNN with transformer-level LLM performance. It can be trained directly like a GPT (parallelizable). So it combines the best of RNNs and transformers: great performance, fast inference, low VRAM usage, fast training, "infinite" ctx_len, and free sentence embeddings.
Apache-2.0 License
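The key to running as an RNN is that the attention-like WKV aggregation folds into a fixed-size running state, so per-token inference cost is constant in sequence length. Below is a minimal, illustrative sketch of such a recurrence in a numerically stable form (names, shapes, and the exact update rule are assumptions for exposition, not the repo's actual kernel):

```python
import numpy as np

def wkv_step(k, v, w, u, state):
    """One recurrent step of a simplified WKV aggregation.

    k, v : per-channel key/value vectors for the current token
    w    : per-channel decay (positive; larger = faster forgetting)
    u    : per-channel bonus applied only to the current token
    state: (num, den, m) running numerator/denominator plus a max-exponent
           term, tracked for numerical stability
    """
    num, den, m = state
    # Output mixes the accumulated past with the current token's bonus term.
    m_out = np.maximum(m, u + k)
    e_past = np.exp(m - m_out)
    e_now = np.exp(u + k - m_out)
    out = (e_past * num + e_now * v) / (e_past * den + e_now)
    # Update the state: decay the past, then fold in the current token.
    m_new = np.maximum(m - w, k)
    num = np.exp(m - w - m_new) * num + np.exp(k - m_new) * v
    den = np.exp(m - w - m_new) * den + np.exp(k - m_new)
    return out, (num, den, m_new)

# Toy usage: 4 channels, a few random tokens.
d = 4
rng = np.random.default_rng(0)
w, u = np.full(d, 0.5), np.zeros(d)
state = (np.zeros(d), np.zeros(d), np.full(d, -1e38))
for _ in range(3):
    out, state = wkv_step(rng.normal(size=d), rng.normal(size=d), w, u, state)
print(out.shape)  # (4,) -- state stays constant-size regardless of history
```

Because the state never grows with history, inference context is limited only by how much information the decay preserves, which is the sense behind the "infinite" ctx_len claim.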
Stable release for RWKV v5
Published by BlinkDL almost 2 years ago
Just a stable release.
Published by BlinkDL over 2 years ago
Attached model: ctx1024-layer6-emb512 on enwik8 with 1.65 dev perplexity (0.72 BPC)
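The two numbers are consistent under the usual relation between bits per character and per-character perplexity, ppl = 2^BPC (assuming the quoted perplexity is measured per character):

```python
# 2 ** BPC gives per-character perplexity: 2 ** 0.72 ≈ 1.647
print(2 ** 0.72)
```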
Published by BlinkDL about 3 years ago
v0.02 with RWKV, MHA_shift, MHA_rotary, MHA_pro, and time-shift mixing.
Published by BlinkDL about 3 years ago
First release with RWKV, MHA_rotary, MHA_pro, and time-shift mixing.
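The time-shift mixing referenced in these early releases blends each token's features with the previous token's, per channel. A minimal PyTorch sketch under assumed names (the `time_mix` parameter and its sigmoid parameterization are illustrative, not the repo's exact code):

```python
import torch
import torch.nn as nn

class TimeShiftMix(nn.Module):
    """Mix each position's features with the previous position's,
    per channel, via a learned interpolation ratio."""
    def __init__(self, n_embd: int):
        super().__init__()
        # Learned per-channel mixing ratio, squashed into (0, 1) by sigmoid.
        self.time_mix = nn.Parameter(torch.zeros(1, 1, n_embd))
        # Pad one zero row at the front of the sequence dim, crop the last row.
        self.shift = nn.ZeroPad2d((0, 0, 1, -1))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, n_embd)
        x_prev = self.shift(x)  # each position now sees the previous token
        mix = torch.sigmoid(self.time_mix)
        return x * mix + x_prev * (1 - mix)

x = torch.randn(2, 5, 8)
print(TimeShiftMix(8)(x).shape)  # torch.Size([2, 5, 8])
```

The `ZeroPad2d((0, 0, 1, -1))` trick shifts the whole sequence right by one step in a single parallel operation, which is what lets the mixing run GPT-style over a full training batch rather than token by token.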