RWKV-LM

RWKV is an RNN with transformer-level LLM performance. It can be trained directly like a GPT (parallelizable), so it combines the best of RNN and transformer: great performance, fast inference, VRAM savings, fast training, "infinite" ctx_len, and free sentence embeddings.
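
As a rough illustration of why both claims can hold at once, here is a minimal NumPy sketch of an RWKV-v4-style WKV time-mixing recurrence run in RNN mode. This is an assumption-laden sketch, not the repo's actual CUDA kernel: it omits the numerical stabilization a real implementation needs, and the names `k`, `v`, `w`, `u` simply follow RWKV's usual notation. The same quantity can be computed in parallel over the time axis during training.

```python
import numpy as np

def wkv_recurrent(k, v, w, u):
    """Sequential (RNN-mode) sketch of a WKV time-mixing recurrence.

    k, v : (T, C) arrays of per-token keys and values.
    w    : (C,) per-channel decay rate (larger w = faster forgetting).
    u    : (C,) per-channel "bonus" applied only to the current token.

    Naive form for clarity; the real training kernel computes the same
    quantity in parallel over T and with numerical stabilization.
    """
    T, C = k.shape
    num = np.zeros(C)   # running sum of exp(k_i) * v_i, decayed by exp(-w)
    den = np.zeros(C)   # matching running sum of exp(k_i)
    out = np.empty((T, C))
    for t in range(T):
        # the current token enters with an extra bonus weight exp(u + k_t)
        cur = np.exp(u + k[t])
        out[t] = (num + cur * v[t]) / (den + cur)
        # decay the state, then absorb the current token into it
        num = np.exp(-w) * num + np.exp(k[t]) * v[t]
        den = np.exp(-w) * den + np.exp(k[t])
    return out

# Tiny demo with random inputs (shapes only; weights are untrained).
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    T, C = 8, 4
    y = wkv_recurrent(rng.normal(size=(T, C)), rng.normal(size=(T, C)),
                      w=np.ones(C), u=np.zeros(C))
    print(y.shape)  # (8, 4)
```

Because the recurrent state is just the pair (num, den), per-token inference cost is O(C) regardless of how much context has been consumed, which is where the "infinite" ctx_len and VRAM-savings claims come from.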

Apache-2.0 License

Stars: 12.5K · Committers: 5


RWKV-LM - RWKV-v5 (Latest Release)

Published by BlinkDL 10 months ago

Stable release for RWKV-v5.

RWKV-LM - RWKV-v4neo

Published by BlinkDL almost 2 years ago

Just a stable release.

RWKV-LM - RWKV v2 - RNN with Transformer Performance

Published by BlinkDL over 2 years ago

Attached model: ctx1024-layer6-emb512 on enwik8, with 1.65 dev perplexity (0.72 BPC)
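
For reference (a standard identity, not something stated in the release note itself): bits per character is the base-2 log of per-character perplexity, so the two figures above are consistent:

```latex
\mathrm{BPC} = \log_2(\text{perplexity}) = \log_2(1.65) \approx 0.72
```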

RWKV-LM - 0.02

Published by BlinkDL about 3 years ago

v0.02 with RWKV, MHA_shift, MHA_rotary, MHA_pro, and time-shift mixing.

RWKV-LM - 0.01

Published by BlinkDL about 3 years ago

First release with RWKV, MHA_rotary, MHA_pro, and time-shift mixing.