Local Attention - Flax module for JAX
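For orientation, here is a minimal sketch of windowed (local) attention in plain `jax.numpy`, assuming non-overlapping windows and a sequence length divisible by the window size; the function and argument names are illustrative, not the repo's API.

```python
import jax
import jax.numpy as jnp

def local_attention(q, k, v, window_size):
    """Windowed self-attention: each position attends only within its block.

    q, k, v: (seq_len, dim); seq_len must be divisible by window_size.
    """
    seq_len, dim = q.shape
    n = seq_len // window_size
    # Reshape into (num_windows, window_size, dim) so attention is block-diagonal.
    qw, kw, vw = (x.reshape(n, window_size, dim) for x in (q, k, v))
    scores = jnp.einsum('wid,wjd->wij', qw, kw) / jnp.sqrt(dim)
    attn = jax.nn.softmax(scores, axis=-1)
    out = jnp.einsum('wij,wjd->wid', attn, vw)
    return out.reshape(seq_len, dim)

# Example: 128 tokens of width 64, attending within windows of 32.
q = k = v = jax.random.normal(jax.random.PRNGKey(0), (128, 64))
out = local_attention(q, k, v, window_size=32)
```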
A PyTorch implementation of the Attention on Attention module (both self and guided variants), for Visual Question Answering
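For a rough idea of the AoA mechanism in JAX terms: standard attention is followed by an "information" projection and a sigmoid gate, both conditioned on the query and the attended result. The weight names below (`w_i`, `w_g`) are hypothetical, and biases are omitted for brevity.

```python
import jax
import jax.numpy as jnp

def attention_on_attention(q, k, v, w_i, w_g):
    """Attention-on-Attention: gate the attended values with the query.

    q: (n, d) queries; k, v: (m, d) keys/values.
    w_i, w_g: (2*d, d) projections for the information and gate branches.
    """
    d = q.shape[-1]
    attn = jax.nn.softmax(q @ k.T / jnp.sqrt(d), axis=-1)
    v_hat = attn @ v                           # standard attention result
    qv = jnp.concatenate([q, v_hat], axis=-1)  # condition on query and result
    i = qv @ w_i                               # "information" vector
    g = jax.nn.sigmoid(qv @ w_g)               # attention gate
    return g * i

key = jax.random.PRNGKey(0)
q = jax.random.normal(key, (16, 64))
k = v = jax.random.normal(key, (32, 64))
w_i, w_g = (jax.random.normal(key, (128, 64)) * 0.02 for _ in range(2))
out = attention_on_attention(q, k, v, w_i, w_g)   # (16, 64)
```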
GPT implementation in Flax
JAX and Flax examples (image classification, semantic segmentation, and more...)
Implementation of N-Grammer in Flax
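The core trick in N-Grammer is to derive n-gram ids from neighboring token ids and embed them from a hashed table. A simplified sketch follows (the paper uses product-quantized latent ids plus layer norm; this version hashes raw bigrams, and all names are illustrative):

```python
import jax
import jax.numpy as jnp

def bigram_embeddings(token_ids, table, vocab_size):
    """Hash consecutive token ids to a bigram id and embed it (N-Grammer style).

    token_ids: (seq_len,) int ids; table: (num_buckets, dim) bigram table.
    """
    num_buckets = table.shape[0]
    prev = jnp.pad(token_ids[:-1], (1, 0))   # shift right; position 0 has no left context
    bigram_ids = (prev * vocab_size + token_ids) % num_buckets  # cheap multiplicative hash
    # Result is combined with the unigram embeddings downstream.
    return table[bigram_ids]                 # (seq_len, dim)

ids = jnp.array([5, 17, 17, 3])
table = jax.random.normal(jax.random.PRNGKey(0), (1024, 32))
out = bigram_embeddings(ids, table, vocab_size=100)   # (4, 32)
```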
A simple cross attention that updates both the source and target in one step
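One way to realize this (following the bidirectional cross-attention idea) is to build a single similarity matrix and normalize it along each axis, so the source attends over the target and vice versa in the same step. A hedged `jax.numpy` sketch, not the repo's actual API:

```python
import jax
import jax.numpy as jnp

def bidirectional_cross_attention(src, tgt):
    """One shared similarity matrix, normalized along each axis, updates both.

    src: (n, d), tgt: (m, d).
    """
    d = src.shape[-1]
    sim = src @ tgt.T / jnp.sqrt(d)          # (n, m), shared by both directions
    src_attn = jax.nn.softmax(sim, axis=-1)  # source tokens attend over target
    tgt_attn = jax.nn.softmax(sim, axis=0)   # target tokens attend over source
    new_src = src_attn @ tgt                 # (n, d)
    new_tgt = tgt_attn.T @ src               # (m, d)
    return new_src, new_tgt

key = jax.random.PRNGKey(0)
src = jax.random.normal(key, (10, 64))
tgt = jax.random.normal(key, (20, 64))
new_src, new_tgt = bidirectional_cross_attention(src, tgt)
```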
A GPT, made only of MLPs, in JAX
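An MLP-only GPT typically replaces attention with a spatial gating unit in the style of gMLP: split the channels, mix one half across sequence positions with a learned matrix, and use it to gate the other half. A minimal sketch, with learned layer-norm parameters omitted:

```python
import jax
import jax.numpy as jnp

def spatial_gating_unit(x, w_spatial, b_spatial):
    """Core of an MLP-only block: split channels, mix one half across
    positions, and gate the other half with it.

    x: (seq_len, 2*d); w_spatial: (seq_len, seq_len) token-mixing weights.
    For an autoregressive GPT, w_spatial would be masked lower-triangular.
    """
    u, v = jnp.split(x, 2, axis=-1)
    # Layer norm on the gating half (no learned scale/shift, for brevity).
    v = (v - v.mean(-1, keepdims=True)) / jnp.sqrt(v.var(-1, keepdims=True) + 1e-5)
    v = w_spatial @ v + b_spatial            # mix information across positions
    return u * v                             # multiplicative gate replaces attention

seq_len, d = 16, 32
x = jax.random.normal(jax.random.PRNGKey(0), (seq_len, 2 * d))
w = jnp.eye(seq_len)                         # the paper initializes near identity
b = jnp.ones((seq_len, 1))
out = spatial_gating_unit(x, w, b)           # (seq_len, d)
```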
Implementation of Flash Attention in JAX
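The heart of Flash Attention is an online softmax: keys and values are processed in chunks while running maximum, denominator, and numerator statistics are rescaled, so the full score matrix is never materialized. A simplified non-causal, single-head sketch (the real kernel also tiles the queries and fuses everything on-chip):

```python
import jax
import jax.numpy as jnp

def flash_attention(q, k, v, chunk_size=128):
    """Attention via a streaming (online) softmax over key/value chunks.

    q: (n, d); k, v: (m, d); m must be divisible by chunk_size here.
    """
    n, d = q.shape
    kc = k.reshape(-1, chunk_size, d)
    vc = v.reshape(-1, chunk_size, d)

    def step(carry, kv):
        acc, row_max, row_sum = carry
        k_blk, v_blk = kv
        s = q @ k_blk.T / jnp.sqrt(d)                 # (n, chunk) block scores
        new_max = jnp.maximum(row_max, s.max(-1, keepdims=True))
        correction = jnp.exp(row_max - new_max)       # rescale earlier partials
        p = jnp.exp(s - new_max)
        acc = acc * correction + p @ v_blk
        row_sum = row_sum * correction + p.sum(-1, keepdims=True)
        return (acc, new_max, row_sum), None

    init = (jnp.zeros((n, d)), jnp.full((n, 1), -jnp.inf), jnp.zeros((n, 1)))
    (acc, _, row_sum), _ = jax.lax.scan(step, init, (kc, vc))
    return acc / row_sum

q = k = v = jax.random.normal(jax.random.PRNGKey(0), (256, 64))
out = flash_attention(q, k, v)
```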
Some personal experiments around routing tokens to different autoregressive attention, akin to mixture-of-experts
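A dense, easy-to-read (not efficient) sketch of top-1 token routing over attention "experts", simplified here so that experts share one attention map and differ only in their value projection; every name below is illustrative:

```python
import jax
import jax.numpy as jnp

def routed_attention(x, w_router, w_values):
    """Top-1 routing of tokens over attention experts, dense dispatch.

    x: (n, d); w_router: (d, E); w_values: (E, d, d).
    Every token is processed by all experts, then masked to its chosen one.
    """
    n, d = x.shape
    num_experts = w_router.shape[-1]
    attn = jax.nn.softmax(x @ x.T / jnp.sqrt(d), axis=-1)   # shared attention map
    values = jnp.einsum('nd,edf->enf', x, w_values)         # per-expert values
    expert_out = jnp.einsum('ij,ejd->eid', attn, values)    # (E, n, d)
    gate = jax.nn.one_hot(jnp.argmax(x @ w_router, axis=-1), num_experts)  # (n, E)
    return jnp.einsum('eid,ie->id', expert_out, gate)

key = jax.random.PRNGKey(0)
x = jax.random.normal(key, (32, 16))
w_router = jax.random.normal(key, (16, 4))
w_values = jax.random.normal(key, (4, 16, 16)) * 0.1
out = routed_attention(x, w_router, w_values)               # (32, 16)
```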
Implementation of Memory-Compressed Attention, from the paper "Generating Wikipedia by Summarizing Long Sequences"
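The mechanism compresses keys and values along the sequence axis before attention, shrinking the score matrix. The paper uses a strided convolution for compression; the sketch below substitutes parameter-free strided mean pooling to stay short:

```python
import jax
import jax.numpy as jnp

def memory_compressed_attention(q, k, v, compress=4):
    """Attention against keys/values compressed along the sequence axis.

    q: (n, d); k, v: (m, d); m must be divisible by `compress`.
    """
    d = q.shape[-1]
    k = k.reshape(-1, compress, d).mean(axis=1)   # (m // compress, d)
    v = v.reshape(-1, compress, d).mean(axis=1)
    attn = jax.nn.softmax(q @ k.T / jnp.sqrt(d), axis=-1)
    return attn @ v

q = jax.random.normal(jax.random.PRNGKey(0), (64, 32))
k = v = jax.random.normal(jax.random.PRNGKey(1), (256, 32))
out = memory_compressed_attention(q, k, v)        # scores are (64, 64) instead of (64, 256)
```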
Flax is a neural network library for JAX that is designed for flexibility.
The ✨Magical✨ JAX ML Library.
Minimal JAX/Flax port of `lpips` supporting `vgg16`, with pre-trained weights stored in the 🤗 Hugging Face Hub
General Modules for JAX
Implementation of Nyström self-attention, from the paper "Nyströmformer: A Nyström-based Algorithm for Approximating Self-Attention"
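The Nyström approximation replaces the full softmax attention matrix with three small softmax matrices built from landmark (segment-mean) queries and keys. A sketch using `jnp.linalg.pinv` in place of the paper's iterative Moore-Penrose pseudoinverse:

```python
import jax
import jax.numpy as jnp

def nystrom_attention(q, k, v, num_landmarks=8):
    """Nyström approximation of softmax attention with segment-mean landmarks.

    q, k, v: (n, d); n must be divisible by num_landmarks here.
    """
    n, d = q.shape
    scale = 1.0 / jnp.sqrt(d)
    # Landmarks: mean of each contiguous segment of queries/keys.
    q_l = q.reshape(num_landmarks, -1, d).mean(axis=1)
    k_l = k.reshape(num_landmarks, -1, d).mean(axis=1)
    f = jax.nn.softmax(q @ k_l.T * scale, axis=-1)     # (n, L)
    a = jax.nn.softmax(q_l @ k_l.T * scale, axis=-1)   # (L, L)
    b = jax.nn.softmax(q_l @ k.T * scale, axis=-1)     # (L, n)
    return f @ jnp.linalg.pinv(a) @ (b @ v)            # ~ softmax(q k^T / sqrt(d)) @ v

q = k = v = jax.random.normal(jax.random.PRNGKey(0), (64, 32))
out = nystrom_attention(q, k, v)
```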
Simple implementation of a FAVOR attention layer (the linear-attention mechanism from the Performer paper)
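FAVOR+ approximates the softmax kernel with positive random features, which lets attention be computed in linear time by re-associating the matrix products. A simplified sketch (the paper additionally uses orthogonal features and periodic redraws):

```python
import jax
import jax.numpy as jnp

def favor_attention(q, k, v, key, num_features=64):
    """Linear-time attention with FAVOR+ positive random features.

    phi(x) = exp(w.x - |x|^2 / 2) / sqrt(m) gives an unbiased softmax-kernel
    estimate; attention then costs O(n * m_feat * d) instead of O(n^2 * d).
    """
    d = q.shape[-1]
    w = jax.random.normal(key, (num_features, d))   # plain Gaussian features here

    def phi(x):
        x = x / d ** 0.25                           # folds in the 1/sqrt(d) scaling
        return jnp.exp(x @ w.T - (x ** 2).sum(-1, keepdims=True) / 2) / jnp.sqrt(num_features)

    q_p, k_p = phi(q), phi(k)                       # (n, m_feat)
    kv = k_p.T @ v                                  # (m_feat, d): summarize keys/values once
    normalizer = q_p @ k_p.sum(axis=0, keepdims=True).T   # (n, 1)
    return (q_p @ kv) / normalizer

q = k = v = jax.random.normal(jax.random.PRNGKey(0), (128, 32))
out = favor_attention(q, k, v, jax.random.PRNGKey(1))
```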