Reimplementation of the paper "Attention, Learn to Solve Routing Problems!" in JAX/Flax.
MIT License
Implementation of Flash Attention in JAX
Elegant easy-to-use neural networks + scientific computing in JAX. https://docs.kidger.site/equinox/
Fréchet Inception Distance in JAX
JAX port of Persistent Independent Particles
JAX implementation of VQGAN
JAX implementation of ViT-VQGAN
GPT implementation in Flax
PyTorch implementation of the renowned "Attention Is All You Need" paper - NeurIPS 2017
Rigid transforms + Lie groups in JAX
Minimal JAX/Flax port of `lpips` supporting `vgg16`, with pre-trained weights stored in the 🤗 Hug...
Implementation of N-Grammer in Flax
Unofficial implementation of Tensorial Radiance Fields (Chen et al. '22)
Flax is a neural network library for JAX that is designed for flexibility.
A GPT, made only of MLPs, in JAX