This repository contains the Flax model of min(DALL·E) and code for converting it to PyTorch
MIT License
Flax is a neural network library for JAX that is designed for flexibility.
Elegant PyTorch implementation of the paper Model-Agnostic Meta-Learning (MAML)
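The idea behind MAML can be sketched in a few lines: adapt the parameters to each task with one gradient step, then update the meta-parameters against the post-adaptation losses. The sketch below is pure Python with a finite-difference gradient standing in for autodiff; the scalar tasks, function names, and learning rates are all illustrative, not the paper's setup.

```python
def grad(f, x, eps=1e-6):
    # central-difference numerical gradient (stand-in for autodiff)
    return (f(x + eps) - f(x - eps)) / (2 * eps)

def maml_step(theta, tasks, inner_lr=0.1, meta_lr=0.05):
    """One MAML meta-update over scalar tasks.

    Each task is a loss function. The inner loop adapts theta per task;
    the outer loop differentiates *through* that adaptation, so the
    meta-gradient accounts for the inner update itself.
    """
    meta_grad = 0.0
    for loss in tasks:
        # meta-gradient: d loss(theta - inner_lr * d loss/d theta) / d theta
        adapted_loss = lambda t: loss(t - inner_lr * grad(loss, t))
        meta_grad += grad(adapted_loss, theta)
    return theta - meta_lr * meta_grad / len(tasks)

# Two toy tasks whose optima sit at 2 and 4: the meta-update pulls
# theta toward a point from which both are quickly reachable.
theta = maml_step(0.0, [lambda x: (x - 2) ** 2, lambda x: (x - 4) ** 2])
```

For these quadratic tasks the meta-gradient can be checked by hand: a single step from theta = 0 moves it to about 0.19, i.e. toward the region between both task optima.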
LL3M: Large Language and Multi-Modal Model in Jax
Open weights LLM from Google DeepMind.
Graphical user interface for deep learning that makes algorithms accessible and callable with one click
pix2tex: Using a ViT to convert images of equations into LaTeX code.
Explore large language models in 512MB of RAM
Convert ONNX models to plain C/ASM code
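The core trick behind ONNX-to-C converters is to bake a model's weights into standalone, dependency-free C source. The toy generator below illustrates that idea for a single dense layer; it is a hypothetical sketch, not the converter's actual output format, and the function and array names are made up.

```python
def emit_dense_c(name, weights, bias):
    """Toy code generator in the spirit of ONNX-to-C converters:
    serialize a dense layer's weights into C arrays and emit a
    plain-C function computing out = W @ in + b."""
    n_out, n_in = len(weights), len(weights[0])
    w = ", ".join(f"{v}f" for row in weights for v in row)
    b = ", ".join(f"{v}f" for v in bias)
    return f"""\
static const float {name}_w[{n_out * n_in}] = {{{w}}};
static const float {name}_b[{n_out}] = {{{b}}};
void {name}(const float in[{n_in}], float out[{n_out}]) {{
    for (int o = 0; o < {n_out}; ++o) {{
        out[o] = {name}_b[o];
        for (int i = 0; i < {n_in}; ++i)
            out[o] += {name}_w[o * {n_in} + i] * in[i];
    }}
}}
"""

# Emit C source for a 2-in, 1-out layer with fixed weights.
src = emit_dense_c("fc1", [[1.0, 2.0]], [0.5])
```

A real converter walks the ONNX graph and emits one such kernel per node; the payoff is a binary with no runtime dependency at all.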
Unofficial PyTorch implementation of the paper Masked Autoencoders Are Scalable Vision Learners
Minimal JAX/Flax port of `lpips` supporting `vgg16`, with pre-trained weights stored on the 🤗 Hub
GPT implementation in Flax
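The piece that makes a GPT a left-to-right language model is causal self-attention: each position may attend only to itself and earlier positions. A minimal single-head sketch, written in NumPy rather than Flax for brevity; all shapes and weight names are illustrative assumptions.

```python
import numpy as np

def causal_self_attention(x, Wq, Wk, Wv):
    """Single-head causal self-attention over a (T, d) sequence.

    An upper-triangular mask sets attention scores to -inf for
    future positions, so the softmax gives them zero weight.
    """
    T, d = x.shape
    q, k, v = x @ Wq, x @ Wk, x @ Wv
    scores = q @ k.T / np.sqrt(d)
    mask = np.triu(np.ones((T, T), dtype=bool), k=1)  # strictly future positions
    scores = np.where(mask, -np.inf, scores)
    # numerically stable softmax over each row
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v

rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))
Wq, Wk, Wv = (rng.normal(size=(8, 8)) for _ in range(3))
out = causal_self_attention(x, Wq, Wk, Wv)
```

Because position 0 can only attend to itself, its output is exactly its own value vector, which is a quick sanity check on the mask.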
LLaMA: Open and Efficient Foundation Language Models
Implementation of Open-Set Likelihood Maximization for Few-Shot Learning