This repository provides the code and model checkpoints for the research paper: Scalable Pre-training of Large Autoregressive Image Models
[ICLR'23 Spotlight🔥] The first successful BERT/MAE-style pretraining on any convolutional network...
An Extensible Toolkit for Finetuning and Inference of Large Foundation Models. Large Models for All.
🦙 LaMa Image Inpainting, Resolution-robust Large Mask Inpainting with Fourier Convolutions, WACV...
Train high-quality text-to-image diffusion models in a data- and compute-efficient manner
PoolFormer: MetaFormer Is Actually What You Need for Vision (CVPR 2022 Oral)