MSA Transformer reproduction code
Home of StarCoder: fine-tuning & inference!
Solution for the Sartorius - Cell Instance Segmentation competition
Visual Attention based OCR
GPT implementation in Flax
Fast and memory-efficient exact attention
Unsupervised Language Modeling at scale for robust sentiment classification
This package contains deep learning models and related scripts for RoseTTAFold
Tensorflow implementation of contextualized word representations from bi-directional language models
(Unofficial) PyTorch implementation of grouped-query attention (GQA) from "GQA: Training Generali...
An implementation of Performer, a linear attention-based transformer, in Pytorch
Implementation of SE3-Transformers for Equivariant Self-Attention, in Pytorch. This specific repo...
Simple implementation of FAVOR attention layer
Pytorch reimplementation of Molecule Attention Transformer, which uses a transformer to tackle th...
View model summaries in PyTorch!