A collection of my work in Natural Language Processing
Sequence modeling benchmarks and temporal convolutional networks
Public release of the TransCoder research project https://arxiv.org/pdf/2006.03511.pdf
Sequence to Sequence from Scratch Using PyTorch
Transformer seq2seq model: a program that builds a language translator from a parallel corpus
Training scripts and instructions on how to reproduce our systems submitted to the NEWS 2018 Task on...
Sequence to Sequence Learning with Keras
Meta's "No Language Left Behind" models served as web app and REST API
Visualization for Sequential Neural Networks with Attention
Plug and Play Language Model implementation. Allows steering the topic and attributes of GPT-2 models.
Transformer models implementation for training from scratch.
An implementation of masked language modeling for PyTorch, made as concise and simple as possible
Easy to use, state-of-the-art Neural Machine Translation for 100+ languages
Explore large language models in 512MB of RAM
Improving Language Model Performance through Smart Vocabularies
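Several of the projects above center on masked language modeling. As a minimal sketch of the core idea (not taken from any of the listed repos), the following function applies BERT-style dynamic masking to a token sequence: roughly 15% of positions are selected, and of those, 80% become a mask token, 10% become a random vocabulary token, and 10% are left unchanged. The function name, parameters, and `[MASK]` string are illustrative assumptions, not an API from the repos listed.

```python
import random

def mask_tokens(tokens, mask_token="[MASK]", vocab=None, mask_prob=0.15, seed=0):
    """BERT-style masking sketch (hypothetical helper, not from the repos above).

    Selects ~mask_prob of positions as prediction targets; of those,
    80% are replaced by mask_token, 10% by a random vocabulary token,
    and 10% are kept unchanged. Returns (masked_tokens, labels), where
    labels holds the original token at selected positions and None elsewhere.
    """
    rng = random.Random(seed)          # seeded RNG for reproducibility
    vocab = vocab or tokens            # fall back to the input as a toy vocabulary
    masked, labels = [], []
    for tok in tokens:
        if rng.random() < mask_prob:   # this position becomes a prediction target
            labels.append(tok)
            r = rng.random()
            if r < 0.8:
                masked.append(mask_token)      # 80%: replace with [MASK]
            elif r < 0.9:
                masked.append(rng.choice(vocab))  # 10%: replace with a random token
            else:
                masked.append(tok)             # 10%: keep the original token
        else:
            labels.append(None)        # not a target; loss is ignored here
            masked.append(tok)
    return masked, labels
```

A model trained with this objective only computes loss at positions where `labels` is not `None`, which is what lets it learn bidirectional context without trivially copying the input.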