A minimal NMT example to serve as a seq2seq+attention reference.
Natural Language Processing Tutorial for Deep Learning Researchers
Pipeline for training a Stanford Seq2Seq Neural Machine Translation model using PyTorch.
A PyTorch implementation of the Transformer model in "Attention is All You Need".
Tutorials on implementing a few sequence-to-sequence (seq2seq) models with PyTorch and TorchText.
Transformer: PyTorch Implementation of "Attention Is All You Need"
A set of examples around PyTorch in Vision, Text, Reinforcement Learning, etc.
Facebook AI Research Sequence-to-Sequence Toolkit written in Python.
Network-to-Network Translation with Conditional Invertible Neural Networks
Sequence-to-Sequence from Scratch Using PyTorch
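
The seq2seq+attention and Transformer projects listed above all share one core operation. As a minimal illustration (a NumPy sketch written for this list, not code taken from any of these repositories), scaled dot-product attention from "Attention Is All You Need" can be expressed in a few lines:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax: subtract the row max before exponentiating.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_attention(q, k, v):
    """q: (tgt_len, d), k and v: (src_len, d).

    Returns the context vectors (tgt_len, d) and the attention
    weights (tgt_len, src_len), whose rows each sum to 1.
    """
    d = q.shape[-1]
    scores = q @ k.T / np.sqrt(d)        # similarity of each query to each key
    weights = softmax(scores, axis=-1)   # normalize over source positions
    return weights @ v, weights

rng = np.random.default_rng(0)
q = rng.standard_normal((2, 4))   # 2 decoder (query) positions
k = rng.standard_normal((3, 4))   # 3 encoder (key) positions
v = rng.standard_normal((3, 4))
context, weights = scaled_dot_product_attention(q, k, v)
print(context.shape, weights.shape)  # (2, 4) (2, 3)
```

The listed PyTorch implementations wrap the same computation in learned projections and multiple heads; this sketch only shows the attention step itself.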