Pipeline for training a Stanford-style seq2seq neural machine translation model using PyTorch.
Tutorials on implementing a few sequence-to-sequence (seq2seq) models with PyTorch and TorchText.
PyTorch implementations of various deep NLP models from CS224n (Stanford University).
Official PyTorch tutorials in Chinese, covering the 60-minute blitz, intermediate tutorials, computer vision, natural language processing, generative adversarial networks, and reinforcement learning. Stars and forks welcome!
NLP research experiments, built on PyTorch within the AllenNLP framework.
Sequence-to-Sequence from Scratch Using PyTorch
Natural Language Processing Tutorial for Deep Learning Researchers
A PyTorch implementation of the Transformer model in "Attention is All You Need".
Pipeline for training Language Models using PyTorch.
Pipeline for training NER models using PyTorch.
A set of examples around PyTorch in Vision, Text, Reinforcement Learning, etc.
Models, data loaders and abstractions for language processing, powered by PyTorch
A minimal NMT example to serve as a seq2seq+attention reference.
Different Deep Learning tasks solved using PyTorch models.
Trankit is a light-weight Transformer-based Python toolkit for multilingual natural language processing.
Facebook AI Research Sequence-to-Sequence Toolkit written in Python.