MASS: Masked Sequence to Sequence Pre-training for Language Generation
Unified-Modal Speech-Text Pre-Training for Spoken Language Processing
This repository contains resources for accessing the official benchmarks, code, and checkpoints ...
An efficient implementation of the popular sequence models for text generation, summarization, an...