GPT2 for Multiple Languages, including pretrained models (multilingual GPT-2 support; 1.5-billion-parameter Chinese pretrained model)
TensorFlow code and pre-trained models for BERT
Lingvo
Pre-Training with Whole Word Masking for Chinese BERT (the Chinese BERT-wwm model series)