A DDP version of bert_seq2seq, supporting models such as bert, roberta, nezha, t5, and gpt2, and tasks such as seq2seq, NER, and relation extraction; launch DDP multi-GPU training easily with no extra code.
Python library for automatic training, optimization and comparison of Transformer models on most ...
🌈 NERpy: Implementation of Named Entity Recognition using Python. A NER toolkit supporting BertSoftmax, BertSpa...
A repository for training transformer-based models
pytextclassifier is a toolkit for text classification, supporting LR, XGBoost, TextCNN, FastText, TextRNN, B...
Pytorch-Named-Entity-Recognition-with-BERT
A PyTorch implementation of BERT for seq2seq tasks using the UniLM scheme; now also handles automatic summarization, text classification, sentiment analysis, NER, and POS tagging, supports the t5 model, and supports article continuation with GPT2.
Chinese NLP solutions (large models, data, models, training, inference)
Google AI 2018 BERT pytorch implementation
This repository contains demos I made with the Transformers library by HuggingFace.
Pre-Training with Whole Word Masking for Chinese BERT (Chinese BERT-wwm series models)
Pytorch❤️ Keras 😋😋