```bash
pip install bert-multitask-learning
```
This is a project that uses BERT for multi-task learning, with multi-GPU support.
In the original BERT code, neither multi-task learning nor multi-GPU training is possible. In addition, the original motivation for this project was NER, which does not have a working training script in the original BERT code.
To sum up, compared to the original BERT repo, this repo has the following features:
- Multi-task learning (the major reason for rewriting the majority of the code)
- Multi-GPU training
- Support for sequence labeling (for example, NER) and encoder-decoder seq2seq (with a transformer decoder)
- Masked LM and next sentence prediction pre-training (`pretrain`)
- Classification (`cls`)
- Sequence labeling (`seq_tag`)
- Seq2seq labeling (`seq2seq_tag`)
- Seq2seq text generation (`seq2seq_text`)
- Multi-label classification (`multi_cls`)
- Chinese named entity recognition
- Chinese word segmentation
- Chinese part-of-speech tagging
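As a quick illustration of how the problem codes above come together, here is a minimal sketch of launching training with a problem string. The entry point `train_bert_multitask` and its arguments are assumptions for illustration and may not match the current API; see the notebooks for the authoritative usage.

```python
# Hedged sketch: the function name and arguments below are assumptions,
# not guaranteed to match the current version of this library.
from bert_multitask_learning import train_bert_multitask

train_bert_multitask(
    problem='weibo_ner&weibo_cws',  # two problems sharing the same inputs
    num_gpus=2,                     # multi-GPU training
    num_epochs=10,
)
```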
There are two types of chaining operations that can be used to chain problems:

- `&`: If two problems have the same inputs, they can be chained using `&`. Problems chained by `&` will be trained at the same time.
- `|`: If two problems don't have the same inputs, they need to be chained using `|`. Problems chained by `|` will be sampled for training at every instance.
For example, given `cws|NER|weibo_ner&weibo_cws`, one problem chunk will be sampled at each turn; say `weibo_ner&weibo_cws` is sampled, then `weibo_ner` and `weibo_cws` will be trained together for this turn. Therefore, in a particular batch, some tasks might not be sampled, and their loss could be 0 in that batch. The sketch below illustrates this sampling behavior.
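To make the sampling semantics concrete, here is a small self-contained sketch (not code from this repo) that parses a problem string and picks one `&`-chunk per turn. The real implementation may weight the sampling, e.g. by dataset size; uniform sampling here is an assumption for illustration.

```python
import random

def sample_problem_chunk(problem_string):
    """Parse a string like 'cws|NER|weibo_ner&weibo_cws'.

    '|' separates chunks that are sampled one at a time per turn;
    '&' joins problems inside a chunk that are trained together.
    """
    chunks = [chunk.split('&') for chunk in problem_string.split('|')]
    # Assumption: uniform sampling; the library may weight by dataset size.
    return random.choice(chunks)

for turn in range(3):
    active = sample_problem_chunk('cws|NER|weibo_ner&weibo_cws')
    print(f'turn {turn}: training {active} together')
```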
Please see the examples in the notebooks for more details about training, evaluating and exporting models.