Method: great

Date: 2020-06-05

Authors: Tao Yue

Affiliation: SHU

Description: BERT + position + BiLSTM + CRF

Reference: "BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding"
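The description names a BERT + position + BiLSTM + CRF stack, which is a common sequence-labeling architecture: contextual token embeddings, a bidirectional LSTM over them, and a CRF layer to decode the best tag path. Below is a minimal, self-contained PyTorch sketch of that pipeline; it is not the repository's actual code. To stay runnable without downloads, a toy `nn.Embedding` stands in for BERT (a real setup would use `transformers.BertModel`), and the CRF part is reduced to a transition matrix plus Viterbi decoding. All class and parameter names here are illustrative assumptions.

```python
import torch
import torch.nn as nn

class BertPosBiLSTMCRF(nn.Module):
    """Sketch of the described stack. The token embedding is a stand-in
    for BERT; `pos_emb` is a learned position embedding; `trans` holds
    CRF transition scores trans[i, j] = score of moving tag i -> tag j."""

    def __init__(self, vocab_size, num_tags, emb_dim=32, hidden=32, max_len=64):
        super().__init__()
        self.tok_emb = nn.Embedding(vocab_size, emb_dim)   # stand-in for BERT
        self.pos_emb = nn.Embedding(max_len, emb_dim)      # position embedding
        self.bilstm = nn.LSTM(emb_dim, hidden,
                              batch_first=True, bidirectional=True)
        self.emit = nn.Linear(2 * hidden, num_tags)        # emission scores
        self.trans = nn.Parameter(torch.zeros(num_tags, num_tags))

    def forward(self, tokens):
        # tokens: (batch, seq_len) of token ids -> (batch, seq_len, num_tags)
        pos = torch.arange(tokens.size(1), device=tokens.device)
        x = self.tok_emb(tokens) + self.pos_emb(pos)
        h, _ = self.bilstm(x)
        return self.emit(h)

    @torch.no_grad()
    def viterbi_decode(self, emissions):
        # emissions: (seq_len, num_tags) for one sequence;
        # returns the highest-scoring tag sequence under the CRF.
        score = emissions[0]
        backpointers = []
        for t in range(1, emissions.size(0)):
            # total[i, j] = best score ending in tag i, then i -> j
            total = score.unsqueeze(1) + self.trans + emissions[t].unsqueeze(0)
            score, idx = total.max(dim=0)
            backpointers.append(idx)
        best = [int(score.argmax())]
        for idx in reversed(backpointers):
            best.append(int(idx[best[-1]]))
        return best[::-1]

model = BertPosBiLSTMCRF(vocab_size=100, num_tags=4)
emissions = model(torch.randint(0, 100, (1, 6)))[0]  # one sequence, 6 tokens
path = model.viterbi_decode(emissions)               # best tag id per token
```

Training such a model would additionally need the CRF negative log-likelihood (forward algorithm) as the loss; only decoding is sketched here.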