
Chinese Named Entity Recognition Based on BSTTC Model
(Original title: 基于BSTTC模型的中文命名实体识别)

Abstract: In most Chinese named entity recognition models, language preprocessing focuses only on the vector representations of individual words and characters and ignores the semantic relationships between them, so it cannot resolve polysemy. The Transformer feature extractor, with its parallel computation and long-range modeling, has improved many natural language understanding tasks, but its fully connected structure makes the computational complexity quadratic in the input length, which limits its effectiveness for Chinese named entity recognition. To address these problems, a Chinese named entity recognition method based on the BSTTC (BERT-Star-Transformer-TextCNN-CRF) model is proposed. First, a BERT model pre-trained on a large-scale corpus dynamically generates a character vector sequence according to the input context. Then, a joint Star-Transformer and TextCNN model further extracts sentence features. Finally, the feature vector sequence is fed into a CRF model to obtain the final predictions. Experimental results on the MSRA Chinese corpus show that the precision, recall, and F1 score of this model are all higher than those of previous models. Compared with the BERT-Transformer-CRF model, training time is reduced by about 65%.
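To make the pipeline concrete, below is a minimal PyTorch sketch of the BSTTC architecture described in the abstract (BERT character vectors, then Star-Transformer, then TextCNN, then tag scores). It is an illustration under stated assumptions, not the authors' implementation: the Star-Transformer block is reduced to a simplified ring-plus-relay attention round, the TextCNN branch uses same-padded 1D convolutions so every character keeps a feature vector, and a greedy argmax decoder stands in for the CRF layer. All class and parameter names (SimpleStarTransformerLayer, BSTTCSketch, d_model, num_tags, and so on) are hypothetical.

# Minimal, illustrative sketch of the BSTTC pipeline (assumptions noted above).
import torch
import torch.nn as nn
import torch.nn.functional as F

class SimpleStarTransformerLayer(nn.Module):
    """One simplified round of Star-Transformer message passing: every
    satellite (character) node attends to itself, its ring neighbours and the
    shared relay node; the relay node then attends to all satellites."""
    def __init__(self, d_model, n_heads=4):
        super().__init__()
        self.satellite_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.relay_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.norm_s = nn.LayerNorm(d_model)
        self.norm_r = nn.LayerNorm(d_model)

    def forward(self, h, relay):
        # h: (B, L, D) satellite states; relay: (B, 1, D) relay state
        B, L, D = h.shape
        left = torch.roll(h, shifts=1, dims=1)
        right = torch.roll(h, shifts=-1, dims=1)
        # Local context for each satellite: itself, ring neighbours, relay node.
        context = torch.stack([h, left, right, relay.expand(B, L, D)], dim=2)  # (B, L, 4, D)
        query = h.reshape(B * L, 1, D)
        keyval = context.reshape(B * L, 4, D)
        sat, _ = self.satellite_attn(query, keyval, keyval)
        h = self.norm_s(sat.reshape(B, L, D))
        new_relay, _ = self.relay_attn(relay, h, h)   # relay attends to all satellites
        return h, self.norm_r(new_relay)

class BSTTCSketch(nn.Module):
    def __init__(self, d_model=256, num_tags=7, kernel_sizes=(2, 3, 4),
                 num_filters=64, rounds=2):
        super().__init__()
        self.star_layers = nn.ModuleList(
            [SimpleStarTransformerLayer(d_model) for _ in range(rounds)])
        # TextCNN branch: same-padded Conv1d so each character keeps a feature vector.
        self.convs = nn.ModuleList(
            [nn.Conv1d(d_model, num_filters, k, padding=k // 2) for k in kernel_sizes])
        self.emit = nn.Linear(num_filters * len(kernel_sizes), num_tags)

    def forward(self, char_embeddings):
        # char_embeddings: (B, L, D), e.g. last hidden states of a Chinese BERT.
        h = char_embeddings
        relay = h.mean(dim=1, keepdim=True)            # initialise relay by mean pooling
        for layer in self.star_layers:
            h, relay = layer(h, relay)
        x = h.transpose(1, 2)                          # (B, D, L) for Conv1d
        feats = [F.relu(conv(x))[:, :, :h.size(1)] for conv in self.convs]
        feats = torch.cat(feats, dim=1).transpose(1, 2)  # (B, L, filters * kernels)
        return self.emit(feats)                        # per-character emission scores

if __name__ == "__main__":
    fake_bert_output = torch.randn(2, 20, 256)  # stand-in for BERT character vectors
    emissions = BSTTCSketch()(fake_bert_output)
    tags = emissions.argmax(-1)                 # greedy decoding in place of CRF/Viterbi
    print(emissions.shape, tags.shape)          # (2, 20, 7) and (2, 20)

In a full implementation, the stand-in input would be replaced by the last hidden states of a pre-trained Chinese BERT, and the emission scores would be passed to a CRF layer for Viterbi decoding, as the paper describes.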
Authors: SHEN Hui (申晖), ZHANG Ying-Jun (张英俊), XIE Bin-Hong (谢斌红), ZHAO Hong-Yan (赵红燕) (School of Computer Science and Technology, Taiyuan University of Science and Technology, Taiyuan 030024, China)
Source: Computer Systems & Applications (《计算机系统应用》), 2021, No. 6, pp. 262-270 (9 pages)
Funding: Key Project of the Shanxi Provincial Key Research and Development Program (201703D111027); Shanxi Provincial Key Research and Development Program (201803D121048, 201803D121055)
Keywords: BERT; Star-Transformer; named entity recognition; TextCNN; Conditional Random Fields (CRF)