
Multi-task Hierarchical Fine-tuning Model Toward Machine Reading Comprehension

Cited by: 2
Abstract: Machine reading comprehension and question answering have long been considered core problems of natural language understanding, requiring a model to select the best answer from a given passage and question. With the rise of pre-trained language models such as BERT, major breakthroughs have been achieved across natural language processing (NLP) tasks, but shortcomings remain on complex reading comprehension tasks. To address this, this study proposes a machine reading comprehension model based on a retrospective reader. The model uses the pre-trained RoBERTa model to encode the question and passage, and splits the reading comprehension component into two modules: a word-level intensive reading module and a sentence-level comprehensive reading module. The two modules capture the semantic information of the passage and question at two different granularities, and their predictions are finally combined to output the answer with the highest probability. On the CAIL2020 dataset, the model achieves a joint F1 score of 66.15%, 5.38% higher than that of the RoBERTa baseline, and ablation experiments confirm the effectiveness of the model.
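The architecture sketched in the abstract (a shared RoBERTa encoder fine-tuned with two task heads at different granularities) can be illustrated compactly. The following is a minimal PyTorch sketch, not the authors' implementation: the class and attribute names, the choice of checkpoint (hfl/chinese-roberta-wwm-ext, a Chinese RoBERTa commonly used for CAIL-style tasks), and the simple late fusion of the two heads are all assumptions for illustration.

# Minimal sketch of the two-granularity retrospective reader described
# in the abstract. Assumptions (not from the paper): the checkpoint name,
# all class/attribute names, and the simple answerability-gated fusion.
import torch
import torch.nn as nn
from transformers import AutoModel, AutoTokenizer

class MultiGranularityReader(nn.Module):
    """One RoBERTa encoder shared by two heads (multi-task fine-tuning)."""

    def __init__(self, model_name: str = "hfl/chinese-roberta-wwm-ext"):
        super().__init__()
        self.encoder = AutoModel.from_pretrained(model_name)
        hidden = self.encoder.config.hidden_size
        # Word-level "intensive reading" head: start/end logits per token.
        self.span_head = nn.Linear(hidden, 2)
        # Sentence-level "comprehensive reading" head: one answerability
        # score from the pooled [CLS] representation.
        self.cls_head = nn.Linear(hidden, 1)

    def forward(self, input_ids, attention_mask):
        out = self.encoder(input_ids=input_ids, attention_mask=attention_mask)
        tokens = out.last_hidden_state                      # (B, T, H)
        start_logits, end_logits = self.span_head(tokens).split(1, dim=-1)
        answerable = self.cls_head(tokens[:, 0])            # (B, 1), [CLS]
        return start_logits.squeeze(-1), end_logits.squeeze(-1), answerable

# Usage sketch: encode a (question, passage) pair and read off both heads.
tok = AutoTokenizer.from_pretrained("hfl/chinese-roberta-wwm-ext")
model = MultiGranularityReader()
batch = tok("模型的输入是什么?", "模型的输入是问题与文章。", return_tensors="pt")
start, end, answerable = model(batch["input_ids"], batch["attention_mask"])
# A span is emitted only when the sentence-level head deems the passage
# answerable; otherwise the model abstains (the "combine then output" step).
if answerable.item() > 0:
    s, e = start.argmax(-1).item(), end.argmax(-1).item()
    print(tok.decode(batch["input_ids"][0][s:e + 1]))

In this sketch the two heads are trained jointly against the shared encoder, which is one common reading of "multi-task hierarchical fine-tuning"; the paper's exact fusion of the word-level and sentence-level predictions may differ.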
Authors: DING Mei-Rong, LIU Hong-Ye, XU Ma-Yi, GONG Si-Yu, CHEN Xiao-Min, ZENG Bi-Qing (School of Software, South China Normal University, Foshan 528225, China)
Source: Computer Systems & Applications (《计算机系统应用》), 2022, No. 3, pp. 212-219 (8 pages)
Funding: National Natural Science Foundation of China (61876067); Special Project in Key Fields of Artificial Intelligence for Guangdong Universities (2019KZDZX1033); Construction Project of the Guangdong Provincial Key Laboratory of Cyber-Physical Systems (2020B1212060069).
Keywords: natural language processing; machine reading comprehension; multi-task learning; pre-trained language model; hierarchical fine-tuning