
Text Classification Method Based on Word2Vec and AlexNet-2 with Improved Attention Mechanism (Cited by: 12)
Abstract: To improve the accuracy and running efficiency of text classification, a method combining Word2Vec text representation with an AlexNet-2 model augmented by an improved attention mechanism is proposed. First, Word2Vec is used to embed the word features of the text and train word vectors, representing the text as distributed vectors. Then, an improved AlexNet-2 effectively encodes long-distance word dependencies, while an attention mechanism added to the model efficiently learns the contextual embedding semantics of target words and adjusts word weights according to the correlation between the input word vectors and the final prediction. Experiments are evaluated on three public datasets, covering both settings with a large amount and a small amount of labeled samples. The results show that, compared with existing state-of-the-art methods, the proposed method significantly improves both classification performance and running efficiency.
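The pipeline the abstract describes can be illustrated with a minimal NumPy sketch: embedding lookup standing in for trained Word2Vec vectors, a 1-D convolution standing in for the AlexNet-2 encoder, and softmax attention pooling that re-weights time steps by their relevance to the prediction. All names, dimensions, and the attention scoring function here are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical vocabulary and embedding table; in the paper these are
# real Word2Vec vectors trained on the corpus.
vocab = {"the": 0, "movie": 1, "was": 2, "great": 3}
embed_dim = 8
embeddings = rng.normal(size=(len(vocab), embed_dim))

def conv1d(x, kernels, width=3):
    """Valid 1-D convolution over the token axis (stand-in for the
    convolutional layers of AlexNet-2).
    x: (seq_len, embed_dim); kernels: (n_filters, width*embed_dim)."""
    seq_len = x.shape[0]
    windows = np.stack([x[i:i + width].ravel()
                        for i in range(seq_len - width + 1)])
    return np.tanh(windows @ kernels.T)  # (seq_len - width + 1, n_filters)

def attention_pool(h, query):
    """Softmax attention over time steps; the dot-product scoring
    function is an assumption, the paper does not specify it."""
    scores = h @ query                     # one score per time step
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()               # normalized attention weights
    return weights @ h, weights            # pooled feature, weights

tokens = ["the", "movie", "was", "great"]
x = embeddings[[vocab[t] for t in tokens]]     # (4, 8) word vectors

n_filters = 6
kernels = rng.normal(size=(n_filters, 3 * embed_dim))
h = conv1d(x, kernels)                          # (2, 6) encoded windows

query = rng.normal(size=n_filters)
pooled, weights = attention_pool(h, query)      # pooled: (6,) vector
print(pooled.shape, weights)
```

In a full model, `pooled` would feed a softmax classifier, and the attention `weights` would be learned jointly so that words correlated with the final prediction receive higher weight, as the abstract describes.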
Authors: ZHONG Gui-feng; PANG Xiong-wen; SUI Dong (College of Computer Science & Engineering, Guangzhou Institute of Science and Technology, Guangzhou 510540, China; School of Computer, South China Normal University, Guangzhou 530631, China; School of Electrical and Information Engineering, Beijing University of Civil Engineering and Architecture, Beijing 102406, China)
Source: Computer Science (《计算机科学》), CSCD, Peking University Core Journal, 2022, No. 4, pp. 288-293 (6 pages)
Funding: National Natural Science Foundation of China Youth Program (61702026); 2020 Guangdong Province University Research Project (2020GXJK201); 2019 Guangdong Province University Research Project (2019KTSCX243); 2021 Guangdong Higher Education Special Project (2021GXJK275)
Keywords: Text classification; Attention mechanism; AlexNet-2 model; Contextual embedding; Word dependency