Abstract
This paper addresses the problem of how deep neural networks can learn faster and more thoroughly by proposing a Deep Communication Learning (DCL) pattern based on knowledge transfer. In this pattern, multiple neural networks learn independently while exchanging network parameters as knowledge: during training, each network shares what it has learned with the other networks and in turn absorbs a certain proportion of their learning results, alternating between learning on its own and exchanging knowledge within the group. Experimental results on several publicly available datasets show that, compared with learning in isolation, DCL with only two networks improves learning performance by up to 3.44%; increasing the number of participating networks to six yields a further improvement of up to 2.74%. The DCL pattern thus helps train neural networks with better performance.
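The abstract describes the DCL alternation only in prose. As a rough illustration, a minimal PyTorch sketch of one DCL round might look as follows; the function name dcl_round, the absorb_ratio coefficient, the local_steps schedule, and the equal-weight averaging over the other networks are assumptions for illustration, not details confirmed by the paper.

import copy
import torch

def dcl_round(models, loaders, optimizers, loss_fn,
              absorb_ratio=0.1, local_steps=100):
    """One hypothetical DCL round: each network first learns independently,
    then absorbs a proportion of the other networks' parameters as shared
    knowledge. absorb_ratio and local_steps are illustrative values."""
    # Phase 1: each network trains on its own data for a few steps.
    for model, loader, opt in zip(models, loaders, optimizers):
        model.train()
        it = iter(loader)
        for _ in range(local_steps):
            try:
                x, y = next(it)
            except StopIteration:
                it = iter(loader)
                x, y = next(it)
            opt.zero_grad()
            loss_fn(model(x), y).backward()
            opt.step()

    # Phase 2: knowledge exchange. Each network blends its parameters
    # with the average of the other networks' parameters.
    snapshots = [copy.deepcopy(m.state_dict()) for m in models]
    for i, model in enumerate(models):
        new_state = {}
        for key, own in snapshots[i].items():
            if not torch.is_floating_point(own):
                # Leave integer buffers (e.g. BatchNorm counters) untouched.
                new_state[key] = own
                continue
            others = torch.stack([snapshots[j][key]
                                  for j in range(len(models)) if j != i])
            new_state[key] = ((1 - absorb_ratio) * own
                              + absorb_ratio * others.mean(dim=0))
        model.load_state_dict(new_state)

Here absorb_ratio plays the role of the "certain proportion" of other networks' learning results mentioned in the abstract; its value and the length of each independent-learning phase are hyperparameters the abstract does not specify.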
Authors
ZHANG Renbin; WANG Long; ZHOU Zelin; ZUO Yicong; XIE Zhao
(School of Computer Science and Information Engineering, Hefei University of Technology, Hefei 230601, China; Key Laboratory of Knowledge Engineering with Big Data, Hefei University of Technology, Hefei 230601, China; Anhui Province Key Laboratory of Industry Safety and Emergency Technology, Hefei University of Technology, Hefei 230601, China)
Source
Intelligent Computer and Applications (《智能计算机与应用》)
2022, No. 12, pp. 153-158 (6 pages)
Funding
National Key Research and Development Program of China (2016YFC0801804, 2016YFC0801405)
Fundamental Research Funds for the Central Universities (PA2019GDPK0074)
Keywords
neural network
deep learning
knowledge transfer
network parameter
communication learning