Abstract
Based on the conventional BP algorithm, a modified BP learning algorithm is proposed. First, a term describing the network's complexity is added, so that the algorithm takes the connection complexity into account and redundant connections, and even redundant neurons, may be removed. Second, a method for dynamically adjusting the learning step is presented, which helps avoid the slow learning and repeated oscillation of conventional BP training. Next, the new algorithm is proved to be superlinearly convergent. Finally, simulation results show that, under certain conditions, the new BP algorithm reduces the network's complexity and converges faster than the conventional algorithm.
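The abstract only outlines the two modifications (a complexity term in the cost function and a dynamically adjusted learning step); the exact formulation is given in the paper itself. The following is a minimal illustrative sketch, not the authors' algorithm: it assumes a weight-decay-style penalty with coefficient lam, a simple grow/shrink step heuristic with factors grow and shrink, and a pruning threshold of 1e-2, all of which are hypothetical choices for illustration.

```python
# Sketch of a BP variant with (a) a complexity penalty on the weights, so that
# redundant connections decay toward zero and can be pruned, and (b) a dynamic
# learning step: grow it while the error keeps falling, shrink it when the
# error rises (oscillation). Coefficients lam, grow, shrink and the pruning
# threshold are illustrative assumptions, not values from the paper.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def train(X, T, hidden=8, lam=1e-3, eta=0.5, grow=1.02, shrink=0.5, epochs=5000):
    n_in, n_out = X.shape[1], T.shape[1]
    W1 = rng.normal(scale=0.5, size=(n_in, hidden))
    W2 = rng.normal(scale=0.5, size=(hidden, n_out))
    prev_err = np.inf
    for _ in range(epochs):
        # forward pass
        H = sigmoid(X @ W1)
        Y = sigmoid(H @ W2)
        # error = squared error + complexity (weight-magnitude) term
        err = 0.5 * np.sum((Y - T) ** 2) + 0.5 * lam * (np.sum(W1**2) + np.sum(W2**2))
        # backward pass; gradients include the complexity term lam * W
        dY = (Y - T) * Y * (1 - Y)
        gW2 = H.T @ dY + lam * W2
        dH = (dY @ W2.T) * H * (1 - H)
        gW1 = X.T @ dH + lam * W1
        # dynamic step: accelerate while improving, back off when the error rises
        eta = min(eta * grow, 5.0) if err < prev_err else eta * shrink
        prev_err = err
        W1 -= eta * gW1
        W2 -= eta * gW2
    # prune near-zero (redundant) connections after training
    W1[np.abs(W1) < 1e-2] = 0.0
    W2[np.abs(W2) < 1e-2] = 0.0
    return W1, W2

# usage example: learn XOR
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], float)
T = np.array([[0], [1], [1], [0]], float)
W1, W2 = train(X, T)
print(sigmoid(sigmoid(X @ W1) @ W2).round(2))
```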
Source
《东南大学学报(自然科学版)》
EI
CAS
CSCD
Peking University Core Journals
2001, No. 4, pp. 40-42 (3 pages)
Journal of Southeast University:Natural Science Edition
Keywords
feedforward neural networks
BP learning algorithm
convergence speed
learning step