Abstract
To overcome the slow convergence of feedforward neural network training, its tendency to fall into local optima, and its strong dependence on the initial weights, this paper proposes an iterative chaotic map with infinite collapses particle swarm optimization (ICMICPSO) algorithm, combined with back-propagation, for training the parameters of feedforward neural networks (FNNs). The method makes full use of the error back-propagation and gradient information of the BP algorithm and introduces an ICMIC chaotic particle swarm: the ICMIC particle swarm (ICMICPS) acts as the global optimizer that adjusts the network weights and thresholds until the parameters converge near the global optimum, while gradient-descent information acts as the local optimizer that accelerates refinement at a local scale, so the particles can search the whole space on the basis of global optimization. Simulation experiments comparing the method with several other algorithms show that ICMICPSO-BPNN is clearly superior in both training and generalization ability.
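The paper itself provides no code; the sketch below is only a rough, hypothetical illustration of the hybrid scheme the abstract describes, combining an ICMIC-chaos-perturbed particle swarm (global search) with a gradient step (local search) to train a tiny feedforward network. The ICMIC map is taken in its commonly cited form x_{k+1} = sin(a / x_k); the 2-2-1 sigmoid network on XOR, all PSO coefficients, and the numerical gradient (a stand-in for BP's analytic back-propagation) are illustrative assumptions, not the authors' settings.

# Hypothetical sketch (not the authors' code): ICMIC-chaos-perturbed PSO as the
# global searcher plus a gradient step as the local searcher, training a tiny
# feedforward network. The numerical gradient stands in for BP's analytic one.
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], float)   # XOR toy data
y = np.array([0., 1., 1., 0.])

def unpack(w):                        # 2-2-1 sigmoid network parameters
    W1, b1 = w[:4].reshape(2, 2), w[4:6]
    W2, b2 = w[6:8], w[8]
    return W1, b1, W2, b2

def mse(w):                           # fitness: mean squared error on XOR
    W1, b1, W2, b2 = unpack(w)
    h = 1 / (1 + np.exp(-(X @ W1 + b1)))
    out = 1 / (1 + np.exp(-(h @ W2 + b2)))
    return np.mean((out - y) ** 2)

def num_grad(w, eps=1e-5):            # numerical gradient (stand-in for BP)
    g = np.zeros_like(w)
    for i in range(w.size):
        d = np.zeros_like(w); d[i] = eps
        g[i] = (mse(w + d) - mse(w - d)) / (2 * eps)
    return g

def icmic(x, a=2.0):                  # ICMIC map: x_{k+1} = sin(a / x_k)
    return np.sin(a / x)

n_particles, dim, iters = 20, 9, 300
pos = rng.uniform(-1, 1, (n_particles, dim))
vel = np.zeros((n_particles, dim))
chaos = rng.uniform(0.1, 1.0, (n_particles, dim))        # ICMIC states, nonzero
pbest = pos.copy(); pbest_f = np.array([mse(p) for p in pos])
gbest = pbest[pbest_f.argmin()].copy(); gbest_f = pbest_f.min()

w_in, c1, c2, lr = 0.7, 1.5, 1.5, 0.5                     # assumed coefficients
for t in range(iters):
    chaos = icmic(chaos)                                  # advance chaotic sequence
    r1, r2 = np.abs(chaos), rng.random((n_particles, dim))
    vel = w_in * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = pos + vel                                       # global (swarm) move
    pos -= lr * np.array([num_grad(p) for p in pos])      # local gradient refinement
    f = np.array([mse(p) for p in pos])
    improved = f < pbest_f
    pbest[improved], pbest_f[improved] = pos[improved], f[improved]
    if pbest_f.min() < gbest_f:
        gbest, gbest_f = pbest[pbest_f.argmin()].copy(), pbest_f.min()

print("final MSE:", gbest_f)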
Source
Application Research of Computers (《计算机应用研究》), indexed in CSCD and the Peking University Core Journal list
2014, No. 1, pp. 120-123, 133 (5 pages)
Funding
Supported by the National Natural Science Foundation of China (61074153)
Keywords
feedforward neural networks; back-propagation (BP) neural networks; particle swarm optimization; chaotic map