Application of the Momentum VLBP Neural Network to a Circuit with Defected Ground Structures (动量VLBP神经网络在缺陷接地电路中的应用)
Abstract: An improved back-propagation (BP) training algorithm for artificial neural networks (ANN) is presented. It combines the momentum method with the variable learning rate BP (VLBP) algorithm and updates the weights and biases at each sample point; the resulting algorithm is referred to as the momentum VLBP algorithm. The algorithm is implemented in C and applied to neural-network modeling of a lowpass filter with a novel combinatorial nonperiodic defected ground structure (CNPDGS). The structural dimensions of the CNPDGS and the frequency are used as input samples, and the transmission-coefficient parameters as output samples. Once the model has been trained successfully, it quickly returns accurate transmission coefficients for any structural dimensions and frequencies within the sample range. The results show that the momentum VLBP neural network saves a great deal of time compared with FDTD analysis and, compared with the basic BP algorithm, converges faster and requires less training time.
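The abstract only names the update rule, so the sketch below illustrates one way a per-sample momentum VLBP step could look in C, the language the paper says was used. The structure and function names (VlbpState, vlbp_apply, vlbp_adapt) and the constants rho = 0.7, eta = 1.05, zeta = 1.04 are illustrative assumptions following common VLBP practice (discard the step and shrink the learning rate when the error grows by more than a few percent, enlarge the rate when the error falls); they are not the authors' code.

    /* Sketch of a per-sample momentum VLBP update for one weight vector.
     * All names and constants here are illustrative, not the paper's code. */

    #define N_W 4                    /* number of weights in this toy layer */

    typedef struct {
        double lr;                   /* adaptive learning rate              */
        double momentum;             /* momentum coefficient                */
        double dw[N_W];              /* previous weight increments          */
    } VlbpState;

    /* Tentatively apply one momentum step using the gradient of the
     * current sample's error with respect to the weights. */
    void vlbp_apply(VlbpState *s, double w[N_W], const double grad[N_W])
    {
        for (int i = 0; i < N_W; i++) {
            s->dw[i] = s->momentum * s->dw[i]
                     - (1.0 - s->momentum) * s->lr * grad[i];
            w[i] += s->dw[i];
        }
    }

    /* After re-evaluating the error with the new weights, either keep the
     * step and adapt the learning rate, or undo it (VLBP heuristic). */
    void vlbp_adapt(VlbpState *s, double w[N_W], double e_old, double e_new)
    {
        const double rho  = 0.7;     /* learning-rate decrease factor        */
        const double eta  = 1.05;    /* learning-rate increase factor        */
        const double zeta = 1.04;    /* tolerated relative error growth (4%) */

        if (e_new > zeta * e_old) {
            /* Error grew too much: undo the step, shrink the learning rate,
             * and clear the momentum history. */
            for (int i = 0; i < N_W; i++) {
                w[i] -= s->dw[i];
                s->dw[i] = 0.0;
            }
            s->lr *= rho;
        } else if (e_new < e_old) {
            /* Error decreased: keep the step and raise the learning rate. */
            s->lr *= eta;
        }
        /* If the error grew only slightly, the step is kept and lr stays. */
    }

In the setup described in the abstract, the weights and biases of each layer would be updated in this way after every training sample rather than once per pass over the whole sample set.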
Source: Journal of Microwaves (《微波学报》), CSCD / Peking University Core Journal, 2007, No. 6, pp. 36-39 (4 pages)
Funding: National Natural Science Foundation of China (Grant No. 60371029)
Keywords: artificial neural network; backpropagation algorithm; combinatorial nonperiodic defected ground structure (CNPDGS); momentum variable learning rate backpropagation (VLBP)