Abstract
Hopfield neural networks are usually studied under the assumption that all output response functions are smooth and monotone increasing, whereas the functions encountered in most practical applications are nonsmooth. In this paper, the continuous differentiability assumption commonly imposed on the output response functions of Hopfield neural networks is weakened to a Lipschitz condition. By constructing a Lyapunov functional, a sufficient theorem on the global exponential convergence of solutions of the networks is proved, and several criteria for the global exponential stability of such networks are derived from it. These results substantially improve the main results of recent related papers.
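For orientation, a minimal sketch in standard notation (the model and definitions below are the commonly used forms and are assumed here, since the abstract does not state them explicitly): the continuous-time Hopfield network is typically written as
\[
\dot{x}_i(t) = -c_i x_i(t) + \sum_{j=1}^{n} a_{ij}\, g_j\big(x_j(t)\big) + I_i, \qquad i = 1,\dots,n,
\]
where the output response functions $g_j$ are required only to satisfy a Lipschitz condition
\[
|g_j(u) - g_j(v)| \le L_j |u - v| \quad \text{for all } u, v \in \mathbb{R},
\]
rather than to be continuously differentiable. Global exponential stability of an equilibrium $x^*$ then means there exist constants $M \ge 1$ and $\varepsilon > 0$ such that
\[
\|x(t) - x^*\| \le M\, e^{-\varepsilon t}\, \|x(0) - x^*\| \quad \text{for all } t \ge 0.
\]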
Source
Control Theory & Applications (《控制理论与应用》), 2006, No. 2, pp. 302-305 (4 pages)
Indexed in: EI, CAS, CSCD, Peking University Core Journals (北大核心)
Funding
Supported by the Key Fund of the University of Electronic Science and Technology of China
Supported by the Key Fund of the State Ethnic Affairs Commission of China (No. 20040816012)