Abstract
Bidirectional associative memory (BAM) neural networks are usually studied under the assumption that every output response function is smooth and monotonically increasing, yet most functions encountered in practical applications are non-smooth. This paper therefore weakens the continuous-differentiability assumption on the output response functions of BAM neural networks to a Lipschitz condition. By introducing a Lyapunov functional and applying inequality techniques, a theorem on the global exponential stability of BAM neural networks is proved.
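To make the setting concrete, the sketch below simulates a standard BAM system with a non-smooth but Lipschitz activation (piecewise-linear saturation, Lipschitz constant 1) and checks that trajectories from two different initial states approach the same equilibrium, which is the behavior global exponential stability guarantees. This is an illustrative toy example, not the paper's exact system or proof: the model form, weights, and decay rates below are assumptions chosen so that the stability condition plausibly holds.

```python
import numpy as np

# Illustrative BAM model (an assumed standard form, not the paper's exact one):
#   x'(t) = -a x + W f(y) + I
#   y'(t) = -b y + V f(x) + J
# with a non-smooth, Lipschitz activation f, integrated by forward Euler.

def sat(u):
    """Piecewise-linear saturation: non-smooth at |u| = 1, Lipschitz with L = 1."""
    return np.clip(u, -1.0, 1.0)

def simulate_bam(x0, y0, W, V, I, J, a=2.0, b=2.0, dt=0.01, steps=5000):
    """Forward-Euler integration of the BAM dynamics from (x0, y0)."""
    x, y = np.array(x0, dtype=float), np.array(y0, dtype=float)
    for _ in range(steps):
        x = x + dt * (-a * x + W @ sat(y) + I)
        y = y + dt * (-b * y + V @ sat(x) + J)
    return x, y

rng = np.random.default_rng(0)
# Small interconnection weights relative to the decay rates a, b,
# so a contraction-type stability condition plausibly holds.
W = 0.3 * rng.standard_normal((3, 3))
V = 0.3 * rng.standard_normal((3, 3))
I = np.array([0.1, -0.2, 0.05])
J = np.array([0.0, 0.1, -0.1])

# Two far-apart initial states converge to the same equilibrium,
# consistent with global exponential stability.
x1, y1 = simulate_bam([5, -5, 3], [-4, 4, 2], W, V, I, J)
x2, y2 = simulate_bam([-3, 2, -1], [1, -2, 0], W, V, I, J)
print(np.max(np.abs(x1 - x2)), np.max(np.abs(y1 - y2)))
```

Note that `sat` is continuous and Lipschitz but not differentiable at ±1, so it falls outside the classical smoothness assumption while still satisfying the weakened Lipschitz hypothesis.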
Source
Journal of Chongqing Normal University (Natural Science Edition)
CAS
2007, No. 2, pp. 39-42 (4 pages)
Funding
Natural Science Research Fund of the Sichuan Provincial Education Department (No. 2006c056)
Keywords
neural networks
bidirectional associative memory neural networks
global exponential stability
global exponential convergence
Lipschitz condition