Abstract
Principal Component Analysis (PCA) and Minor Component Analysis (MCA) are fundamental signal-processing techniques used for feature extraction, data compression, frequency estimation, curve fitting, and related tasks. Implementing PCA and MCA with neural networks has become a major research topic, and performing PCA or MCA when the eigenvalues of the correlation matrix R of the input data have multiplicity greater than one is a particularly difficult case. This paper proposes a new learning algorithm under which the weight vectors converge to orthonormal eigenvectors even when the correlation matrix of the input data has repeated eigenvalues.
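The record does not reproduce the proposed algorithm itself. As background only, the minimal Python sketch below implements Sanger's Generalized Hebbian Algorithm (GHA), a standard neural-network PCA learning rule that drives the weight vectors toward orthonormal eigenvectors of the input correlation matrix; its classical convergence analysis assumes distinct eigenvalues, which is exactly the restriction the paper's algorithm is said to remove. All names and parameters here (gha, lr, epochs, the toy data) are illustrative assumptions, not taken from the paper.

# Minimal sketch of Sanger's Generalized Hebbian Algorithm (GHA) for neural-network PCA.
# NOTE: an illustrative, standard rule -- NOT the algorithm proposed in the paper.
import numpy as np

def gha(X, n_components, lr=0.01, epochs=50, seed=0):
    """Estimate leading eigenvector directions of the correlation matrix of X (samples x dims)."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    W = rng.normal(scale=0.1, size=(n_components, d))      # rows are the weight vectors
    for _ in range(epochs):
        for x in X:
            y = W @ x                                       # linear network outputs
            # Sanger's rule: dW = lr * (y x^T - lower_triangular(y y^T) W)
            W += lr * (np.outer(y, x) - np.tril(np.outer(y, y)) @ W)
    return W

# Usage: toy data whose correlation matrix has a repeated leading eigenvalue
rng = np.random.default_rng(1)
X = rng.normal(size=(2000, 4)) * np.array([3.0, 3.0, 1.0, 0.5])   # first two variances equal
W = gha(X, n_components=2)
print(W @ W.T)   # should be close to the identity: the weight vectors are (near) orthonormal

In this degenerate toy example the two weight vectors still end up (near) orthonormal within the leading eigenspace, but which basis of that eigenspace is found is not unique; handling this repeated-eigenvalue case rigorously is the problem the paper addresses.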
Source
《电子学报》
EI
CAS
CSCD
Peking University Core Journals (北大核心)
1996, No. 4, pp. 12-16 (5 pages)
Acta Electronica Sinica
Funding
Climbing Program (攀登计划)
National Natural Science Foundation of China
Keywords
Neural networks
Principal component analysis (PCA)
Minor component analysis (MCA)
Learning algorithm
Eigenvectors