
Variable selection in discriminant analysis based on Gram-Schmidt process (cited 3 times)
Abstract: A linear discriminant analysis modeling method based on the Gram-Schmidt process is introduced. Using the Gram-Schmidt process, the method selects from the set of independent variables the information that best explains the classification, removes variables with no significant explanatory power together with redundant (repeated) information, and transforms the selected explanatory variables into a set of orthogonal vectors. The method therefore accomplishes variable selection for the discriminant analysis model and, at the same time, enables effective modeling when the independent variables are multicollinear. An F statistic is used to test the discriminating power of each variable as it enters the model, which makes the procedure easy for practitioners to interpret. To demonstrate the soundness and effectiveness of the proposed algorithm, Fisher discriminant analysis was taken as the modeling example, and a simulation study produced reasonable and accurate results.
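The record itself contains no code, but the procedure the abstract describes can be illustrated with a short sketch. The Python snippet below is a minimal illustration, not the authors' exact algorithm: it performs greedy forward selection, orthogonalizes candidate variables with the Gram-Schmidt process, uses a one-way ANOVA F test (scipy.stats.f_oneway) as a stand-in for the paper's F-statistic entry criterion, and fits Fisher LDA (scikit-learn's LinearDiscriminantAnalysis) on the resulting orthogonal scores. The simulated data from make_classification and the significance level alpha = 0.05 are assumptions made only for the demo.

```python
# Illustrative sketch (not the paper's exact algorithm): forward variable
# selection for Fisher discriminant analysis via Gram-Schmidt orthogonalization,
# with a one-way ANOVA F test as the entry criterion.
import numpy as np
from scipy import stats
from sklearn.datasets import make_classification
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis


def anova_f(x, y):
    """One-way ANOVA F test of a single (residual) variable x against class labels y."""
    groups = [x[y == g] for g in np.unique(y)]
    return stats.f_oneway(*groups)  # result has .statistic and .pvalue


def gram_schmidt_select(X, y, alpha=0.05):
    """Greedy forward selection with Gram-Schmidt orthogonalization.

    At each step the remaining columns are orthogonalized against the variables
    already chosen; the column whose residual best separates the classes enters
    the model if its F test is significant at level alpha, otherwise we stop."""
    n, p = X.shape
    residual = X - X.mean(axis=0)          # centered copy, updated in place
    remaining = list(range(p))
    selected, basis = [], []
    while True:
        # drop columns whose residual is numerically zero (redundant information)
        remaining = [j for j in remaining if np.linalg.norm(residual[:, j]) > 1e-8]
        if not remaining:
            break
        scores = [anova_f(residual[:, j], y) for j in remaining]
        best = int(np.argmax([s.statistic for s in scores]))
        if scores[best].pvalue > alpha:    # no significant discriminating power left
            break
        j = remaining.pop(best)
        q = residual[:, j] / np.linalg.norm(residual[:, j])
        selected.append(j)
        basis.append(q)
        # Gram-Schmidt step: remove the new direction from every remaining column
        residual -= np.outer(q, q @ residual)
    return selected, np.column_stack(basis) if basis else np.empty((n, 0))


if __name__ == "__main__":
    # simulated data with informative, redundant, and pure-noise variables
    X, y = make_classification(n_samples=300, n_features=10, n_informative=3,
                               n_redundant=3, n_classes=2, random_state=0)
    idx, Z = gram_schmidt_select(X, y)
    print("selected variable indices:", idx)
    lda = LinearDiscriminantAnalysis().fit(Z, y)   # Fisher LDA on the orthogonal scores
    print("training accuracy:", lda.score(Z, y))
```

In this sketch the redundant columns drop out automatically because their residuals vanish once the informative variables have been orthogonalized away, which mirrors the abstract's point that repeated information is removed along with insignificant variables.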
Source: Journal of Beijing University of Aeronautics and Astronautics (《北京航空航天大学学报》), 2011, No. 8, pp. 958-961 (4 pages). Indexed in EI, CAS, CSCD; Peking University Core Journal.
Funding: National Natural Science Foundation of China (70771004, 71031001, 70821061)
Keywords: Gram-Schmidt orthogonal transformation; discriminant analysis; variable selection; multiple correlation (multicollinearity)
Related literature

References (10)

  • 1 Chen S, Billings S A, Luo W. Orthogonal least squares methods and their application to non-linear system identification [J]. International Journal of Control, 1989, 50(5): 1873-1896.
  • 2 Chen S, Cowan C F N, Grant P M. Orthogonal least squares learning algorithm for radial basis function networks [J]. IEEE Transactions on Neural Networks, 1991, 2(2): 302-309.
  • 3 Urbani D, Roussel-Ragot P, Personnaz L, et al. The selection of neural models of nonlinear dynamical systems by statistical tests [C]//Vlontzos J, Hwang J, Wilson E. Neural Networks for Signal Processing IV. Piscataway, NJ: IEEE, 1994: 229-237.
  • 4 Oussar Y, Dreyfus G. Initialization by selection for wavelet network training [J]. Neurocomputing, 2000(34): 131-143.
  • 5 Vincent P, Bengio Y. Kernel matching pursuit [J]. Machine Learning, 2001(48): 165-187.
  • 6 Stoppiglia H, Dreyfus G, Dubois R, et al. Ranking a random feature for variable and feature selection [J]. The Journal of Machine Learning Research, 2003(3): 1399-1414.
  • 7 Zheng Wenming, Zou Cairong, Zhao Li. Real-time face recognition using Gram-Schmidt orthogonalization for LDA [C]//Proceedings - International Conference on Pattern Recognition. Piscataway, NJ: IEEE, 2004: 403-406.
  • 8 He Yunhui. Modified generalized discriminant analysis using kernel Gram-Schmidt orthogonalization in difference space for face recognition [C]//Proceedings of the 2009 2nd International Workshop on Knowledge Discovery and Data Mining. Piscataway, NJ: IEEE, 2009: 36-39.
  • 9 Wang Huiwen, Chen Meiling, Saporta G. Gram-Schmidt regression and its application in tool wear prediction [J]. Journal of Beijing University of Aeronautics and Astronautics, 2008, 34(6): 729-733. (cited 14 times)
  • 10 Johnson R A, Wichern D W. Applied multivariate statistical analysis [M]. 6th ed. Beijing: Tsinghua University Press, 2008.

Secondary references (7)

  • 1 Hoerl A E. Application of ridge analysis to regression problems [J]. Chemical Engineering Progress, 1962, 58: 54-59.
  • 2 Neter J, Wasserman W, Kutner M H. Applied linear regression models [M]. New York: Richard D Irwin Inc, 1983.
  • 3 Wold S, Martens H, Wold H. The multivariate calibration problem in chemistry solved by the PLS method [C]//Ruhe A, Kagstrom B. Proc Conf Matrix Pencils, Lecture Notes in Mathematics. Heidelberg: Springer-Verlag, 1983.
  • 4 Tenenhaus M. La regression PLS: theorie et pratique [M]. Paris: Editions Technip, 1998.
  • 5 Jain S K, Gunawardena A D. Linear algebra: an interactive approach [M]. Beijing: China Machine Press, 2003.
  • 6 Lazraq A, Cleroux R, Gauchi J P. Selecting both latent and explanatory variables in PLS1 regression model [J]. Chemometrics and Intelligent Laboratory Systems, 2003, 66: 117-126.
  • 7 Liu Qiang, Yin Li. A simplified recursive partial least squares modeling algorithm and its application [J]. Journal of Beijing University of Aeronautics and Astronautics, 2003, 29(7): 640-643. (cited 7 times)

Co-cited documents (共引文献): 13

Documents cited by the same papers (同被引文献): 13

Citing documents (引证文献): 3

Secondary citing documents (二级引证文献): 14
