
A GA Optimization Method for the Gaussian Radial Basis Kernel Function Parameter (Cited by: 9)

GA based parameter optimization of Gauss kernel function
Abstract: Kernel Fisher Discriminant Analysis (KFDA) usually adopts the Gaussian radial basis function as its kernel function for pattern classification, but the choice of the kernel parameter σ strongly affects classification performance. At present σ is selected only by experience; no automatic selection method exists. This paper proposes using a Genetic Algorithm (GA) to optimize σ automatically, making KFDA adaptive. When the Gaussian radial basis kernel determined by the GA-optimized σ is applied to KFDA, the class-separability measure of the classification is larger. Experiments on fault classification of motor rolling bearings show that the method outperforms other KFDA classification approaches.
Source: Electric Power Automation Equipment, 2008, No. 6, pp. 52-55 (4 pages). Indexed in EI, CSCD, and the Peking University Core Journals list.
Keywords: kernel Fisher discriminant analysis (KFDA); genetic algorithm (GA); kernel function; rolling bearing; fault classification
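The approach described in the abstract can be sketched in a few lines. The following is a minimal, illustrative sketch, not the paper's implementation: the data, the GA operators, and the separability fitness (a simple within-class minus between-class mean kernel similarity, standing in for the full KFDA criterion) are all assumptions for illustration.

```python
# Minimal sketch: GA search for the Gaussian RBF kernel width sigma that
# maximizes a simple kernel-space class-separability measure.
# Data, GA operators, and the fitness proxy are illustrative assumptions,
# not the paper's exact method.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic two-class data standing in for bearing-fault feature vectors.
X = np.vstack([rng.normal(0.0, 1.0, (30, 4)), rng.normal(3.0, 1.0, (30, 4))])
y = np.array([0] * 30 + [1] * 30)

def rbf_kernel(X, sigma):
    """Gaussian RBF kernel matrix K[i, j] = exp(-||x_i - x_j||^2 / (2 sigma^2))."""
    sq = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    return np.exp(-sq / (2.0 * sigma ** 2))

def separability(sigma):
    """Proxy for the KFDA separability measure: mean within-class kernel
    similarity minus mean between-class similarity (larger = better)."""
    K = rbf_kernel(X, sigma)
    same = y[:, None] == y[None, :]
    return K[same].mean() - K[~same].mean()

def ga_optimize(fitness, lo=0.01, hi=10.0, pop_size=20, generations=40):
    """Tiny real-coded GA over sigma: truncation selection of the best half,
    arithmetic crossover, Gaussian mutation, values clipped to [lo, hi]."""
    pop = rng.uniform(lo, hi, pop_size)
    for _ in range(generations):
        fit = np.array([fitness(s) for s in pop])
        parents = pop[np.argsort(fit)[::-1][: pop_size // 2]]  # keep best half
        children = []
        for _ in range(pop_size - len(parents)):
            a, b = rng.choice(parents, 2)
            child = 0.5 * (a + b) + rng.normal(0.0, 0.1)  # crossover + mutation
            children.append(np.clip(child, lo, hi))
        pop = np.concatenate([parents, children])
    fit = np.array([fitness(s) for s in pop])
    return pop[np.argmax(fit)]

best_sigma = ga_optimize(separability)
print(f"GA-selected sigma: {best_sigma:.3f}, "
      f"separability: {separability(best_sigma):.3f}")
```

In the paper, the fitness would instead be the KFDA separability measure computed from the between- and within-class scatter in the kernel feature space; the GA loop itself (selection, crossover, mutation over candidate σ values) is the same idea.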
