SVM generalization error estimation for facial feature selection
Cited by: 3
Abstract: To investigate the validity of Support Vector Machine (SVM) generalization error bounds as feature selection criteria in facial feature selection, a new framework for facial feature selection and recognition combining the Filter and Wrapper models was constructed. Wavelet Transform (WT) and Kernel Principal Component Analysis (KPCA) served as the Filter model, while minimization of the SVM's VC-dimension leave-one-out (LOO) error bound and of the support vector span error bound served as the feature selection criteria of the Wrapper model. Facial feature selection and recognition experiments were then carried out on the UMIST face database with Recursive Feature Elimination (RFE) as the search strategy. The results show that with the VC-dimension LOO error bound and the span error bound as criteria, the feature dimension can be reduced to 80 and 70, respectively, while the classification accuracy remains above 94%, indicating that the proposed feature selection criteria and search strategy are an effective approach to facial feature selection.
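The Filter + Wrapper pipeline described in the abstract can be illustrated with a minimal Python sketch, shown below, assuming the PyWavelets and scikit-learn libraries. The paper's own implementation is not reproduced here: the standard weight-based SVM-RFE stands in for the bound-driven criterion (a sketch of which follows the keywords), and the wavelet, kernel, and component/feature counts are illustrative assumptions rather than the authors' settings.

```python
# Minimal sketch of the Filter (WT + KPCA) and Wrapper (SVM + RFE) stages.
# Assumptions: PyWavelets and scikit-learn; weight-based RFE stands in for the
# paper's bound-driven criterion; all parameter values are illustrative.
import numpy as np
import pywt
from sklearn.decomposition import KernelPCA
from sklearn.feature_selection import RFE
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC


def filter_stage(images, n_components=100):
    """Filter model: keep the low-frequency sub-band of a single-level 2-D
    wavelet transform of each face image, then reduce dimension with KPCA."""
    low_freq = np.array([pywt.dwt2(img, "haar")[0].ravel() for img in images])
    kpca = KernelPCA(n_components=n_components, kernel="rbf", gamma=1e-3)
    return kpca.fit_transform(low_freq)


def wrapper_stage(features, labels, n_selected=80):
    """Wrapper model: recursive feature elimination driven by a linear SVM.
    (Here features are ranked by SVM weight magnitude; the paper ranks them
    by the change in an SVM generalization-error bound instead.)"""
    rfe = RFE(SVC(kernel="linear", C=1.0), n_features_to_select=n_selected, step=1)
    rfe.fit(features, labels)
    return rfe.support_                      # boolean mask of kept features


if __name__ == "__main__":
    # Random stand-in data shaped like 112 x 92 face images, 20 subjects with
    # 6 images each; real UMIST data would replace these arrays.
    rng = np.random.default_rng(0)
    images = rng.random((120, 112, 92))
    labels = np.repeat(np.arange(20), 6)

    feats = filter_stage(images, n_components=100)
    mask = wrapper_stage(feats, labels, n_selected=80)
    acc = cross_val_score(SVC(kernel="linear"), feats[:, mask], labels, cv=4).mean()
    print(f"kept {mask.sum()} features, CV accuracy {acc:.3f}")
```

With real UMIST images in place of the random stand-in arrays, this reproduces the overall flow of the framework: wavelet low-frequency sub-band, KPCA projection, then recursive elimination down to roughly the 80 features reported for the VC-dimension LOO criterion.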
Source: Optics and Precision Engineering (《光学精密工程》), 2008, Issue 8, pp. 1452-1458 (7 pages). Indexed by EI, CAS, CSCD; Peking University Core Journal list.
Funding: National Defense Basic Scientific Research Program of the 11th Five-Year Plan (No. C10020060355); National High-Tech R&D Program of China (863 Program) (No. 2007AA01Z423); Chongqing Science and Technology Research Project (No. CSTC2007AC2018).
Keywords: Support Vector Machine (SVM) generalization error bound; facial feature selection; Filter model; Wrapper model; Recursive Feature Elimination (RFE)
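As for the Wrapper criterion itself, the sketch below illustrates one common SVM generalization error bound, a radius-margin (VC-type) leave-one-out bound proportional to R²·||w||², used to score a candidate feature removal. It is written for a binary problem, uses a cheap centroid-based estimate of R², and its function names and parameters are assumptions for illustration, not the paper's exact formulation; the support vector span bound would replace this quantity with a span computation.

```python
# Hypothetical sketch of a radius-margin (VC-type) LOO bound used as an
# RFE ranking criterion; an illustration, not the paper's exact derivation.
import numpy as np
from sklearn.metrics.pairwise import rbf_kernel
from sklearn.svm import SVC


def radius_margin_bound(X, y, gamma=1e-3, C=10.0):
    """Estimate R^2 * ||w||^2 / n, which (up to a constant factor) upper-bounds
    the leave-one-out error of a roughly separable SVM."""
    svm = SVC(kernel="rbf", gamma=gamma, C=C).fit(X, y)
    K = rbf_kernel(X, gamma=gamma)

    # ||w||^2 = sum_ij alpha_i alpha_j y_i y_j K(x_i, x_j);
    # sklearn stores y_i * alpha_i of the support vectors in dual_coef_.
    sv = svm.support_
    a = svm.dual_coef_[0]                      # binary problem assumed
    w_norm_sq = a @ K[np.ix_(sv, sv)] @ a

    # Cheap centroid-based estimate of the squared radius R^2 of the data in
    # feature space (the exact radius requires solving a small quadratic program).
    r_sq = np.max(np.diag(K) - 2.0 * K.mean(axis=1) + K.mean())

    return r_sq * w_norm_sq / len(y)


def elimination_score(X, y, j, **kw):
    """Bound value after removing feature j: in a bound-driven RFE step,
    the feature giving the smallest value is eliminated first."""
    return radius_margin_bound(np.delete(X, j, axis=1), y, **kw)
```

In a bound-driven RFE, `elimination_score` would be evaluated for every remaining feature at each step, and the feature whose removal yields the smallest bound value would be discarded first.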