
Review of Feature Selection Methods Based on Kernel Statistical Independence Criteria
Abstract: The Hilbert-Schmidt independence criterion (HSIC) is a kernel-based independence measure with the advantages of simple computation, fast convergence, and low bias, and it is widely used in statistical analysis and machine learning. Feature selection is an effective dimensionality reduction technique that evaluates the importance of features and constructs an optimal feature subspace suited to the learning task. This paper systematically reviews HSIC-based feature selection methods, introducing their theoretical foundations, algorithmic models, and solution methods in detail, analyzing the advantages and shortcomings of HSIC-based feature selection, and outlining directions for future research.
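
As a rough illustration of the criterion described in the abstract (a minimal sketch, not code from the paper under review), the following Python snippet computes the biased empirical HSIC estimate HSIC_b = tr(KHLH)/(n-1)^2 of Gretton et al. (2005) and uses it as a filter-style score to rank features by their dependence on the target. The Gaussian kernel and the bandwidth parameter `sigma` are illustrative assumptions; practical methods tune the kernel and may use unbiased estimators or sparse optimization formulations instead.

```python
import numpy as np

def rbf_kernel(X, sigma=1.0):
    # Gaussian (RBF) kernel matrix: k(x, x') = exp(-||x - x'||^2 / (2*sigma^2))
    sq = np.sum(X ** 2, axis=1)
    d2 = np.maximum(sq[:, None] + sq[None, :] - 2.0 * X @ X.T, 0.0)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def hsic(K, L):
    # Biased empirical HSIC: tr(K H L H) / (n - 1)^2, with H the centering matrix
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n
    return np.trace(K @ H @ L @ H) / (n - 1) ** 2

def hsic_feature_ranking(X, y, sigma=1.0):
    # Filter-style selection: score each feature by its HSIC with the labels,
    # then return feature indices sorted from most to least dependent.
    L = rbf_kernel(y.reshape(-1, 1), sigma)
    scores = np.array([hsic(rbf_kernel(X[:, [j]], sigma), L)
                       for j in range(X.shape[1])])
    return np.argsort(scores)[::-1], scores
```

Scoring features one at a time keeps the estimator cheap (O(n^2) per feature) but ignores redundancy between features; the methods surveyed in the paper address this with, for example, greedy backward elimination in the style of BAHSIC or Lasso-type sparse formulations.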
Authors: HU Zhenwei; WANG Tinghua; ZHOU Huiying (School of Mathematics and Computer Science, Gannan Normal University, Ganzhou, Jiangxi 341000, China)
Source: Computer Engineering and Applications (《计算机工程与应用》), 2022, Issue 22, pp. 54-64 (11 pages). Indexed in CSCD and the Peking University Core Journal list.
Funding: National Natural Science Foundation of China (61966002); Science and Technology Research Project of the Jiangxi Provincial Department of Education (GJJ191659); Jiangxi Province Postgraduate Innovation Special Fund (YC2021-S726).
Keywords: feature selection; Hilbert-Schmidt independence criterion (HSIC); kernel method; machine learning
