
基于多尺度重采样思想的类指数核函数构造 (cited by: 4)

Design of An Exponential-like Kernel Function Based on Multi-scale Resampling
Abstract: Following the idea of multi-scale resampling, an Exponential-Like Kernel (ELK) function is constructed and applied to kernel regression analysis and Support Vector Machine (SVM) classification, where it proves advantageous at capturing local features. The ELK is a one-parameter kernel whose distribution is determined solely by the scale of analysis. In Nadaraya-Watson regression on noisy step (block) and Doppler signals, the ELK outperforms the conventional Gaussian kernel in both noise reduction and step capture, and its overall performance is close to, or better than, that of LOcally WEighted Scatterplot Smoothing (LOWESS). SVM experiments on several data sets from the UCI Machine Learning Repository show that the ELK matches the Radial Basis Function (RBF) kernel in classification accuracy; its stronger locality yields more finely detailed separating hyperplanes, although it may produce more support vectors when classification accuracy is poor. The ELK is also less sensitive to its tuning parameter, a property that helps reduce the computational cost of parameter optimization. As a single-parameter kernel with good local-feature capture, the ELK is a promising candidate for wider use in kernel methods.
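The closed form of the ELK is not reproduced in this abstract, so only the standard framework it plugs into can be stated here. The Nadaraya-Watson estimator, for a kernel K and analysis scale (bandwidth) h applied to noisy samples (x_i, y_i), is

\hat{m}(x) = \frac{\sum_{i=1}^{n} K\!\left(\frac{x - x_i}{h}\right) y_i}{\sum_{j=1}^{n} K\!\left(\frac{x - x_j}{h}\right)}

For a single-parameter kernel such as the ELK, h is the only quantity to tune, which is the basis of the parameter-sensitivity comparison against the Gaussian kernel.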
Source: Journal of Electronics & Information Technology (《电子与信息学报》), 2016, No. 7, pp. 1689-1695 (7 pages). Indexed in EI, CSCD; Peking University Core Journal.
Funding: National Natural Science Foundation of China (Grant No. 11472158)
Keywords: Multi-scale resampling; Nadaraya-Watson regression; Support Vector Machine (SVM); Exponential-Like Kernel (ELK) function
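To make the evaluation pipeline concrete, the sketch below plugs a single-parameter exponential-type kernel into Nadaraya-Watson regression and into scikit-learn's SVC via a custom Gram-matrix callable. The kernel exp(-|u|/h) used here is a Laplacian-type stand-in, not the paper's ELK, whose multi-scale resampling construction is not given in the abstract; the names elk_like, nadaraya_watson, and gram are illustrative.

import numpy as np
from sklearn.svm import SVC

def elk_like(u, h=1.0):
    # Stand-in exponential-type kernel exp(-|u|/h). The paper's actual ELK
    # is built by multi-scale resampling and is NOT reproduced here.
    return np.exp(-np.abs(u) / h)

def nadaraya_watson(x_train, y_train, x_query, h=1.0):
    # Kernel-weighted local average of the training responses.
    w = elk_like(x_query[:, None] - x_train[None, :], h)  # (m, n) weights
    return (w @ y_train) / w.sum(axis=1)

def gram(X, Y, h=1.0):
    # Gram matrix for SVC(kernel=callable): entry (i, j) = k(X_i, Y_j).
    d = np.linalg.norm(X[:, None, :] - Y[None, :, :], axis=2)
    return np.exp(-d / h)

rng = np.random.default_rng(0)

# Regression test in the spirit of the paper: denoise a noisy step signal.
x = np.sort(rng.uniform(0.0, 1.0, 200))
y = (x > 0.5).astype(float) + 0.1 * rng.standard_normal(200)
y_hat = nadaraya_watson(x, y, x, h=0.02)

# SVM test with the custom kernel on synthetic 2-D data.
X = rng.standard_normal((100, 2))
labels = (X[:, 0] + X[:, 1] > 0).astype(int)
clf = SVC(kernel=lambda A, B: gram(A, B, h=1.0)).fit(X, labels)

The custom-kernel route mirrors the paper's comparison setup: swapping gram for an RBF Gram matrix changes only the kernel definition, which is what makes single-parameter kernels convenient to benchmark against each other.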
