Algorithm and model selection for the MKLasso model
Abstract: Building on the KLasso model, a more general nonlinear multi-kernel Lasso model (the MKLasso model) was developed by introducing multiple kernel functions and kernel parameters. An algorithm based on the gradient-boosting perspective was designed to solve it; because this algorithm can select kernel parameters directly, it reduces tuning complexity and running time. To further improve performance, a model selection strategy was proposed based on a basic property of human vision: an observer positioned close to the data space can see clear local detail, while one positioned far away can see only the clear global structure. Six experiments on three real data sets were designed to verify the effectiveness of the algorithm. Simulation results show that the prediction ability of the MKLasso model is clearly superior to that of the KLasso model, improving the prediction mean square error by a factor of 10. The algorithm runs efficiently, is robust to noise, and has the particular advantage of selecting kernel parameters directly, which greatly reduces testing and running time. 5 tabs, 2 figs, 12 refs.
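The abstract describes solving a multi-kernel Lasso by a gradient-boosting-style algorithm that selects kernel parameters directly. Below is a minimal illustrative sketch of that general idea, componentwise L2-boosting over a dictionary formed from several kernel matrices, in the spirit of the Bühlmann-Yu boosting reference. The function names, the choice of Gaussian kernels, the candidate `gammas` grid, the step size `nu`, and the toy sine data are all assumptions for illustration, not the paper's exact algorithm.

```python
import numpy as np

def gaussian_kernel(X, Z, gamma):
    # Gaussian (RBF) kernel matrix between the rows of X and Z
    d = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d)

def boosted_multikernel(X, y, gammas, n_steps=200, nu=0.1):
    """Componentwise L2-boosting over a dictionary of kernel columns.

    One kernel matrix is built per candidate parameter gamma; each column
    is a basis function. At every step the column most correlated with the
    current residual receives a small shrunken update, which simultaneously
    picks kernel parameters and keeps the coefficient vector sparse
    (an approximation of the L1-penalized fit).
    """
    # Stack all kernel matrices column-wise: shape (n, n * len(gammas))
    K = np.hstack([gaussian_kernel(X, X, g) for g in gammas])
    coef = np.zeros(K.shape[1])
    resid = y.astype(float).copy()
    col_norms = (K ** 2).sum(axis=0)
    for _ in range(n_steps):
        scores = K.T @ resid
        # Greedy selection: column with the largest normalized correlation
        j = np.argmax(np.abs(scores) / np.sqrt(col_norms))
        step = scores[j] / col_norms[j]       # least-squares fit of column j
        coef[j] += nu * step                  # shrunken update (rate nu)
        resid -= nu * step * K[:, j]
    return coef, K

# Toy usage: fit a noisy sine curve with three candidate kernel widths
rng = np.random.default_rng(0)
X = np.linspace(0, 2 * np.pi, 60).reshape(-1, 1)
y = np.sin(X).ravel() + 0.1 * rng.standard_normal(60)
coef, K = boosted_multikernel(X, y, gammas=[0.5, 2.0, 8.0])
mse = ((K @ coef - y) ** 2).mean()
```

Because the greedy step only ever touches one column at a time, most entries of `coef` stay exactly zero, and which kernel matrices contribute nonzero columns indicates which kernel parameters were selected.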
Source: Journal of Chang'an University (Natural Science Edition), EI / CAS / CSCD / Peking University core journal, 2012, No. 4, pp. 105-110 (6 pages).
Funding: National Natural Science Foundation of China (60975036); Special Fund for Basic Scientific Research of Central Colleges (CHD2011JC118).
Keywords: Lasso; visual principle; L1 norm regularization; kernel method

References (12)

  • 1 Vapnik V N. The nature of statistical learning theory[M]. New York: Springer-Verlag, 2000.
  • 2 Bühlmann P, Yu B. Boosting with the L2 loss[J]. Journal of the American Statistical Association, 2003, 98(462): 324-339.
  • 3 Tibshirani R. Regression shrinkage and selection via the Lasso[J]. Journal of the Royal Statistical Society: Series B (Methodological), 1996, 58(1): 267-288.
  • 4 Efron B, Hastie T, Johnstone I, et al. Least angle regression[J]. The Annals of Statistics, 2004, 32(2): 407-499.
  • 5 Rosset S, Zhu J. Piecewise linear regularized solution paths[J]. The Annals of Statistics, 2007, 35(3): 1012-1030.
  • 6 Schölkopf B, Smola A J. Learning with kernels: support vector machines, regularization, optimization, and beyond[M]. Cambridge: The MIT Press, 2002.
  • 7 Muller K R, Mike S, Ratsch G. An introduction to kernel-based learning algorithms[J]. IEEE Transactions on Neural Networks, 2001, 12(2): 181-201.
  • 9 Wang H Q, Sun F C, Cai Y N, Chen N, Ding L G. Multiple kernel learning methods[J]. Acta Automatica Sinica, 2010, 36(8): 1037-1050.
  • 10 Roth V. The generalized lasso[J]. IEEE Transactions on Neural Networks, 2004, 15(1): 16-28.
  • 11 Gao J, Kwan P W, Shi D. Sparse kernel learning with Lasso and Bayesian inference algorithm[J]. Neural Networks, 2010, 23(2): 257-264.
