
REVIEW ON SUPPORT VECTOR MACHINE BASED ON BAYES' THEOREM (基于贝叶斯理论的支持向量机综述)
Cited by: 8

Abstract  Support Vector Machines (SVMs) have attracted growing attention owing to their solid theoretical foundation and their good generalization performance in machine learning. To further advance SVMs, researchers have drawn heavily on classical Bayesian theory from statistics: for example, concepts such as prior knowledge and posterior probability have been introduced to improve the SVM decision criterion, and Bayesian methods have been used to estimate the weight vector w, the regularization parameter, and the kernel parameters of SVMs. These efforts have yielded promising results and have made SVM theory more valuable in practice. This paper surveys the work done in these areas.
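One of the ideas surveyed, attaching posterior class probabilities to an SVM's raw decision values, can be shown with a minimal sketch. The snippet below is not from the paper: it uses scikit-learn, whose SVC(probability=True) option fits a sigmoid to the decision values (Platt-style calibration) to approximate posterior probabilities; the synthetic dataset, the RBF kernel, and the values of C and gamma are arbitrary illustrative choices.

    from sklearn.datasets import make_classification
    from sklearn.model_selection import train_test_split
    from sklearn.svm import SVC

    # Synthetic two-class data (illustrative only).
    X, y = make_classification(n_samples=300, n_features=10, random_state=0)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    # RBF-kernel SVM; probability=True fits a sigmoid to the decision values
    # (Platt-style calibration), so predict_proba returns estimates of P(y | x).
    svm = SVC(kernel="rbf", C=1.0, gamma="scale", probability=True, random_state=0)
    svm.fit(X_train, y_train)

    raw_margin = svm.decision_function(X_test)   # uncalibrated SVM outputs f(x)
    posterior = svm.predict_proba(X_test)        # calibrated posterior estimates
    print(raw_margin[:3])
    print(posterior[:3])

The Bayesian evidence-framework approaches cited in the references go one step further: instead of fixing C and the kernel parameter by hand as above, they infer the regularization and kernel parameters from the training data by maximizing the marginal likelihood.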
Authors  苏展 (Su Zhan), 徐立霞 (Xu Lixia)
Source  Computer Applications and Software (《计算机应用与软件》, CSCD), 2010, No. 5: 179-181, 193 (4 pages)
Keywords  Support vector machine; Bayes' theorem; Prior probability; Posterior probability

References (11)

  • 1. Vapnik V N. Statistical Learning Theory [M]. Zhang Xuegong, trans. Beijing: Publishing House of Electronics Industry, 2004.
  • 2. Vapnik V N. The Nature of Statistical Learning Theory [M]. Zhang Xuegong, trans. Beijing: Tsinghua University Press, 2000.
  • 3. Simard P, Victorri B, LeCun Y, et al. Tangent Prop: a formalism for specifying selected invariances in an adaptive network [M]// Moody J E, Hanson S J, Lippmann R P, eds. Advances in Neural Information Processing Systems 4. San Mateo, CA, 1992: 895-903.
  • 4. Mika S, Rätsch G, Weston J, et al. Invariant feature extraction and classification in kernel spaces [M]// Solla S A, Leen T K, Müller K R, eds. Advances in Neural Information Processing Systems 12. Cambridge, MA: MIT Press, 2000: 526-532.
  • 5. Schölkopf B, Simard P, Smola A J, et al. Prior knowledge in support vector kernels [M]// Jordan M I, Kearns M J, Solla S A, eds. Advances in Neural Information Processing Systems 10. Cambridge, MA: MIT Press, 1998: 640-646.
  • 6. Girosi F, Chan N. Prior knowledge and the creation of virtual examples for RBF networks [C]// IEEE Workshop on Neural Networks for Signal Processing. Cambridge, MA, September 1995.
  • 7. Segman J, Rubinstein J, Zeevi Y Y. The canonical coordinates method for pattern deformation: theoretical and computational considerations [J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 1992, 14: 1171-1183.
  • 8. 吴高巍, 陶卿, 王珏. Support vector machines based on posterior probability [J]. 计算机研究与发展 (Journal of Computer Research and Development), 2005, 42(2): 196-202.
  • 9. Sollich P. Bayesian methods for support vector machines: evidence and predictive class probabilities [J]. Machine Learning, 2002, 46(1): 21-52.
  • 10. Kwok J T. The evidence framework applied to support vector machines [J]. IEEE Transactions on Neural Networks, 2000, 11(5): 1162-1173.
