
A Learning Approach to Improving the Diversity of Neural Network Ensembles (cited by 9)

An Approach to Improving Diversity of Neural Network Ensemble
Abstract: Ensemble learning has become one of the research directions of machine learning; it can significantly improve the generalization performance of classifiers. This paper analyzes the Bagging and AdaBoost ensemble methods and points out their shortcomings. It then proposes a new neural-network-based classifier ensemble method, DBNNE, which increases ensemble diversity by generating differentiated data. In addition, whenever a new classifier is generated, a test step is applied to ensure the accuracy of the ensemble. Finally, experiments on ten standard data sets show that DBNNE outperforms Bagging and AdaBoost on small data sets and remains comparable to both on larger data sets.
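The paper's full DBNNE algorithm is not reproduced in this record, but the two ideas the abstract names, namely training each member on deliberately differentiated (here, bootstrap-resampled) data and accepting a new member only if validation accuracy does not drop, can be sketched as follows. Decision stumps stand in for the neural networks, and all names and thresholds below are illustrative assumptions, not the authors' implementation.

```python
import random

def make_data(n=200, seed=0):
    # Toy 1-D task: label is 1 when x > 0.5, with 10% label noise.
    rng = random.Random(seed)
    data = []
    for _ in range(n):
        x = rng.random()
        y = 1 if x > 0.5 else 0
        if rng.random() < 0.1:
            y = 1 - y
        data.append((x, y))
    return data

def train_stump(sample):
    # Pick the threshold that best separates the bootstrap sample.
    best_t, best_acc = 0.0, -1.0
    for t in [i / 20 for i in range(21)]:
        acc = sum((x > t) == (y == 1) for x, y in sample) / len(sample)
        if acc > best_acc:
            best_t, best_acc = t, acc
    return best_t

def ensemble_predict(thresholds, x):
    # Majority vote over all member stumps.
    votes = sum(x > t for t in thresholds)
    return 1 if votes * 2 > len(thresholds) else 0

def ensemble_accuracy(thresholds, data):
    return sum(ensemble_predict(thresholds, x) == y for x, y in data) / len(data)

def build_ensemble(train, valid, max_members=15, seed=1):
    # Grow the ensemble one member at a time; each member sees a different
    # bootstrap sample (diverse data), and a candidate is kept only if the
    # validation accuracy of the enlarged ensemble does not decrease.
    rng = random.Random(seed)
    members, best_acc = [], 0.0
    for _ in range(max_members):
        sample = [rng.choice(train) for _ in train]
        candidate = members + [train_stump(sample)]
        acc = ensemble_accuracy(candidate, valid)
        if acc >= best_acc:
            members, best_acc = candidate, acc
    return members, best_acc

data = make_data()
train, valid = data[:150], data[150:]
members, acc = build_ensemble(train, valid)
```

The accuracy gate mirrors the abstract's "test step after generating a classifier": a member that hurts the ensemble on held-out data is simply discarded rather than added.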
Authors: 李凯 (Li Kai), 黄厚宽 (Huang Houkuan)
Source: Acta Electronica Sinica (《电子学报》), 2005, Issue 8, pp. 1387-1390 (4 pages). Indexed in EI, CAS, CSCD, and the Peking University Core journals list.
Funding: National Key Technologies R&D Program of the Tenth Five-Year Plan (No. 2002BA407B); National Natural Science Foundation of China (No. 60443003)
Keywords: neural network, ensemble, small data sets, diversity, generalization


Co-cited references: 91. Citing articles: 9. Secondary citing articles: 52.
