
Semi-supervised k-nearest neighbor classification method (半监督k近邻分类方法)

Cited by: 6
Abstract: The weighted KNN (k-nearest neighbor) method uses only the class information provided by the k nearest labeled training samples and ignores the contribution of the test samples themselves, which often leads to misclassifications. To address this shortcoming, a semi-supervised KNN classification method is proposed. The method classifies both sequential and non-sequential samples well. When making the classification decision it also takes into account the contribution of the c nearest test samples, which improves classification accuracy. Compared with weighted KNN, recognition accuracy improves by 5.95% for sequential images on the Cohn-Kanade face database and by 7.98% for non-sequential images on the CMU-AMP face database. Experimental results show that the method runs efficiently and achieves good classification performance.
Source: Journal of Image and Graphics (《中国图象图形学报》; CSCD; Peking University Core Journal), 2013, No. 2: 195-200 (6 pages).
Funding: National Natural Science Foundation of China (60776834); Outstanding Youth Project of the Hunan Provincial Department of Education (10B074).
Keywords: weighted KNN (weighted k-nearest neighbor); Bayesian theory; semi-supervised KNN; manifold
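The abstract describes the decision rule only at a high level: a distance-weighted KNN vote over the k nearest labeled training samples, augmented by the contribution of the c nearest samples drawn from the test set itself. The Python sketch below is a hypothetical illustration of that idea and not the paper's actual rule (which is grounded in Bayesian theory); every function, variable, and parameter name here is an assumption of this illustration, not the authors' implementation.

```python
import numpy as np

def semi_supervised_knn_predict(X_train, y_train, X_test, k=5, c=5):
    """Hypothetical sketch of a semi-supervised KNN decision rule.

    Assumes X_train, y_train, X_test are NumPy arrays. Step 1 assigns every
    test sample a tentative label with ordinary distance-weighted KNN over
    the labeled training set. Step 2 makes the final decision by combining
    the weighted votes of the k nearest training samples with the tentative
    labels of the c nearest test-set neighbors. Illustration only.
    """
    classes = np.unique(y_train)

    def weighted_votes(dists, labels):
        # Inverse-distance weights; small epsilon avoids division by zero.
        w = 1.0 / (dists + 1e-8)
        return np.array([w[labels == cls].sum() for cls in classes])

    # Pairwise distances: test-to-train and test-to-test.
    d_tr = np.linalg.norm(X_test[:, None, :] - X_train[None, :, :], axis=2)
    d_te = np.linalg.norm(X_test[:, None, :] - X_test[None, :, :], axis=2)
    np.fill_diagonal(d_te, np.inf)  # a test sample is not its own neighbor

    # Step 1: tentative labels via weighted KNN on the training set alone.
    tentative = np.empty(len(X_test), dtype=y_train.dtype)
    for i in range(len(X_test)):
        idx = np.argsort(d_tr[i])[:k]
        tentative[i] = classes[np.argmax(weighted_votes(d_tr[i, idx], y_train[idx]))]

    # Step 2: final labels also count the c nearest test-set neighbors.
    y_pred = np.empty_like(tentative)
    for i in range(len(X_test)):
        idx_tr = np.argsort(d_tr[i])[:k]
        votes = weighted_votes(d_tr[i, idx_tr], y_train[idx_tr])
        idx_te = np.argsort(d_te[i])[:c]
        votes += weighted_votes(d_te[i, idx_te], tentative[idx_te])
        y_pred[i] = classes[np.argmax(votes)]
    return y_pred
```

In this reading, step 1 acts like a self-training pass that gives each test sample a tentative label, and step 2 lets nearby test samples reinforce each other, which is one plausible source of the reported gain over plain weighted KNN.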
Related literature

References (19)

  • 1. Yang Y M, Liu X. A re-examination of text categorization methods [A]. New York, USA: Association for Computing Machinery, 1999: 42-49.
  • 2. Li B L, Chen Y Z, Yu S W. A comparative study on automatic categorization methods for Chinese search engine [A]. Hangzhou: Zhejiang University Press, 2002: 117-120.
  • 3. Cover T M, Hart P E. Nearest neighbor pattern classification [J]. IEEE Transactions on Information Theory, 1967, 13(1): 21-27.
  • 4. Duda R O, Hart P E, Stork D G. Pattern Classification [M]. Li Hongdong, Yao Tianxiang, trans. Beijing: China Machine Press, 2006: 150-151.
  • 5. Gora G, Wojna A. A classifier combining rule induction and KNN method with automated selection of optimal neighbourhood [A]. London: Springer-Verlag, 2002: 111-123.
  • 6. Hechenbichler K, Schliep K. Weighted k-nearest-neighbor techniques and ordinal classification [R]. Discussion Paper 399. Munich, Germany: Ludwig-Maximilians University Munich, 2004.
  • 7. Wang B, Zeng Yong, Yang Yupu. Generalized nearest neighbor rule for pattern classification [A]. Washington DC, USA: Institute of Electrical and Electronics Engineers, 2008: 8465-8470.
  • 8. Chen Zhenzhou, Li Lei, Yao Zheng'an. Feature weighted KNN algorithm based on SVM [J]. Acta Scientiarum Naturalium Universitatis Sunyatseni, 2005, 44(1): 17-20. (Cited by: 51)
  • 9. Liu Ming, Yuan Baozong, Tang Xiaofang. A new method for determining the similarity parameter in the evidence-theoretic k-NN rule [J]. Acta Electronica Sinica, 2005, 33(4): 766-768. (Cited by: 8)
  • 10. Vivencio D P, Hruschka E R, Nicoletti M C. Feature-weighted k-nearest neighbor classifier [A]. Washington DC, USA: IEEE Communications Society, 2007: 481-486.

Secondary references (39)

  • 1. Tenenbaum J B, de Silva V, et al. A global geometric framework for nonlinear dimensionality reduction [J]. Science, 2000, 290(5500): 2319-2323.
  • 2. Roweis S T, Saul L K. Nonlinear dimensionality reduction by locally linear embedding [J]. Science, 2000, 290(5500): 2323-2326.
  • 3. Belkin M, Niyogi P. Laplacian eigenmaps and spectral techniques for embedding and clustering [A]. Neural Information Processing Systems 14 [C]. Cambridge, MA: MIT Press, 2002: 585-591.
  • 4. Law M H, Zhang N, et al. Nonlinear manifold learning for data stream [A]. Proceedings of SIAM Data Mining [C]. Orlando, Florida, 2004: 33-44.
  • 5. He X, Yan S, et al. Face recognition using Laplacianfaces [J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2005, 27(3): 328-340.
  • 6. Saul L K, Roweis S T. Think globally, fit locally: unsupervised learning of low dimensional manifolds [J]. Journal of Machine Learning Research, 2003, 4(Jun): 119-155.
  • 7. Kouropteva O, Okun O, et al. Incremental locally linear embedding algorithm [J]. Pattern Recognition, 2005, 38(10): 1764-1767.
  • 8. Golub G H, Van Loan C F. Matrix Computations [M]. Baltimore: Johns Hopkins University Press, 1996: 422-424.
  • 9. Data sets for nonlinear dimensionality reduction [OL]. http://isomap.stanford.edu/datasets.html.
  • 10. Cover T M, Hart P E. Nearest neighbor pattern classification [J]. IEEE Transactions on Information Theory, 1967, IT-13: 21-27.

Literature sharing references with this paper (68)

Literature co-cited with this paper (46)

  • 1. Chen Zhenzhou, Li Lei, Yao Zheng'an. Feature weighted KNN algorithm based on SVM [J]. Acta Scientiarum Naturalium Universitatis Sunyatseni, 2005, 44(1): 17-20. (Cited by: 51)
  • 2. Feng Guohe. Research on automatic text classification technology [J]. Journal of Intelligence, 2007, 26(12): 108-111. (Cited by: 12)
  • 3. Cover T, Hart P. Nearest neighbor pattern classification [J]. IEEE Transactions on Information Theory, 1967, 13(1): 21-27.
  • 4. Li Juan. TKNN: an improved KNN algorithm based on tree structure [C]// 2011 Seventh International Conference on Computational Intelligence and Security (CIS). Sanya, China, 2011: 1390-1394.
  • 5. Weinberger K Q, Saul L K. Distance metric learning for large margin nearest neighbor classification [J]. The Journal of Machine Learning Research, 2009, 10: 207-244.
  • 6. Dudani S A. The distance-weighted k-nearest-neighbor rule [J]. IEEE Transactions on Systems, Man, and Cybernetics, 1976, SMC-6(4): 325-327.
  • 7. Zeng Yong, Yang Yupu, Zhao Liang. Pseudo nearest neighbor rule for pattern classification [J]. Expert Systems with Applications, 2009, 36(2): 3587-3595.
  • 8. Wang B, Zeng Yong, Yang Yupu. Generalized nearest neighbor rule for pattern classification [C]// 7th World Congress on Intelligent Control and Automation. Chongqing, China, 2008: 8465-8470.
  • 9. Mitani Y, Hamamoto Y. A local mean-based nonparametric classifier [J]. Pattern Recognition Letters, 2006, 27(10): 1151-1159.
  • 10. Sánchez J S, Pla F, Ferri F J. On the use of neighbourhood-based non-parametric classifiers [J]. Pattern Recognition Letters, 1997, 18(11-13): 1179-1186.

Citing literature (6)

Secondary citing literature (20)
