
Hyperspectral Image Classification Based on Principal Component Analysis and Local Binary Patterns
(基于主成分分析与局部二值模式的高光谱图像分类)
Cited by: 14
Abstract: Two hyperspectral image classification algorithms based on principal component analysis (PCA) and local binary patterns (LBP) are proposed. PCA is first applied to remove the redundant information in the spectral domain of the hyperspectral image; LBP analysis is then performed on the dimensionality-reduced image to extract spatial texture features, and the extracted features are classified with sparse representation classification and with a support vector machine, respectively. Because combining PCA with LBP for feature extraction effectively removes the spectral redundancy of the hyperspectral image while preserving its local spatial neighborhood information, the proposed algorithms fully exploit the spectral-spatial features of the image, improve the classification accuracy and Kappa coefficient to a large extent, and also achieve good classification performance in Gaussian noise environments and under small-sample-size conditions.
Authors: 叶珍, 白璘
Source: Laser & Optoelectronics Progress (《激光与光电子学进展》, CSCD, Peking University Core Journal), 2017, Issue 11, pp. 133-142 (10 pages)
Funding: National Natural Science Foundation of China (41601344, 61601059, 51407012); Fundamental Research Funds for the Central Universities (310832171006, 310832163402, 310832161004)
Keywords: image processing; hyperspectral image classification; principal component analysis; local binary patterns; feature extraction
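Below is a minimal sketch of the spectral-spatial pipeline summarized in the abstract: PCA along the spectral axis to remove inter-band redundancy, uniform LBP on the retained principal-component bands to capture local texture, and an SVM trained on per-pixel LBP histograms. It assumes scikit-learn and scikit-image; the parameter values (number of components, LBP radius and points, window size) and the RBF-SVM settings are illustrative assumptions rather than the paper's reported configuration, and the sparse-representation-classification variant is omitted.

```python
# Illustrative PCA + LBP + SVM sketch; parameters are assumptions, not the paper's settings.
import numpy as np
from sklearn.decomposition import PCA
from skimage.feature import local_binary_pattern
from sklearn.svm import SVC

def pca_lbp_features(cube, n_components=3, radius=2, n_points=16, win=10):
    """cube: (H, W, B) hyperspectral cube -> (H, W, D) spectral-spatial features."""
    H, W, B = cube.shape
    # PCA along the spectral axis removes inter-band redundancy.
    pcs = PCA(n_components=n_components).fit_transform(cube.reshape(-1, B))
    pcs = pcs.reshape(H, W, n_components)
    n_bins = n_points + 2          # number of distinct "uniform" LBP codes
    feats = np.zeros((H, W, n_components * n_bins))
    for k in range(n_components):
        # Uniform LBP codes describe the local spatial texture of each PC band.
        lbp = local_binary_pattern(pcs[:, :, k], n_points, radius, method="uniform")
        for i in range(H):
            for j in range(W):
                # Per-pixel histogram of LBP codes over a local window (slow but simple).
                patch = lbp[max(0, i - win):i + win + 1, max(0, j - win):j + win + 1]
                hist, _ = np.histogram(patch, bins=n_bins, range=(0, n_bins), density=True)
                feats[i, j, k * n_bins:(k + 1) * n_bins] = hist
    return feats

# Usage with assumed data: X is an (H, W, B) cube, y an (H, W) label map, 0 = unlabeled.
# feats = pca_lbp_features(X)
# mask = y > 0
# clf = SVC(kernel="rbf", C=100, gamma="scale").fit(feats[mask], y[mask])
# pred = clf.predict(feats.reshape(-1, feats.shape[-1])).reshape(y.shape)
```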