

PCTU-Net: Multi-Scale Pooling and Transformer Collaborative Unmixing Network for Hyperspectral Images
Abstract: In recent years, deep learning-based hyperspectral unmixing techniques have gained increasing attention and made significant strides. However, relying solely on the Transformer approach falls short in effectively capturing both global and fine-grained information, thereby impacting the accuracy of unmixing tasks. To fully harness the information embedded in hyperspectral images, this study explores a network called PCTU-Net, which deepens the network and extracts detailed image features through pooling operations, working in synergy with the Transformer for hyperspectral image unmixing. PCTU-Net learns both global and local information end-to-end to achieve more effective unmixing. The network comprises two core modules: a multi-scale pooling module, consisting of max-pooling, strip-pooling, and average-pooling operations, and a Transformer encoder, which includes embedding layers, self-attention modules, linear layers, and a multi-layer perceptron. This research extensively evaluates PCTU-Net against six other hyperspectral unmixing methods on three datasets (Samson, Apex, and a synthetic dataset). The experimental results demonstrate that the proposed approach outperforms the others in accuracy, showing its potential for effectively addressing hyperspectral unmixing tasks.
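The multi-scale pooling module described in the abstract combines max, strip, and average pooling branches. A minimal NumPy sketch of such a combination is shown below; the branch fusion (simple addition) and window size are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def strip_pool(x):
    """Strip pooling: average along one spatial axis at a time,
    then broadcast back to the full map (illustrative sketch)."""
    h_strip = x.mean(axis=1, keepdims=True)   # (H, 1): pool each row
    w_strip = x.mean(axis=0, keepdims=True)   # (1, W): pool each column
    return h_strip + w_strip                  # broadcasts to (H, W)

def multi_scale_pool(x, k=2):
    """Fuse max, strip, and average pooling branches (hypothetical fusion)."""
    H, W = x.shape
    # Non-overlapping k x k windows for the max/avg branches
    xb = x[:H - H % k, :W - W % k].reshape(H // k, k, W // k, k)
    max_branch = xb.max(axis=(1, 3))
    avg_branch = xb.mean(axis=(1, 3))
    # Upsample branches by nearest-neighbor repetition, then add the strip branch
    up = lambda b: np.repeat(np.repeat(b, k, axis=0), k, axis=1)
    return up(max_branch) + up(avg_branch) + strip_pool(x)[:H - H % k, :W - W % k]
```

In a real network these branches would be learnable pooling layers applied per channel, with the fused map passed on to the Transformer encoder; the sketch only shows the data flow on a single 2-D feature map.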
Source: Modeling and Simulation, 2023, No. 6, pp. 5572-5584 (13 pages).