
A Pixel–Channel Hybrid Attention Model for Image Processing (Cited by: 5)

Abstract: In the field of image processing, better results can often be achieved by deepening neural network layers, which involves considerably more parameters. In image classification, improving accuracy without introducing too many parameters remains a challenge. In image conversion, conversion models based on generative adversarial networks often produce semantic artifacts, resulting in lower-quality images. To address these problems, a new type of attention module is proposed in this paper: the pixel-channel hybrid attention (PCHA) mechanism, which combines attention information from the pixel and channel domains. Comparative results obtained with different attention modules on multiple image datasets verify the superiority of the PCHA module in classification tasks. For image conversion, we propose a skip structure (the S-PCHA model) in the up- and down-sampling processes based on the PCHA model. The proposed model helps the algorithm identify the most distinctive semantic object in a given image, as this structure effectively realizes the intercommunication of encoder and decoder information. Furthermore, the results show that the attention model can establish a more realistic mapping from the source domain to the target domain in the image conversion algorithm, thus improving the quality of the images generated by the conversion model.
Source: Tsinghua Science and Technology (SCIE, EI, CAS, CSCD), 2022, No. 5, pp. 804-816 (13 pages).
Funding: Supported by the National Natural Science Foundation of China (No. 61976141), the Natural Science Foundation of Hebei Province (Nos. F2018201096 and F2018201115), the Natural Science Foundation of Guangdong Province (No. 2018A0303130026), and the Key Foundation of the Education Department of Hebei Province (No. ZD2019021).
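The record above gives no implementation details, but the abstract's description of PCHA suggests combining channel-domain attention (one weight per feature channel) with pixel-domain attention (one weight per spatial location). The sketch below is a minimal, hypothetical PyTorch rendering of such a hybrid block; the module names, layer sizes, fusion order, and residual connection are illustrative assumptions, not taken from the paper.

import torch
import torch.nn as nn

class ChannelAttention(nn.Module):
    """Channel-domain attention: global average pooling followed by a small
    MLP produces one weight per channel (squeeze-and-excitation style)."""
    def __init__(self, channels, reduction=16):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)
        self.mlp = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, x):
        b, c, _, _ = x.shape
        w = self.mlp(self.pool(x).view(b, c)).view(b, c, 1, 1)
        return x * w

class PixelAttention(nn.Module):
    """Pixel-domain attention: a 1x1 convolution produces one weight per
    spatial location, shared across all channels."""
    def __init__(self, channels):
        super().__init__()
        self.conv = nn.Conv2d(channels, 1, kernel_size=1)
        self.sigmoid = nn.Sigmoid()

    def forward(self, x):
        return x * self.sigmoid(self.conv(x))

class PCHABlock(nn.Module):
    """Hypothetical pixel-channel hybrid attention block: applies channel
    attention, then pixel attention, and adds a residual path. The actual
    PCHA fusion described in the paper may differ."""
    def __init__(self, channels, reduction=16):
        super().__init__()
        self.channel_att = ChannelAttention(channels, reduction)
        self.pixel_att = PixelAttention(channels)

    def forward(self, x):
        out = self.pixel_att(self.channel_att(x))
        return out + x  # residual connection keeps the original features

if __name__ == "__main__":
    feats = torch.randn(2, 64, 32, 32)   # dummy feature map
    print(PCHABlock(64)(feats).shape)    # torch.Size([2, 64, 32, 32])

In an S-PCHA-style conversion network, a block like this could be attached to encoder-decoder skip connections so that attended encoder features are passed to the corresponding decoder stage; the exact wiring used in the paper is not reproduced here.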
