《计算机应用研究》 | Application Research of Computers

用于图像超分辨的密集跳跃注意连接网络

Densely channel attention skip connection network for image super-resolution

Authors: Wu Ronggui, Jiang Ping
Affiliations: 1. Institute of Optics & Electronics, Chinese Academy of Sciences, Chengdu 610209, China; 2. University of Chinese Academy of Sciences, Beijing 100049, China
Article ID: 1001-3695(2020)12-055-3788-04
DOI: 10.19734/j.issn.1001-3695.2019.05.0240
Abstract: To address the problem that existing deep-learning-based super-resolution models do not make full use of the feature information at every level, which limits reconstruction accuracy and inflates the number of parameters, this paper proposes a network with a double (inner and outer) dense connection structure, the densely channel attention skip connection network. In the inner structure, the original dense cascade block is improved into a channel-separable dense cascade block. The outer structure uses dense residual connections combined with a channel attention mechanism to fuse the features extracted by the dense blocks, achieving higher accuracy with fewer convolutional layers. Evaluated on several benchmark datasets, the proposed network attains higher accuracy with fewer parameters than other models of comparable depth and size.
Keywords: deep learning; image super-resolution; dense skip connection; attention mechanism
Article URL: http://www.arocmag.com/article/01-2020-12-055.html
Received: 2019/5/24
Revised: 2019/7/13
Pages: 3788-3791
CLC number: TP391.41
Document code: A
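
The abstract above describes the architecture only in prose. Below is a minimal PyTorch sketch of the two ingredients it names: a dense cascade whose layer outputs are concatenated through skip connections, and channel attention applied to the fused features before a residual connection. The module names, channel counts, layer depth, and the squeeze-and-excitation-style attention are illustrative assumptions for exposition, not the authors' published design.

# Minimal PyTorch sketch (illustrative assumptions, not the paper's exact design):
# a dense cascade whose layer outputs are concatenated via skip connections,
# fused by a 1x1 convolution, re-weighted by channel attention, and added back
# to the block input through a residual connection.
import torch
import torch.nn as nn


class ChannelAttention(nn.Module):
    """Squeeze-and-excitation-style re-weighting of feature channels."""

    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        self.weight = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),                        # squeeze: B x C x 1 x 1
            nn.Conv2d(channels, channels // reduction, 1),  # excitation bottleneck
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, 1),
            nn.Sigmoid(),                                   # per-channel weights in (0, 1)
        )

    def forward(self, x):
        return x * self.weight(x)


class DenseAttentionBlock(nn.Module):
    """Dense cascade block: every conv layer sees the concatenation of all
    previous outputs; the fused features are attention-weighted and added
    to the block input (residual skip connection)."""

    def __init__(self, channels: int = 64, growth: int = 32, num_layers: int = 4):
        super().__init__()
        self.layers = nn.ModuleList()
        in_ch = channels
        for _ in range(num_layers):
            self.layers.append(nn.Sequential(
                nn.Conv2d(in_ch, growth, kernel_size=3, padding=1),
                nn.ReLU(inplace=True),
            ))
            in_ch += growth                                 # dense growth of concatenated features
        self.fuse = nn.Conv2d(in_ch, channels, kernel_size=1)  # local feature fusion
        self.attention = ChannelAttention(channels)

    def forward(self, x):
        feats = [x]
        for layer in self.layers:
            feats.append(layer(torch.cat(feats, dim=1)))    # dense skip connections
        fused = self.fuse(torch.cat(feats, dim=1))
        return x + self.attention(fused)                    # residual skip connection


if __name__ == "__main__":
    block = DenseAttentionBlock()
    out = block(torch.randn(1, 64, 48, 48))
    print(out.shape)                                        # torch.Size([1, 64, 48, 48])

A channel-separable variant of the cascade could be sketched by replacing each 3x3 convolution above with a depthwise 3x3 convolution (groups equal to its input channels) followed by a 1x1 pointwise convolution; whether that matches the paper's channel-separable dense cascade block would need to be checked against the full text.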