BA-Net: Bridge Attention for Deep Convolutional Neural Networks

Cited by: 9
Authors
Zhao, Yue [1 ,2 ]
Chen, Junzhou [1 ,2 ]
Zhang, Zirui [1 ,2 ]
Zhang, Ronghui [1 ,2 ]
Affiliations
[1] Sun Yat Sen Univ, Sch Intelligent Syst Engn, Shenzhen Campus,66 Gongchang Rd, Shenzhen 518107, Guangdong, Peoples R China
[2] Sun Yat Sen Univ, Guangdong Prov Key Lab Fire Sci & Intelligent Eme, Guangzhou 510006, Peoples R China
Source
Funding
National Natural Science Foundation of China;
Keywords
Channel attention mechanism; Deep neural networks architecture; Networks optimization;
DOI
10.1007/978-3-031-19803-8_18
Chinese Library Classification
TP18 [Theory of artificial intelligence];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
In attention mechanism research, most existing methods struggle to exploit the network's information with high computational efficiency because of heavy feature compression in the attention layer. This paper proposes a simple and general approach, Bridge Attention (BA-Net), to address this issue. BA-Net directly integrates features from previous layers and effectively promotes information interchange, and its implementation relies only on simple strategies, similar to SENet. Moreover, after extensively investigating the effectiveness of different previous features, we found a simple and striking insight: bridging the BN outputs of all convolutions inside each block yields better attention and enhances network performance. BA-Net is effective, stable, and easy to use. A comprehensive evaluation on computer vision tasks demonstrates that the proposed approach outperforms existing channel attention methods in both accuracy and computational efficiency. The source code is available at https://github.com/zhaoy376/Bridge-Attention.
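The mechanism sketched in the abstract, pooling the outputs of every convolution in a block and passing the fused channel descriptor through an SE-style bottleneck to produce channel weights, can be illustrated as follows. This is a minimal NumPy sketch under assumptions, not the authors' implementation: the function name `bridge_attention`, the weights `w1`/`w2`, and the summation used to fuse descriptors are hypothetical simplifications of the approach described in the paper.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def bridge_attention(block_features, w1, w2):
    """Hypothetical sketch of Bridge Attention for one block.

    block_features: list of arrays shaped (C, H, W) -- the bridged
        convolution outputs of the block (in BA-Net, their BN outputs).
    w1: (C//r, C) reduction weights; w2: (C, C//r) expansion weights,
        as in an SE-style bottleneck with reduction ratio r.
    """
    # Global average pooling turns each bridged map into a (C,) descriptor.
    descriptors = [f.mean(axis=(1, 2)) for f in block_features]
    # "Bridge": fuse descriptors from all layers (summation assumed here).
    z = np.sum(descriptors, axis=0)
    # SE-style excitation: reduce, ReLU, expand, sigmoid gating.
    s = sigmoid(w2 @ np.maximum(w1 @ z, 0.0))
    # Reweight the block's final feature map channel-wise.
    return block_features[-1] * s[:, None, None]

# Toy usage: a block with two 8-channel feature maps, reduction ratio 2.
feats = [np.ones((8, 4, 4)), np.full((8, 4, 4), 2.0)]
w1 = np.full((2, 8), 0.1)
w2 = np.full((8, 2), 0.1)
out = bridge_attention(feats, w1, w2)
```

Because the sigmoid gate lies in (0, 1), the output is always a channel-wise scaled-down copy of the block's final feature map, which is the same reweighting behavior SENet uses; the difference sketched here is that the descriptor `z` draws on every convolution output in the block rather than only the last one.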
Pages: 297 - 312
Page count: 16
Related Papers (50 in total)
  • [1] SA-Net: Shuffle Attention for Deep Convolutional Neural Networks
    Zhang, Qing-Long
    Yang, Yu-Bin
    2021 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP 2021), 2021, : 2235 - 2239
  • [2] Rega-Net: Retina Gabor Attention for Deep Convolutional Neural Networks
    Bao, Chun
    Cao, Jie
    Ning, Yaqian
    Cheng, Yang
    Hao, Qun
    IEEE GEOSCIENCE AND REMOTE SENSING LETTERS, 2023, 20
  • [3] BA-Net: Brightness prior guided attention network for colonic polyp segmentation
    Xia, Haiying
    Qin, Yilin
    Tan, Yumei
    Song, Shuxiang
    BIOCYBERNETICS AND BIOMEDICAL ENGINEERING, 2023, 43 (03) : 603 - 615
  • [4] Spatial Channel Attention for Deep Convolutional Neural Networks
    Liu, Tonglai
    Luo, Ronghai
    Xu, Longqin
    Feng, Dachun
    Cao, Liang
    Liu, Shuangyin
    Guo, Jianjun
    MATHEMATICS, 2022, 10 (10)
  • [5] Spatial Pyramid Attention for Deep Convolutional Neural Networks
    Ma, Xu
    Guo, Jingda
    Sansom, Andrew
    McGuire, Mara
    Kalaani, Andrew
    Chen, Qi
    Tang, Sihai
    Yang, Qing
    Fu, Song
    IEEE TRANSACTIONS ON MULTIMEDIA, 2021, 23 : 3048 - 3058
  • [6] Multiscale Hybrid Convolutional Deep Neural Networks with Channel Attention
    Yang, Hua
    Yang, Ming
    He, Bitao
    Qin, Tao
    Yang, Jing
    ENTROPY, 2022, 24 (09)
  • [7] Deep Convolutional Neural Networks
    Gonzalez, Rafael C.
    IEEE SIGNAL PROCESSING MAGAZINE, 2018, 35 (06) : 79 - 87
  • [8] An Attention Module for Convolutional Neural Networks
    Zhu, Baozhou
    Hofstee, Peter
    Lee, Jinho
    Al-Ars, Zaid
    ARTIFICIAL NEURAL NETWORKS AND MACHINE LEARNING - ICANN 2021, PT I, 2021, 12891 : 167 - 178
  • [9] Reparameterized attention for convolutional neural networks
    Wu, Yiming
    Li, Ruixiang
    Yu, Yunlong
    Li, Xi
    PATTERN RECOGNITION LETTERS, 2022, 164 : 89 - 95
  • [10] DeepSAR-Net: Deep Convolutional Neural Networks for SAR Target Recognition
    Li, Yang
    Wang, Jiabao
    Xu, Yulong
    Li, Hang
    Miao, Zhuang
    Zhang, Yafei
    2017 IEEE 2ND INTERNATIONAL CONFERENCE ON BIG DATA ANALYSIS (ICBDA), 2017, : 740 - 743