Central Attention Mechanism for Convolutional Neural Networks

Cited: 0

Authors
Geng, Y.X. [1 ]
Wang, L. [2 ]
Wang, Z.Y. [3 ]
Wang, Y.G. [1 ]
Affiliations
[1] School of Computer Science and Software Engineering, University of Science and Technology Liaoning, Anshan,114051, China
[2] School of Computer Science and Software Engineering, University of Science and Technology Liaoning, Anshan,114051, China
[3] Automation Design Institute, Metallurgical Engineering Technology Co., Ltd., Dalian,116000, China
Keywords
Tensors;
DOI: not available
Abstract
Channel attention has significantly enhanced model performance. In the channel attention approach, average pooling collects feature information to produce representative values; however, this pooling introduces skewness that degrades the network architecture's performance. Leveraging the central limit theorem, we hypothesize that a strip-shaped average pooling operation, which takes the spatial position information of the feature map into account, can generate a one-dimensional tensor whose average serves as the feature representative value while mitigating skewness in the process. By incorporating the central limit theorem into the channel attention operation, this study introduces a novel attention mechanism, the Central Attention Mechanism (CAM). Instead of directly using average pooling to generate channel representative values, the central attention approach employs star-stripe average pooling to normalize multiple feature representative values into a single representative value. Strip-shaped average pooling thus collects data into a one-dimensional tensor, while star-stripe average pooling provides feature representative values based on different spatial directions. The representative value of each channel is then activated to generate channel attention for the complementary input features. Our attention approach is flexible and can be seamlessly incorporated into various traditional network structures. Through rigorous testing, we demonstrate the effectiveness of our attention strategy, which can be applied to a wide range of computer vision applications and outperforms previous attention techniques. © (2024), (International Association of Engineers). All rights reserved.
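The pipeline the abstract describes (strip-shaped pooling along each spatial direction, normalization of the directional representative values into one scalar per channel, activation, and channel reweighting) can be sketched as follows. This is an illustrative reading of the abstract, not the authors' published implementation: the function names and the exact rule for combining the directional values (a simple mean, standing in for the "star-stripe" normalization) are assumptions.

```python
import numpy as np

def strip_average_pool(x):
    """Strip-shaped average pooling on a (C, H, W) feature map:
    average along each spatial axis, yielding one one-dimensional
    tensor per channel per direction."""
    # Average over width -> (C, H); average over height -> (C, W)
    return x.mean(axis=2), x.mean(axis=1)

def central_attention(x):
    """Hypothetical sketch of the Central Attention idea: reduce the
    directional strip-pooled tensors to a single representative value
    per channel, activate it, and reweight the input channels."""
    h_pool, w_pool = strip_average_pool(x)                   # (C, H), (C, W)
    # Assumed "star-stripe" step: normalize the directional
    # representative values into one scalar per channel.
    rep = 0.5 * (h_pool.mean(axis=1) + w_pool.mean(axis=1))  # (C,)
    weights = 1.0 / (1.0 + np.exp(-rep))                     # sigmoid activation
    return x * weights[:, None, None]                        # channel reweighting

x = np.random.randn(8, 16, 16)   # C=8 channels, 16x16 spatial map
y = central_attention(x)
print(y.shape)  # (8, 16, 16)
```

Because the attention output has the same shape as the input, a module like this can be dropped after any convolutional block, which is consistent with the abstract's claim that the mechanism integrates into traditional network structures.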
Pages: 1642-1648