Neighborhood Attention Transformer

Cited by: 115
Authors
Hassani, Ali [1 ,2 ]
Walton, Steven [1 ,2 ]
Li, Jiachen [1 ,2 ]
Li, Shen [4 ]
Shi, Humphrey [1 ,2 ,3 ]
Affiliations
[1] Univ Oregon, SHI Labs, Eugene, OR 97403 USA
[2] UIUC, Champaign, IL 61801 USA
[3] Picsart AI Res PAIR, New York, NY USA
[4] Meta Facebook AI, Menlo Pk, CA USA
DOI
10.1109/CVPR52729.2023.00599
CLC Classification Number
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
We present Neighborhood Attention (NA), the first efficient and scalable sliding window attention mechanism for vision. NA is a pixel-wise operation, localizing self attention (SA) to the nearest neighboring pixels, and therefore enjoys a linear time and space complexity compared to the quadratic complexity of SA. The sliding window pattern allows NA's receptive field to grow without needing extra pixel shifts, and preserves translational equivariance, unlike Swin Transformer's Window Self Attention (WSA). We develop NATTEN (Neighborhood Attention Extension), a Python package with efficient C++ and CUDA kernels, which allows NA to run up to 40% faster than Swin's WSA while using up to 25% less memory. We further present Neighborhood Attention Transformer (NAT), a new hierarchical transformer design based on NA that boosts image classification and downstream vision performance. Experimental results on NAT are competitive; NAT-Tiny reaches 83.2% top-1 accuracy on ImageNet, 51.4% mAP on MS-COCO and 48.4% mIoU on ADE20K, which is 1.9% ImageNet accuracy, 1.0% COCO mAP, and 2.6% ADE20K mIoU improvement over a Swin model with similar size. To support more research based on sliding window attention, we open source our project and release our checkpoints.
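The abstract describes neighborhood attention as self-attention restricted to each query pixel's k x k neighborhood, with windows that slide inward at image borders so every pixel attends to the same number of neighbors. The following is a minimal, illustrative single-head NumPy sketch of that idea, not the paper's NATTEN C++/CUDA implementation; the function name and shapes are assumptions for illustration, and learned query/key/value projections and relative positional biases are omitted for brevity.

    import numpy as np

    def neighborhood_attention_2d(x, k=3):
        # Naive single-head neighborhood attention over an (H, W, D) feature map.
        # Each query pixel attends only to a k x k window of keys/values around it,
        # so cost grows linearly with H * W instead of quadratically as in global SA.
        H, W, D = x.shape
        r = k // 2
        out = np.zeros_like(x)
        for i in range(H):
            for j in range(W):
                # Clamp the window at the borders so every pixel still sees
                # exactly k x k neighbors (the window slides inward at edges).
                i0 = min(max(i - r, 0), H - k)
                j0 = min(max(j - r, 0), W - k)
                neigh = x[i0:i0 + k, j0:j0 + k].reshape(-1, D)  # (k*k, D) keys/values
                q = x[i, j]                                      # (D,) query
                scores = neigh @ q / np.sqrt(D)                  # scaled dot products
                weights = np.exp(scores - scores.max())
                weights /= weights.sum()                         # softmax over the window
                out[i, j] = weights @ neigh                      # weighted sum of values
        return out

    # Example: a 3x3 neighborhood over an 8x8 feature map with 16 channels.
    feat = np.random.randn(8, 8, 16).astype(np.float32)
    print(neighborhood_attention_2d(feat, k=3).shape)  # (8, 8, 16)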
Pages: 6185-6194
Page count: 10
Related Papers
50 records in total
  • [31] Captioning Transformer with Stacked Attention Modules
    Zhu, Xinxin
    Li, Lixiang
    Liu, Jing
    Peng, Haipeng
    Niu, Xinxin
    APPLIED SCIENCES-BASEL, 2018, 8 (05):
  • [32] Transformer Interpretability Beyond Attention Visualization
    Chefer, Hila
    Gur, Shir
    Wolf, Lior
    2021 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION, CVPR 2021, 2021, : 782 - 791
  • [33] Consideration of the Impacts of a Smart Neighborhood Load on Transformer Aging
    Paterakis, Nikolaos G.
    Pappi, Iliana N.
    Erdinc, Ozan
    Godina, Radu
    Rodrigues, Eduardo M. G.
    Catalao, Joao P. S.
    IEEE TRANSACTIONS ON SMART GRID, 2016, 7 (06) : 2793 - 2802
  • [34] Extending the power line LAN up to the neighborhood transformer
    Abad, J
    Badenes, A
    Blasco, J
    Carreras, J
    Dominguez, V
    Gomez, C
    Iranzo, S
    Riveiro, JC
    Ruiz, D
    Torres, LM
    Comabella, J
    IEEE COMMUNICATIONS MAGAZINE, 2003, 41 (04) : 64 - 70
  • [35] Spectral Spatial Neighborhood Attention Transformer for Hyperspectral Image Classification: Transformateur d’attention de voisinage spatial-spectral pour la classification d’images hyperspectrales
    Arshad, Tahir
    Zhang, Junping
    Anyembe, Shibwabo C
    Mehmood, Aamir
    Canadian Journal of Remote Sensing, 2024, 50 (01)
  • [36] Neighborhood Interaction Attention Network for Link Prediction
    Wang, Zhitao
    Lei, Yu
    Li, Wenjie
    PROCEEDINGS OF THE 28TH ACM INTERNATIONAL CONFERENCE ON INFORMATION & KNOWLEDGE MANAGEMENT (CIKM '19), 2019, : 2153 - 2156
  • [37] LNGAT: local neighborhood graph attention network
    Sun, Yukuan
    Ma, Haoran
    Bo, Young-Bae
    Wang, Jianming
    JOURNAL OF ELECTRONIC IMAGING, 2022, 31 (05)
  • [38] SGSAFormer: Spike Gated Self-Attention Transformer and Temporal Attention
    Gao, Shouwei
    Qin, Yu
    Zhu, Ruixin
    Zhao, Zirui
    Zhou, Hao
    Zhu, Zihao
    ELECTRONICS, 2025, 14 (01):
  • [39] Attention Head Interactive Dual Attention Transformer for Hyperspectral Image Classification
    Shi, Cuiping
    Yue, Shuheng
    Wang, Liguo
    IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING, 2024, 62 : 1 - 1
  • [40] Transformer Dissection: An Unified Understanding for Transformer's Attention via the Lens of Kernel
    Tsai, Yao-Hung Hubert
    Bai, Shaojie
    Yamada, Makoto
    Morency, Louis-Philippe
    Salakhutdinov, Ruslan
    2019 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING AND THE 9TH INTERNATIONAL JOINT CONFERENCE ON NATURAL LANGUAGE PROCESSING (EMNLP-IJCNLP 2019): PROCEEDINGS OF THE CONFERENCE, 2019, : 4344 - 4353