Modeling Selective Feature Attention for Lightweight Text Matching

Cited by: 0
Authors
Zang, Jianxiang [1]
Liu, Hui [1]
Affiliation
[1] Shanghai Univ Int Business & Econ, Sch Stat & Informat, Shanghai, Peoples R China
Keywords
DOI
Not available
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104; 0812; 0835; 1405;
Abstract
Representation-based Siamese networks have risen to popularity in lightweight text matching due to their low deployment and inference costs. While word-level attention mechanisms have been implemented within Siamese networks to improve performance, we propose Feature Attention (FA), a novel downstream block designed to enrich the modeling of dependencies among embedding features. Employing "squeeze-and-excitation" techniques, the FA block dynamically adjusts the emphasis on individual features, enabling the network to concentrate more on features that significantly contribute to the final classification. Building upon FA, we introduce a dynamic "selection" mechanism called Selective Feature Attention (SFA), which leverages a stacked BiGRU Inception structure. The SFA block facilitates multi-scale semantic extraction by traversing different stacked BiGRU layers, encouraging the network to selectively concentrate on semantic information and embedding features across varying levels of abstraction. Both the FA and SFA blocks offer a seamless integration capability with various Siamese networks, showcasing a plug-and-play characteristic. Experimental evaluations conducted across diverse text matching baselines and benchmarks underscore the indispensability of modeling feature attention and the superiority of the "selection" mechanism.
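The abstract describes two mechanisms: an FA block that applies "squeeze-and-excitation" to reweight embedding features, and an SFA block that softly selects among multi-scale branches (stacked BiGRU layers in an Inception-style structure). Below is a minimal NumPy sketch of how such blocks could plausibly work; the weight shapes, the tanh/sigmoid activation choices, and the use of random arrays in place of actual BiGRU branch outputs are all our assumptions, not the paper's exact formulation.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def softmax(x, axis=0):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def feature_attention(X, W1, W2):
    """FA block sketch: squeeze-and-excitation over embedding features.

    X: (seq_len, d) token representations.
    Squeeze: mean-pool over the sequence into a (d,) feature descriptor.
    Excitation: bottleneck MLP + sigmoid yields per-feature gates in (0, 1).
    Scale: every token's features are reweighted by the gates.
    """
    z = X.mean(axis=0)                 # squeeze: (d,)
    s = sigmoid(W2 @ np.tanh(W1 @ z))  # excitation gates: (d,)
    return X * s                       # scale: broadcast over tokens

def selective_feature_attention(branches, W_sq, W_branch):
    """SFA 'selection' sketch: softmax competition across multi-scale
    branches (assumed outputs of stacked BiGRU layers), per feature.

    branches: (k, seq_len, d) array of k branch outputs.
    """
    U = np.sum(branches, axis=0)                      # element-wise fuse
    z = np.tanh(W_sq @ U.mean(axis=0))                # compact descriptor
    logits = np.stack([W_b @ z for W_b in W_branch])  # (k, d)
    attn = softmax(logits, axis=0)                    # weights sum to 1 over k
    return sum(a[None, :] * b for a, b in zip(attn, branches))

# Toy usage with hypothetical sizes (d=4, reduction ratio 2, k=3 branches).
rng = np.random.default_rng(0)
seq, d, r, k = 3, 4, 2, 3
X = rng.standard_normal((seq, d))
fa_out = feature_attention(X,
                           rng.standard_normal((d // r, d)),
                           rng.standard_normal((d, d // r)))
sfa_out = selective_feature_attention(rng.standard_normal((k, seq, d)),
                                      rng.standard_normal((d // r, d)),
                                      rng.standard_normal((k, d, d // r)))
```

Both functions return tensors of the input shape, which is what gives the blocks their plug-and-play character: they can be dropped after any encoder without changing downstream dimensions.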
Pages: 6624-6632 (9 pages)