ICAN: Introspective Convolutional Attention Network for Semantic Text Classification

Cited by: 2
Authors
Mondal, Sounak [1 ]
Modi, Suraj [1 ]
Garg, Sakshi [1 ]
Das, Dhruva [1 ]
Mukherjee, Siddhartha [1 ]
Affiliations
[1] Samsung R&D Inst, Bangalore 560037, Karnataka, India
Keywords
DOI
10.1109/ICSC.2020.00031
CLC number
TP18 [Artificial Intelligence Theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Semantic text classification involves a deep understanding of natural language by going beyond mere syntactic information and capturing complex semantic properties like synonymy, polysemy and negation. We propose a novel attention mechanism called Introspective Semantic Attention embedded within a cascaded CNN architecture. We call our network Introspective Convolutional Attention Network (ICAN). In addition to extracting semantic information using convolution operations, ICAN derives semantic attention from its primary convolutional features instead of using a separate attention module. We also introduce a novel hybrid pooling strategy for our architecture which aids in preserving pertinent information encapsulated within a sentence, while discarding meaningless noise. Our architecture, while lightweight and efficient, promises accuracy competitive with state-of-the-art architectures, making it ideal for embedded systems and commercial servers alike.
Pages: 158-161
Page count: 4
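The abstract's central idea is that attention weights are derived from the network's own primary convolutional features ("introspective"), rather than computed by a separate attention module, and that a hybrid pooling step then condenses the attended features. Since this record contains only the abstract, the sketch below is a rough NumPy illustration of that idea under stated assumptions, not the paper's actual architecture: the function names, the learned scoring vector `v`, and the max-plus-weighted-sum form of the hybrid pooling are all assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def conv1d(x, w):
    # x: (seq_len, d_in), w: (k, d_in, d_out) -> (seq_len - k + 1, d_out)
    k = w.shape[0]
    out = np.stack([
        np.tensordot(x[t:t + k], w, axes=([0, 1], [0, 1]))
        for t in range(x.shape[0] - k + 1)
    ])
    return np.maximum(out, 0.0)  # ReLU activation

def introspective_attention_block(x, w_conv, v):
    # Primary convolutional features of the sentence.
    h = conv1d(x, w_conv)                      # (T', d_out)
    # "Introspective" attention: scores come from h itself
    # (a scoring vector applied to the conv features), not from
    # a separate attention module over the raw input. The vector
    # v is a hypothetical learned parameter.
    alpha = softmax(h @ v)                     # (T',)
    attended = h * alpha[:, None]              # reweighted features
    # Hybrid pooling (assumed form): keep the strongest activation
    # per channel plus the attention-weighted average, concatenated.
    return np.concatenate([attended.max(axis=0), attended.sum(axis=0)])

rng = np.random.default_rng(0)
x = rng.normal(size=(10, 8))        # 10 tokens, embedding dim 8
w = rng.normal(size=(3, 8, 16))     # kernel width 3, 16 filters
v = rng.normal(size=16)
feats = introspective_attention_block(x, w, v)
print(feats.shape)                  # (32,): sentence representation
```

The pooled vector would then feed a classifier head; in a real implementation the convolution, scoring vector, and classifier would be trained jointly.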