Hierarchical Multi-Granularity Attention-Based Hybrid Neural Network for Text Classification

Cited by: 6
|
Authors
Liu Z. [1 ]
Lu C. [2 ]
Huang H. [2 ]
Lyu S. [2 ]
Tao Z. [3 ,4 ]
Affiliations
[1] School of Information Management for Law, China University of Political Science and Law, Beijing
[2] School of Computer Science, University of Science and Technology of China, Hefei
[3] Division of Life Sciences and Medicine, First Affiliated Hospital of USTC, University of Science and Technology of China, Hefei
[4] Anhui Provincial Cancer Hospital, Hefei
Keywords
Attention mechanism; convolutional neural network; multichannel; text classification;
DOI
10.1109/ACCESS.2020.3016727
Abstract
Neural network-based approaches have become the driving forces for Natural Language Processing (NLP) tasks. Conventionally, there are two mainstream neural architectures for NLP tasks: the recurrent neural network (RNN) and the convolutional neural network (ConvNet). RNNs are good at modeling long-term dependencies over input texts, but preclude parallel computation. ConvNets lack memory capability and must model sequential data as unordered features. ConvNets therefore fail to learn sequential dependencies over the input texts, but they support highly efficient parallel computation. As each neural architecture, such as RNNs and ConvNets, has its own pros and cons, integrating different architectures is assumed to enrich the semantic representation of texts and thus enhance the performance of NLP tasks. However, few investigations have explored the reconciliation of these seemingly incompatible architectures. To address this issue, we propose a hybrid architecture based on a novel hierarchical multi-granularity attention mechanism, named the Multi-granularity Attention-based Hybrid Neural Network (MahNN). The attention mechanism assigns different weights to different parts of the input sequence to increase the computational efficiency and performance of neural models. In MahNN, two types of attention are introduced: syntactical attention and semantical attention. The syntactical attention computes the importance of syntactic elements (such as words or sentences) at the lower symbolic level, while the semantical attention computes the importance of the embedding-space dimensions corresponding to the upper latent semantics. We adopt text classification as an exemplifying task to illustrate the ability of MahNN to understand texts. The experimental results on a variety of datasets demonstrate that MahNN outperforms most state-of-the-art methods for text classification. © 2013 IEEE.
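The two attention types described in the abstract can be sketched as follows. This is a minimal NumPy illustration, not the paper's implementation: the scoring functions, the context vector `u`, and the gate vector `v` are assumptions chosen for simplicity. Syntactical attention weights the sequence positions (words); semantical attention weights the dimensions of the resulting embedding.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def syntactical_attention(H, u):
    # H: (seq_len, d) hidden states for words; u: (d,) context vector.
    # Scores each symbolic element (word) and pools a weighted sentence vector.
    scores = H @ u                  # (seq_len,) one score per word
    alpha = softmax(scores)         # importance of each word, sums to 1
    return alpha @ H, alpha         # pooled (d,) vector and the weights

def semantical_attention(s, v):
    # s: (d,) sentence vector; v: (d,) gate parameters.
    # Weights each embedding dimension (latent semantic feature).
    beta = softmax(s * v)           # per-dimension importance, sums to 1
    return beta * s, beta           # reweighted vector and the weights

rng = np.random.default_rng(0)
H = rng.normal(size=(6, 4))         # 6 words, 4-dim hidden states
s, alpha = syntactical_attention(H, rng.normal(size=4))
z, beta = semantical_attention(s, rng.normal(size=4))
print(alpha.round(3), beta.round(3))
```

In the full model these weights would be learned jointly with the RNN/ConvNet encoders; here they only demonstrate the two granularities of weighting (over positions versus over dimensions).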
Pages: 149362-149371
Page count: 9
Related Papers
50 records in total
  • [21] Improving influenza surveillance based on multi-granularity deep spatiotemporal neural network
    Wang, Ruxin
    Wu, Hongyan
    Wu, Yongsheng
    Zheng, Jing
    Li, Ye
    COMPUTERS IN BIOLOGY AND MEDICINE, 2021, 134
  • [22] Multi-Granularity Neural Sentence Model for Measuring Short Text Similarity
    Huang, Jiangping
    Yao, Shuxin
    Lyu, Chen
    Ji, Donghong
    DATABASE SYSTEMS FOR ADVANCED APPLICATIONS (DASFAA 2017), PT I, 2017, 10177 : 439 - 455
  • [23] Text Classification Based on Hybrid Neural Network
    Liu, Yapei
    Ma, Jianhong
    Tao, Yongcai
    Shi, Lei
    Wei, Lin
    Li, Linna
    2020 IEEE 23RD INTERNATIONAL CONFERENCE ON COMPUTATIONAL SCIENCE AND ENGINEERING (CSE 2020), 2020, : 24 - 29
  • [24] Multi-scale network via progressive multi-granularity attention for fine-grained visual classification
    An, Chen
    Wang, Xiaodong
    Wei, Zhiqiang
    Zhang, Ke
    Huang, Lei
    APPLIED SOFT COMPUTING, 2023, 146
  • [25] Wavelet attention-based implicit multi-granularity super-resolution network
    Chen Boying
    Shi Jie
    Complex & Intelligent Systems, 2025, 11 (5)
  • [26] LA-HCN: Label-based Attention for Hierarchical Multi-label Text Classification Neural Network
    Zhang, Xinyi
    Xu, Jiahao
    Soh, Charlie
    Chen, Lihui
    EXPERT SYSTEMS WITH APPLICATIONS, 2022, 187
  • [27] A Neural Network Based Text Classification with Attention Mechanism
    Lu SiChen
    PROCEEDINGS OF 2019 IEEE 7TH INTERNATIONAL CONFERENCE ON COMPUTER SCIENCE AND NETWORK TECHNOLOGY (ICCSNT 2019), 2019, : 333 - 338
  • [28] Fuzzy Logic Guided Deep Neural Network with Multi-granularity
    Zhou T.
    Ding W.
    Huang J.
    Ju H.
    Jiang S.
    Wang H.
    Moshi Shibie yu Rengong Zhineng/Pattern Recognition and Artificial Intelligence, 2023, 36 (09): : 778 - 792
  • [29] Text Sentiment Analysis Based on Multi-Granularity Joint Solution
    Fang, Xianghui
    Wang, Guoyin
    Liu, Qun
    2018 IEEE 3RD INTERNATIONAL CONFERENCE ON CLOUD COMPUTING AND BIG DATA ANALYSIS (ICCCBDA), 2018, : 315 - 321
  • [30] Multi-Granularity Hierarchical Attention Fusion Networks for Reading Comprehension and Question Answering
    Wang, Wei
    Yan, Ming
    Wu, Chen
    PROCEEDINGS OF THE 56TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL), VOL 1, 2018, : 1705 - 1714