Hierarchical Multi-Granularity Attention-Based Hybrid Neural Network for Text Classification

Cited by: 6
Authors
Liu Z. [1 ]
Lu C. [2 ]
Huang H. [2 ]
Lyu S. [2 ]
Tao Z. [3 ,4 ]
Affiliations
[1] School of Information Management for Law, China University of Political Science and Law, Beijing
[2] School of Computer Science, University of Science and Technology of China, Hefei
[3] Division of Life Sciences and Medicine, First Affiliated Hospital of USTC, University of Science and Technology of China, Hefei
[4] Anhui Provincial Cancer Hospital, Hefei
Keywords
Attention mechanism; convolutional neural network; multichannel; text classification;
DOI
10.1109/ACCESS.2020.3016727
Abstract
Neural network-based approaches have become the driving force for Natural Language Processing (NLP) tasks. Conventionally, there are two mainstream neural architectures for NLP tasks: the recurrent neural network (RNN) and the convolutional neural network (ConvNet). RNNs are good at modeling long-term dependencies over input texts but preclude parallel computation. ConvNets lack memory capability and must model sequential data as unordered features; they therefore fail to learn sequential dependencies over the input texts, but they can carry out highly efficient parallel computation. Since each neural architecture, such as the RNN and the ConvNet, has its own pros and cons, integrating different architectures is assumed to enrich the semantic representation of texts and thus enhance the performance of NLP tasks. However, few investigations explore the reconciliation of these seemingly incompatible architectures. To address this issue, we propose a hybrid architecture based on a novel hierarchical multi-granularity attention mechanism, named the Multi-granularity Attention-based Hybrid Neural Network (MahNN). The attention mechanism assigns different weights to different parts of the input sequence to increase the computational efficiency and performance of neural models. In MahNN, two types of attention are introduced: syntactical attention and semantical attention. The syntactical attention computes the importance of syntactic elements (such as words or sentences) at the lower symbolic level, while the semantical attention computes the importance of each dimension of the embedded space corresponding to the upper latent semantics. We adopt text classification as an exemplifying task to illustrate the ability of MahNN to understand texts. Experimental results on a variety of datasets demonstrate that MahNN outperforms most state-of-the-art methods for text classification. © 2013 IEEE.
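The two attention types described in the abstract can be sketched in a minimal NumPy example. This is an illustrative reconstruction, not the paper's exact formulation: the context vector `v`, the per-dimension parameter `w`, the shapes, and the softmax normalizations are all assumptions chosen to show the idea of weighting syntactic elements (words) versus weighting latent dimensions.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def syntactic_attention(H, v):
    """Lower-level attention over syntactic elements (words).

    H: (T, d) hidden states for T words; v: (d,) learned context vector
    (hypothetical parameter). Returns the attended sentence vector and
    the per-word weights.
    """
    alpha = softmax(H @ v)      # (T,) one weight per word
    return alpha @ H, alpha     # weighted sum over words -> (d,)

def semantic_attention(h, w):
    """Upper-level attention over latent dimensions.

    h: (d,) sentence vector; w: (d,) per-dimension parameters
    (hypothetical). Returns the sentence vector reweighted dimension-wise.
    """
    beta = softmax(h * w)       # (d,) one weight per embedding dimension
    return beta * h

# Toy usage: 6 words with 4-dimensional hidden states.
rng = np.random.default_rng(0)
H = rng.normal(size=(6, 4))
v = rng.normal(size=4)
w = rng.normal(size=4)

s, alpha = syntactic_attention(H, v)   # word-level weighting
z = semantic_attention(s, w)           # dimension-level weighting
```

The key contrast: `alpha` normalizes over the sequence axis (which words matter), while `beta` normalizes over the feature axis (which latent dimensions matter).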
Pages: 149362 - 149371
Number of pages: 9
Related Papers
50 results
  • [41] Uncertainty instructed multi-granularity decision for large-scale hierarchical classification
    Wang, Yu
    Hu, Qinghua
    Chen, Hao
    Qian, Yuhua
    INFORMATION SCIENCES, 2022, 586 : 644 - 661
  • [42] Hierarchical Multi-label Text Classification: An Attention-based Recurrent Network Approach
    Huang, Wei
    Chen, Enhong
    Liu, Qi
    Chen, Yuying
    Huang, Zai
    Liu, Yang
    Zhao, Zhou
    Zhang, Dan
    Wang, Shijin
    PROCEEDINGS OF THE 28TH ACM INTERNATIONAL CONFERENCE ON INFORMATION & KNOWLEDGE MANAGEMENT (CIKM '19), 2019, : 1051 - 1060
  • [43] Disentangled Representations and Hierarchical Refinement of Multi-Granularity Features for Text-to-Image Synthesis
    Dong, Pei
    Wu, Lei
    Meng, Lei
    Meng, Xiangxu
    PROCEEDINGS OF THE 2022 INTERNATIONAL CONFERENCE ON MULTIMEDIA RETRIEVAL, ICMR 2022, 2022, : 268 - 276
  • [44] Multi-Granularity Attention Model for Group Recommendation
    Ji, Jianye
    Pei, Jiayan
    Lin, Shaochuan
    Zhou, Taotao
    He, Hengxu
    Jia, Jia
    Hu, Ning
    PROCEEDINGS OF THE 32ND ACM INTERNATIONAL CONFERENCE ON INFORMATION AND KNOWLEDGE MANAGEMENT, CIKM 2023, 2023, : 3973 - 3977
  • [45] Multi-Granularity Part Sampling Attention for Fine-Grained Visual Classification
    Wang, Jiahui
    Xu, Qin
    Jiang, Bo
    Luo, Bin
    Tang, Jinhui
    IEEE TRANSACTIONS ON IMAGE PROCESSING, 2024, 33 : 4529 - 4542
  • [46] Efficient multi-granularity network for fine-grained image classification
    Wang, Jiabao
    Li, Yang
    Li, Hang
    Zhao, Xun
    Zhang, Rui
    Miao, Zhuang
    Journal of Real-Time Image Processing, 2022, 19 : 853 - 866
  • [47] DOCUMENT CLASSIFICATION BASED ON CONVOLUTIONAL NEURAL NETWORK AND HIERARCHICAL ATTENTION NETWORK
    Cheng, Y.
    Ye, Z.
    Wang, M.
    Zhang, Q.
    NEURAL NETWORK WORLD, 2019, 29 (02) : 83 - 98
  • [48] Multi-Granularity Ensemble Classification Algorithm Based on Attribute Representation
    Zhang Q.-H.
    Zhi X.-C.
    Wang G.-Y.
    Yang F.
    Xue F.-Z.
    Jisuanji Xuebao/Chinese Journal of Computers, 2022, 45 (08): : 1712 - 1729
  • [49] Efficient multi-granularity network for fine-grained image classification
    Wang, Jiabao
    Li, Yang
    Li, Hang
    Zhao, Xun
    Zhang, Rui
    Miao, Zhuang
    JOURNAL OF REAL-TIME IMAGE PROCESSING, 2022, 19 (05) : 853 - 866
  • [50] Topic-aware hierarchical multi-attention network for text classification
    Jiang, Ye
    Wang, Yimin
    International Journal of Machine Learning and Cybernetics, 2023, 14 : 1863 - 1875