Attention-based BiLSTM fused CNN with gating mechanism model for Chinese long text classification

Cited by: 100
Authors
Deng, Jianfeng [1]
Cheng, Lianglun [1,2]
Wang, Zhuowei [2]
Affiliations
[1] Guangdong Univ Technol, Sch Automat, Guangzhou 510006, Peoples R China
[2] Guangdong Univ Technol, Sch Comp, Guangzhou 510006, Peoples R China
Keywords
Attention mechanism; BiLSTM; CNN; Gating mechanism; Text classification;
DOI
10.1016/j.csl.2020.101182
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Neural networks have been widely used in the field of text classification and have achieved good results on various Chinese datasets. For long text classification, however, text data contain a great deal of redundant information, some of which may involve other topics, and this makes long text classification challenging. To solve these problems, this paper proposes a new text classification model, called attention-based BiLSTM fused CNN with gating mechanism (ABLG-CNN). In ABLG-CNN, word2vec is used to train the word vector representation. An attention mechanism is used to calculate the context vectors of words and thereby derive keyword information. A bidirectional long short-term memory network (BiLSTM) captures context features, and on top of this a convolutional neural network (CNN) captures topic-salient features. In view of the possible presence of sentences involving other topics in long texts, a gating mechanism is introduced to assign weights to the BiLSTM and CNN output features and obtain fused text features that are favorable for classification. ABLG-CNN can capture both the contextual semantics and the local phrase features of a text. It is verified experimentally on two long-text news datasets, and the results show that ABLG-CNN's classification performance is better than that of other recent text classification methods. (c) 2021 Elsevier Ltd. All rights reserved.
Pages: 12
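To make the architecture described in the abstract concrete, the following is a minimal PyTorch sketch of an ABLG-CNN-style classifier, reconstructed from the abstract alone. The class name ABLGCNNSketch, all layer sizes, the additive-attention formulation, the max-pooled Conv1d branches, and the sigmoid gate used to fuse the BiLSTM context vector with the CNN features are assumptions made for illustration, not the authors' published configuration.

import torch
import torch.nn as nn
import torch.nn.functional as F

class ABLGCNNSketch(nn.Module):
    """Illustrative ABLG-CNN-style model: BiLSTM + attention + CNN + gated fusion (assumed details)."""

    def __init__(self, vocab_size, emb_dim=300, hidden=128, num_filters=128,
                 kernel_sizes=(3, 4, 5), num_classes=10, pretrained_emb=None):
        super().__init__()
        # Word embeddings; the paper initializes these with word2vec vectors.
        self.emb = nn.Embedding(vocab_size, emb_dim)
        if pretrained_emb is not None:
            self.emb.weight.data.copy_(pretrained_emb)
        # BiLSTM captures contextual features of the whole sequence.
        self.bilstm = nn.LSTM(emb_dim, hidden, batch_first=True, bidirectional=True)
        # Additive attention over BiLSTM states yields a context vector that
        # emphasizes keyword positions (one of several possible formulations).
        self.att_proj = nn.Linear(2 * hidden, 2 * hidden)
        self.att_score = nn.Linear(2 * hidden, 1, bias=False)
        # Multi-width Conv1d branches capture local, topic-salient n-gram features.
        self.convs = nn.ModuleList(
            [nn.Conv1d(2 * hidden, num_filters, k) for k in kernel_sizes])
        cnn_dim = num_filters * len(kernel_sizes)
        # Gate that weighs the BiLSTM context vector against the CNN features.
        self.gate = nn.Linear(2 * hidden + cnn_dim, 2 * hidden)
        self.cnn_proj = nn.Linear(cnn_dim, 2 * hidden)
        self.fc = nn.Linear(2 * hidden, num_classes)

    def forward(self, tokens):                       # tokens: (B, T) token ids
        h, _ = self.bilstm(self.emb(tokens))         # (B, T, 2H) context features
        scores = self.att_score(torch.tanh(self.att_proj(h)))   # (B, T, 1)
        alpha = torch.softmax(scores, dim=1)         # attention weights over time
        context = (alpha * h).sum(dim=1)             # (B, 2H) BiLSTM context vector
        weighted = (alpha * h).transpose(1, 2)       # (B, 2H, T) layout for Conv1d
        conv_outs = [F.relu(conv(weighted)) for conv in self.convs]
        cnn_feats = torch.cat([o.max(dim=2).values for o in conv_outs], dim=1)
        # Gated fusion: g decides how much each branch contributes per dimension.
        g = torch.sigmoid(self.gate(torch.cat([context, cnn_feats], dim=1)))
        fused = g * context + (1.0 - g) * self.cnn_proj(cnn_feats)
        return self.fc(fused)                        # (B, num_classes) logits

# Toy usage: a batch of 4 padded sequences of length 50 over a 20k-word vocabulary.
model = ABLGCNNSketch(vocab_size=20000, num_classes=5)
logits = model(torch.randint(0, 20000, (4, 50)))
print(logits.shape)  # torch.Size([4, 5])

The per-dimension sigmoid gate is one common way to realize the weighted fusion the abstract describes; a scalar gate or a softmax over the two branches would be equally plausible readings of the paper.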
Related Papers
30 records in total
  • [1] Chinese News Text Classification based on Attention-based CNN-BiLSTM
    Wang, Meng
    Cai, Qiong
    Wang, Liya
    Li, Jun
    Wang, Xiaoke
    MIPPR 2019: PATTERN RECOGNITION AND COMPUTER VISION, 2020, 11430
  • [2] An attention-based CNN-BiLSTM model for depression detection on social media text
    Thekkekara, Joel Philip
    Yongchareon, Sira
    Liesaputra, Veronica
    EXPERT SYSTEMS WITH APPLICATIONS, 2024, 249
  • [3] BiLSTM-Attention-CNN Model Based on ISSA Optimization for Cyberbullying Detection in Chinese Text
    Fan, Wenting
    INFORMATION TECHNOLOGY AND CONTROL, 2024, 53 (03):
  • [4] BiLSTM Model With Attention Mechanism for Sentiment Classification on Chinese Mixed Text Comments
    Li, Xiaoyan
    Raga, Rodolfo C.
    IEEE ACCESS, 2023, 11 : 26199 - 26210
  • [5] Forecasting Teleconsultation Demand Using an Ensemble CNN Attention-Based BILSTM Model with Additional Variables
    Chen, Wenjia
    Li, Jinlin
    HEALTHCARE, 2021, 9 (08)
  • [6] CCBLA: a Lightweight Phishing Detection Model Based on CNN, BiLSTM, and Attention Mechanism
    Zhu, Erzhou
    Yuan, Qixiang
    Chen, Zhile
    Li, Xuejian
    Fang, Xianyong
    COGNITIVE COMPUTATION, 2023, 15 (04) : 1320 - 1333
  • [7] CRAN: A Hybrid CNN-RNN Attention-Based Model for Text Classification
    Guo, Long
    Zhang, Dongxiang
    Wang, Lei
    Wang, Han
    Cui, Bin
    CONCEPTUAL MODELING, ER 2018, 2018, 11157 : 571 - 585
  • [8] An Attention-Based BiLSTM-CRF Model for Chinese Clinic Named Entity Recognition
    Wu, Guohua
    Tang, Guangen
    Wang, Zhongru
    Zhang, Zhen
    Wang, Zhen
IEEE ACCESS, 2019, 7 : 113942 - 113949
  • [9] A Radical-Aware Attention-Based Model for Chinese Text Classification
    Tao, Hanqing
    Tong, Shiwei
    Zhao, Hongke
    Xu, Tong
    Jin, Binbin
    Liu, Qi
    THIRTY-THIRD AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE / THIRTY-FIRST INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE CONFERENCE / NINTH AAAI SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2019, : 5125 - 5132