Research on News Text Classification Based on BERT-BiLSTM-TextCNN-Attention

Cited: 0
Authors
Wang, Jia [1 ]
Li, Zongting [2 ]
Ma, Chenyang [2 ]
Affiliations
[1] Dalian Polytech Univ, Dalian 116034, Liaoning, Peoples R China
[2] Dalian Polytech Univ, Sch Informat Sci & Engn, Dalian 116034, Liaoning, Peoples R China
Keywords
Deep learning; text classification; natural language processing; neural network;
DOI
10.1145/3672919.3672973
CLC Number
TP18 [Artificial Intelligence Theory];
Discipline Classification Code
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Traditional machine learning models struggle to capture complex features and contextual relationships. While a single deep learning architecture outperforms traditional machine learning in text processing, it cannot capture the full range of textual information [5]. To address this, a news text classifier based on BERT-BiLSTM-TextCNN-Attention is proposed. The model uses the pre-trained BERT language model to encode the text content, then feeds the output into a BiLSTM layer that captures sequence order and long-term dependencies for a comprehensive semantic representation. The output then passes through a TextCNN layer, which extracts local semantic cues via convolution. Finally, an attention mechanism highlights the most salient text features, refining the feature vector before classification by the Softmax layer. Experiments were conducted on a subset of the THUCNews Chinese news text dataset. The results show that the BERT-BiLSTM-TextCNN-Attention model achieved 96.48% accuracy, outperforming the other benchmark models. This demonstrates its superiority in Chinese news text classification and validates its ability to extract deep semantic information and key local features from the text.
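As a rough illustration of the pipeline described in the abstract, the following PyTorch sketch stacks a pre-trained BERT encoder, a BiLSTM, parallel TextCNN convolutions, and an additive attention pooling layer ahead of a final classification layer. It is a minimal sketch, not the authors' implementation: the bert-base-chinese checkpoint, hidden sizes, kernel widths, and the 10-class output are illustrative assumptions rather than values reported in the paper.

# Minimal sketch of a BERT-BiLSTM-TextCNN-Attention classifier.
# All hyperparameters below are assumptions for illustration only.
import torch
import torch.nn as nn
import torch.nn.functional as F
from transformers import BertModel

class BertBiLSTMTextCNNAttention(nn.Module):
    def __init__(self, num_classes=10, lstm_hidden=128,
                 num_filters=64, kernel_sizes=(2, 3, 4)):
        super().__init__()
        # 1) Pre-trained BERT encoder: contextual token embeddings (768-dim for bert-base).
        self.bert = BertModel.from_pretrained("bert-base-chinese")
        # 2) BiLSTM: models sequence order and long-term dependencies.
        self.bilstm = nn.LSTM(768, lstm_hidden, batch_first=True, bidirectional=True)
        # 3) TextCNN: parallel 1-D convolutions over the BiLSTM output capture local n-gram cues.
        self.convs = nn.ModuleList([
            nn.Conv1d(2 * lstm_hidden, num_filters, k, padding="same")
            for k in kernel_sizes
        ])
        feat_dim = num_filters * len(kernel_sizes)
        # 4) Additive attention: scores each position so salient features dominate the pooled vector.
        self.attn = nn.Linear(feat_dim, 1)
        # 5) Final linear layer; Softmax is applied by the cross-entropy loss at training time.
        self.classifier = nn.Linear(feat_dim, num_classes)

    def forward(self, input_ids, attention_mask):
        hidden = self.bert(input_ids, attention_mask=attention_mask).last_hidden_state
        lstm_out, _ = self.bilstm(hidden)                    # (batch, seq_len, 2*lstm_hidden)
        conv_in = lstm_out.transpose(1, 2)                   # (batch, channels, seq_len)
        conv_out = torch.cat([F.relu(c(conv_in)) for c in self.convs], dim=1)
        conv_out = conv_out.transpose(1, 2)                  # (batch, seq_len, feat_dim)
        weights = F.softmax(self.attn(conv_out), dim=1)      # attention weight per position
        pooled = (weights * conv_out).sum(dim=1)             # weighted sum -> (batch, feat_dim)
        return self.classifier(pooled)                       # logits for Softmax / cross-entropy

In training, the logits would typically be paired with nn.CrossEntropyLoss, with input_ids and attention_mask produced by the matching BertTokenizer; whether the authors fine-tuned BERT end to end or froze its weights is not stated in the abstract.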
Pages: 295-298
Number of pages: 4
Related Papers
50 records in total
  • [21] TextCNN-based Text Classification for E-government
    Wu, Suyan
    Su, Entong
    Lei, Binyang
    Wu, Jiangrui
    2019 6TH INTERNATIONAL CONFERENCE ON INFORMATION SCIENCE AND CONTROL ENGINEERING (ICISCE 2019), 2019, : 929 - 934
  • [22] Microblog Text Classification System Based on TextCNN and LSA Model
    Zhang, Weiyu
    Xu, Can
    2020 5TH INTERNATIONAL CONFERENCE ON INFORMATION SCIENCE, COMPUTER TECHNOLOGY AND TRANSPORTATION (ISCTT 2020), 2020, : 469 - 474
  • [23] Academic News Text Classification Model Based on Attention Mechanism and RCNN
    Lin, Ronghua
    Fu, Chengzhou
    Mao, Chengjie
    Wei, Jingmin
    Li, Jianguo
    COMPUTER SUPPORTED COOPERATIVE WORK AND SOCIAL COMPUTING, CHINESECSCW 2018, 2019, 917 : 507 - 516
  • [24] Research on Internet Text Sentiment Classification Based on BERT and CNN-BiGRU
    Wei, Guoli
    2022 11TH INTERNATIONAL CONFERENCE ON COMMUNICATIONS, CIRCUITS AND SYSTEMS (ICCCAS 2022), 2022, : 285 - 289
  • [26] An R-Transformer_BiLSTM Model Based on Attention for Multi-label Text Classification
    Yan, Yaoyao
    Liu, Fang'ai
    Zhuang, Xuqiang
    Ju, Jie
    NEURAL PROCESSING LETTERS, 2023, 55 (02) : 1293 - 1316
  • [27] Research on Chinese News Text Classification Based on ERNIE Model
    Zhang, Wenxu
    PROCEEDINGS OF THE WORLD CONFERENCE ON INTELLIGENT AND 3-D TECHNOLOGIES, WCI3DT 2022, 2023, 323 : 89 - 100
  • [28] NCUEE at MEDIQA 2019: Medical Text Inference Using Ensemble BERT-BiLSTM-Attention Model
    Lee, Lung-Hao
    Lu, Yi
    Chen, Po-Han
    Lee, Po-Lei
    Shyu, Kuo-Kai
    SIGBIOMED WORKSHOP ON BIOMEDICAL NATURAL LANGUAGE PROCESSING (BIONLP 2019), 2019, : 528 - 532
  • [29] BiLSTM Model With Attention Mechanism for Sentiment Classification on Chinese Mixed Text Comments
    Li, Xiaoyan
    Raga, Rodolfo C.
    IEEE ACCESS, 2023, 11 : 26199 - 26210
  • [30] Multi-label Text Classification Model Combining BiLSTM and Hypergraph Attention
    Wang, Xing
    Hu, HuiTing
    Zhu, GuoHua
    2024 4TH INTERNATIONAL CONFERENCE ON COMPUTER COMMUNICATION AND ARTIFICIAL INTELLIGENCE, CCAI 2024, 2024, : 344 - 349