Research on News Text Classification Based on BERT-BiLSTM-TextCNN-Attention

Cited: 0
Authors
Wang, Jia [1 ]
Li, Zongting [2 ]
Ma, Chenyang [2 ]
Affiliations
[1] Dalian Polytech Univ, Dalian 116034, Liaoning, Peoples R China
[2] Dalian Polytech Univ, Sch Informat Sci & Engn, Dalian 116034, Liaoning, Peoples R China
Keywords
Deep learning; text classification; natural language processing; neural network;
DOI
10.1145/3672919.3672973
CLC Number
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104; 0812; 0835; 1405
Abstract
Traditional machine learning models struggle to capture complex features and contextual relationships in text. Although a single deep learning architecture outperforms traditional machine learning in text processing, it cannot capture the full range of textual information [5]. This paper proposes a news text classifier based on BERT-BiLSTM-TextCNN-Attention. The model uses the pre-trained BERT language model to encode the text content and feeds the resulting representations into a BiLSTM layer, which captures sequence information and long-term dependencies for a fuller semantic picture. The BiLSTM output is then passed through a TextCNN layer, whose convolutions extract local semantic features. Finally, an attention mechanism highlights the most salient text features and refines the feature vectors before the Softmax classification layer. Experiments were conducted on a subset of the THUCNews Chinese news text dataset. The BERT-BiLSTM-TextCNN-Attention model achieved 96.48% accuracy, outperforming the benchmark models, which demonstrates its advantage in Chinese news text classification and its ability to extract both deep semantic information and key local features from the text.
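The pipeline described in the abstract (BERT encoding → BiLSTM → TextCNN convolutions → attention pooling → Softmax) can be outlined as a minimal PyTorch sketch. The layer sizes, kernel widths, and the bert-base-chinese checkpoint below are illustrative assumptions, not details taken from the paper.

import torch
import torch.nn as nn
import torch.nn.functional as F
from transformers import BertModel

class BertBiLSTMTextCNNAttention(nn.Module):
    """Sketch of a BERT-BiLSTM-TextCNN-Attention news classifier (dimensions assumed)."""

    def __init__(self, num_classes, lstm_hidden=128, num_filters=64, kernel_sizes=(3, 5, 7)):
        super().__init__()
        # Pre-trained BERT encoder (checkpoint name is an assumption, not from the paper)
        self.bert = BertModel.from_pretrained("bert-base-chinese")
        # BiLSTM over BERT token embeddings captures sequence order and long-term dependencies
        self.bilstm = nn.LSTM(self.bert.config.hidden_size, lstm_hidden,
                              batch_first=True, bidirectional=True)
        # TextCNN: parallel 1-D convolutions extract local n-gram features
        self.convs = nn.ModuleList(
            nn.Conv1d(2 * lstm_hidden, num_filters, k, padding=k // 2) for k in kernel_sizes
        )
        feat_dim = num_filters * len(kernel_sizes)
        # Additive attention scores each time step; a weighted sum gives the final feature vector
        self.attn = nn.Linear(feat_dim, 1)
        self.classifier = nn.Linear(feat_dim, num_classes)

    def forward(self, input_ids, attention_mask):
        h = self.bert(input_ids=input_ids,
                      attention_mask=attention_mask).last_hidden_state   # (B, L, 768)
        h, _ = self.bilstm(h)                                            # (B, L, 2*lstm_hidden)
        h = h.transpose(1, 2)                                            # (B, C, L) for Conv1d
        feats = torch.cat([F.relu(conv(h)) for conv in self.convs], dim=1)
        feats = feats.transpose(1, 2)                                    # (B, L, feat_dim)
        weights = torch.softmax(self.attn(feats), dim=1)                 # attention over time steps
        pooled = (weights * feats).sum(dim=1)                            # (B, feat_dim)
        return self.classifier(pooled)                                   # logits for Softmax

During training, the logits would typically be passed to nn.CrossEntropyLoss, which applies the Softmax internally; tokenization of the THUCNews text would use the matching BertTokenizer.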
Pages: 295 - 298
Page count: 4
Related Papers
50 records in total
  • [1] Research on sentiment classification for netizens based on the BERT-BiLSTM-TextCNN model
    Jiang, Xuchu
    Song, Chao
    Xu, Yucheng
    Li, Ying
    Peng, Yili
    PEERJ COMPUTER SCIENCE, 2022, 8
  • [2] Chinese News Text Classification based on Attention-based CNN-BiLSTM
    Wang, Meng
    Cai, Qiong
    Wang, Liya
    Li, Jun
    Wang, Xiaoke
    MIPPR 2019: PATTERN RECOGNITION AND COMPUTER VISION, 2020, 11430
  • [3] Research on news text classification based on improved BERT-UNet model
    Li, Zeqin
    Liu, Jianwen
    Lin, Jin
    Tan, Deli
    Gong, Ruyue
    Wang, Linglin
    PROCEEDINGS OF INTERNATIONAL CONFERENCE ON MODELING, NATURAL LANGUAGE PROCESSING AND MACHINE LEARNING, CMNM 2024, 2024, : 1 - 7
  • [4] Research on sentiment analysis of hotel review text based on BERT-TCN-BiLSTM-attention model
    Chi, Dianwei
    Huang, Tiantian
    Jia, Zehao
    Zhang, Sining
    ARRAY, 2025, 25
  • [5] Research on Chinese Microblog Sentiment Classification Based on TextCNN-BiLSTM Model
    Tang, Haiqin
    Zhang, Ruirui
    JOURNAL OF INFORMATION PROCESSING SYSTEMS, 2023, 19 (06): 842 - 857
  • [6] Enhancing text classification with attention matrices based on BERT
    Yu, Zhiyi
    Li, Hong
    Feng, Jialin
    EXPERT SYSTEMS, 2024, 41 (03)
  • [7] Research on Intelligent Classification Method of Seismic Information Text Based on BERT-BiLSTM Optimization Algorithm
    Wang Zhonghao
    Li Chenxi
    Huang Meng
    Liu Shuai
    2022 IEEE 2ND INTERNATIONAL CONFERENCE ON COMPUTER COMMUNICATION AND ARTIFICIAL INTELLIGENCE (CCAI 2022), 2022, : 55 - 59
  • [8] Research on Public Service Request Text Classification Based on BERT-BiLSTM-CNN Feature Fusion
    Xiong, Yunpeng
    Chen, Guolian
    Cao, Junkuo
    APPLIED SCIENCES-BASEL, 2024, 14 (14)
  • [9] A Multiscale Interactive Attention Short Text Classification Model Based on BERT
    Zhou, Lu
    Wang, Peng
    Zhang, Huijun
    Wu, Shengbo
    Zhang, Tao
    IEEE ACCESS, 2024, 12 : 160992 - 161001
  • [10] Text Classification Research Based on Bert Model and Bayesian Network
    Liu, Songsong
    Tao, Haijun
    Feng, Shiling
    2019 CHINESE AUTOMATION CONGRESS (CAC2019), 2019, : 5842 - 5846