Research on News Text Classification Based on BERT-BiLSTM-TextCNN-Attention

Cited: 0
|
Authors
Wang, Jia [1 ]
Li, Zongting [2 ]
Ma, Chenyang [2 ]
Affiliations
[1] Dalian Polytech Univ, Dalian 116034, Liaoning, Peoples R China
[2] Dalian Polytech Univ, Sch Informat Sci & Engn, Dalian 116034, Liaoning, Peoples R China
Keywords
Deep learning; text classification; natural language processing; neural network;
DOI
10.1145/3672919.3672973
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Traditional machine learning models struggle to capture complex features and contextual relationships, and while a single deep learning architecture outperforms machine learning methods in text processing, it still falls short of capturing the entirety of textual information [5]. This paper proposes a news text classifier built on BERT-BiLSTM-TextCNN-Attention. The model employs BERT's pre-trained language model to encode the text content, then feeds this representation into a BiLSTM layer, which captures sequence information and long-term dependencies for comprehensive semantic insight. The output then passes through a TextCNN layer, which extracts local semantic cues via convolution. Finally, an attention mechanism highlights the pivotal text features, refining the feature vectors for classification by the Softmax layer. Experiments were conducted on a subset of the THUCNews Chinese news text dataset. The results show that the BERT-BiLSTM-TextCNN-Attention model achieved 96.48% accuracy, outperforming the other benchmarks, which underscores its advantage in Chinese news text classification and validates its ability to extract both deep semantic nuances and crucial local features from the text.
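The pipeline described in the abstract (BERT encoder → BiLSTM → TextCNN → attention pooling → Softmax classifier) can be sketched in PyTorch as below. This is a minimal sketch, not the authors' implementation: the dimensions, kernel sizes, and hidden sizes are illustrative assumptions, and a randomly initialised embedding stands in for the pre-trained BERT encoder (which would normally supply 768-dimensional contextual token vectors, e.g. via Hugging Face's `BertModel`).

```python
import torch
import torch.nn as nn

class BertBiLSTMTextCNNAttention(nn.Module):
    """Sketch of the BERT-BiLSTM-TextCNN-Attention pipeline.

    A plain nn.Embedding is a placeholder for the pre-trained BERT
    encoder; all hyperparameters are illustrative assumptions.
    """

    def __init__(self, vocab_size=21128, bert_dim=768, lstm_hidden=128,
                 num_filters=64, kernel_sizes=(2, 3, 4), num_classes=10):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, bert_dim)   # stand-in for BERT
        self.bilstm = nn.LSTM(bert_dim, lstm_hidden,
                              batch_first=True, bidirectional=True)
        # TextCNN branch: parallel 1-D convolutions over the BiLSTM output;
        # padding='same' keeps every branch at the original sequence length.
        self.convs = nn.ModuleList(
            nn.Conv1d(2 * lstm_hidden, num_filters, k, padding="same")
            for k in kernel_sizes)
        feat_dim = num_filters * len(kernel_sizes)
        self.attn = nn.Linear(feat_dim, 1)                # additive attention score
        self.fc = nn.Linear(feat_dim, num_classes)        # logits for Softmax/CE

    def forward(self, token_ids):
        x = self.embed(token_ids)                 # (B, T, bert_dim)
        x, _ = self.bilstm(x)                     # (B, T, 2*lstm_hidden)
        x = x.transpose(1, 2)                     # (B, C, T) for Conv1d
        x = torch.cat([torch.relu(c(x)) for c in self.convs], dim=1)
        x = x.transpose(1, 2)                     # (B, T, feat_dim)
        w = torch.softmax(self.attn(x), dim=1)    # attention weights over time
        pooled = (w * x).sum(dim=1)               # attention-weighted feature vector
        return self.fc(pooled)                    # class logits
```

Cross-entropy loss applied to the returned logits subsumes the final Softmax layer at training time; `torch.softmax(logits, dim=-1)` recovers class probabilities at inference.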
Pages: 295-298
Number of pages: 4
Related Papers
50 records in total
  • [41] Tatt-BiLSTM: Web service classification with topical attention-based BiLSTM
    Kang, Guosheng
    Xiao, Yong
    Liu, Jianxun
    Cao, Yingcheng
    Cao, Buqing
    Zhang, Xiangping
    Ding, Linghang
    CONCURRENCY AND COMPUTATION-PRACTICE & EXPERIENCE, 2021, 33 (16):
  • [42] Text Classification Research with Attention-based Recurrent Neural Networks
    Du, C.
    Huang, L.
    INTERNATIONAL JOURNAL OF COMPUTERS COMMUNICATIONS & CONTROL, 2018, 13 (01) : 50 - 61
  • [43] Fake news detection and classification using hybrid BiLSTM and self-attention model
    Mohapatra, Asutosh
    Thota, Nithin
    Prakasam, P.
    MULTIMEDIA TOOLS AND APPLICATIONS, 2022, 81 (13) : 18503 - 18519
  • [44] Fake news detection and classification using hybrid BiLSTM and self-attention model
    Asutosh Mohapatra
    Nithin Thota
    P. Prakasam
    Multimedia Tools and Applications, 2022, 81 : 18503 - 18519
  • [45] An Effective Personality-Based Model for Short Text Sentiment Classification Using BiLSTM and Self-Attention
    Liu, Kejian
    Feng, Yuanyuan
    Zhang, Liying
    Wang, Rongju
    Wang, Wei
    Yuan, Xianzhi
    Cui, Xuran
    Li, Xianyong
    Li, Hailing
    ELECTRONICS, 2023, 12 (15)
  • [46] Attention-based C-BiLSTM for fake news detection
    Trueman, Tina Esther
    Kumar, J. Ashok
    Narayanasamy, P.
    Vidya, J.
    APPLIED SOFT COMPUTING, 2021, 110 (110)
  • [47] The Study on the Text Classification Based on Graph Convolutional Network and BiLSTM
    Xue, Bingxin
    Zhu, Cui
    Wang, Xuan
    Zhu, Wenjun
    APPLIED SCIENCES-BASEL, 2022, 12 (16):
  • [48] Text Mining of Power Secondary Equipment Based on BiLSTM-Attention
    Chen, Kai
    Nan, Dongliang
    Sun, Yonghui
    Wang, Kaike
    PROCEEDINGS OF THE 32ND 2020 CHINESE CONTROL AND DECISION CONFERENCE (CCDC 2020), 2020, : 709 - 714
  • [49] MTBERT-Attention: An Explainable BERT Model based on Multi-Task Learning for Cognitive Text Classification
    Sebbaq, Hanane
    El Faddouli, Nour-Eddine
    SCIENTIFIC AFRICAN, 2023, 21
  • [50] Research on News Text Classification Based on Deep Learning Convolutional Neural Network
    Zhu, Yunlong
    WIRELESS COMMUNICATIONS & MOBILE COMPUTING, 2021, 2021