BiLGAT: Bidirectional lattice graph attention network for Chinese short text classification

Cited by: 5
Authors
Lyu, Penghao [1 ]
Rao, Guozheng [2 ]
Zhang, Li [3 ]
Cong, Qing [2 ]
Institutions
[1] Tianjin Univ, Int Engn Inst, Tianjin 300350, Peoples R China
[2] Tianjin Univ, Coll Intelligence & Comp, Tianjin 300350, Peoples R China
[3] Tianjin Univ Sci & Technol, Sch Econ & Management, Tianjin 300350, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Graph representation learning; Lattice embedding graph; Chinese short text classification; Graph attention networks; Pretrained language models;
DOI
10.1007/s10489-023-04700-7
CLC Classification
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Chinese short text classification approaches based on lexicon information and pretrained language models have yielded state-of-the-art results. However, they use the pretrained language model merely as an embedding layer while fusing lexicon features, exploiting the full advantages of neither. In this paper, we propose a new model, the bidirectional lattice graph attention network (BiLGAT). It enhances character representations by aggregating the features of different hidden states of BERT. The lexicon features in the lattice graph are fused into the character features through the representational power of a graph attention network, which simultaneously avoids the propagation of word segmentation errors. Experimental results on three Chinese short text classification datasets demonstrate the superior performance of this method: 94.75% accuracy on THUCNEWS, 70.71% on TNEWS, and 86.49% on CNT.
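The fusion step the abstract describes can be illustrated in miniature: each character node in the lattice graph attends over the lexicon word nodes that cover it, then aggregates their features with a residual connection. The sketch below is a pure-Python simplification under stated assumptions; the function names, the dot-product attention scoring, and the residual aggregation are illustrative stand-ins, not the authors' actual BiLGAT layer (which uses learned, multi-head graph attention).

```python
import math

def softmax(scores):
    # Numerically stable softmax over a list of raw attention scores.
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def fuse_lexicon_features(char_vec, word_vecs):
    """Attention-style fusion for one lattice-graph character node.

    char_vec:  feature vector of the character node (e.g., from BERT).
    word_vecs: feature vectors of the lexicon words covering this
               character in the lattice graph.

    A real GAT layer would compute scores with learned projections and
    multiple heads; here a plain dot product stands in for the score.
    """
    if not word_vecs:
        return char_vec  # no matched lexicon words: keep the character feature
    weights = softmax([dot(char_vec, w) for w in word_vecs])
    fused = [sum(a * w[i] for a, w in zip(weights, word_vecs))
             for i in range(len(char_vec))]
    # Residual connection: the original character feature is preserved,
    # so a bad lexicon match cannot erase the BERT representation.
    return [c + f for c, f in zip(char_vec, fused)]
```

Because every candidate word in the lattice contributes through a soft attention weight (rather than one hard segmentation being chosen), a wrong segmentation path is merely down-weighted instead of propagated, which is the error-propagation point made in the abstract.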
Pages: 22405 - 22414 (10 pages)
Related Papers
50 entries total
  • [31] Chinese text dual attention network for aspect-level sentiment classification
    Sun, Xinjie
    Liu, Zhifang
    Li, Hui
    Ying, Feng
    Tao, Yu
    PLOS ONE, 2024, 19 (03):
  • [32] Text Level Graph Neural Network for Text Classification
    Huang, Lianzhe
    Ma, Dehong
    Li, Sujian
    Zhang, Xiaodong
    Wang, Houfeng
    2019 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING AND THE 9TH INTERNATIONAL JOINT CONFERENCE ON NATURAL LANGUAGE PROCESSING (EMNLP-IJCNLP 2019): PROCEEDINGS OF THE CONFERENCE, 2019, : 3444 - 3450
  • [33] A Hybrid Bidirectional Recurrent Convolutional Neural Network Attention-Based Model for Text Classification
    Zheng, Jin
    Zheng, Limin
    IEEE ACCESS, 2019, 7 : 106673 - 106685
  • [34] Review of Chinese Short Text Classification
    Wu, Fenlin
    Gou, Jin
    Wang, Cheng
    INDUSTRIAL INSTRUMENTATION AND CONTROL SYSTEMS II, PTS 1-3, 2013, 336-338 : 2171 - +
  • [35] A Gated Graph Neural Network With Attention for Text Classification Based on Coupled P Systems
    Zhang, Jiaqi
    Liu, Xiyu
    IEEE ACCESS, 2023, 11 : 72448 - 72461
  • [36] A Chinese text classification algorithm based on graph space model
    Jia, Xiaoqiang
    BioTechnology: An Indian Journal, 2013, 8 (06) : 787 - 794
  • [37] Bidirectional LSTM with attention mechanism and convolutional layer for text classification
    Liu, Gang
    Guo, Jiabao
    NEUROCOMPUTING, 2019, 337 : 325 - 338
  • [38] Topic-aware cosine graph convolutional neural network for short text classification
    Min C.
    Chu Y.
    Lin H.
    Wang B.
    Yang L.
    Xu B.
    Soft Computing, 2024, 28 (13-14) : 8119 - 8132
  • [39] Document and Word Representations Generated by Graph Convolutional Network and BERT for Short Text Classification
    Ye, Zhihao
    Jiang, Gongyao
    Liu, Ye
    Li, Zhiyong
    Yuan, Jin
    ECAI 2020: 24TH EUROPEAN CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2020, 325 : 2275 - 2281
  • [40] Joint Training Graph Neural Network for the Bidding Project Title Short Text Classification
    Li, Shengnan
    Wu, Xiaoming
    Liu, Xiangzhi
    Xue, Xuqiang
    Yu, Yang
    WEB AND BIG DATA, PT I, APWEB-WAIM 2023, 2024, 14331 : 252 - 267