A Heterogeneous Directed Graph Attention Network for inductive text classification using multilevel semantic embeddings

Cited by: 2
Authors
Lin, Mu [1]
Wang, Tao [1]
Zhu, Yifan [1]
Li, Xiaobo [1]
Zhou, Xin [1]
Wang, Weiping [1]
Affiliations
[1] Natl Univ Def Technol, Coll Syst Engn, Changsha 410073, Hunan, Peoples R China
Keywords
Text classification; Multilevel semantics; Graph Neural Networks; Graph Attention Networks; Text segmentation; Sentence-transformer; MODEL;
DOI
10.1016/j.knosys.2024.111797
CLC classification number
TP18 [Artificial Intelligence Theory];
Discipline classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
In this study, a novel network model is proposed for text classification based on Graph Attention Networks (GATs) and sentence-transformer embeddings. Most existing methods that use a pretrained model as the input layer still treat words as the minimum processing unit. However, word embeddings are neither efficient nor appropriate for long texts containing many domain-specific terms. This study aims to design a model capable of handling text classification tasks at multiple levels of semantic segmentation. The main contribution of this study is a novel GAT variant that uses global nodes and Squeeze-and-Excitation Networks (SENet) to capture semantic information. Moreover, a novel unidirectional attention mechanism is introduced to prevent irrelevant, noisy information from being passed among the global nodes. The numerical results show that, depending on the characteristics of the datasets, specific combinations of semantic information can effectively improve classification accuracy and performance. Without fine-tuning the pretrained encoder, new state-of-the-art performance is achieved on three benchmark datasets. In addition, a comprehensive analysis of the model's graph attention mechanism on specific cases suggests that the unidirectional attention mechanism and the use of global nodes are the key contributing factors to multilevel semantic fusion.
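To make the two mechanisms mentioned in the abstract more concrete, the following is a minimal, hypothetical sketch (in PyTorch) of how a masked "unidirectional" graph attention step and a Squeeze-and-Excitation-style channel gate could be combined. It is not the authors' released implementation; the function names, tensor shapes, and the exact masking convention are illustrative assumptions.

```python
# Hypothetical illustration only: one masked ("unidirectional") graph attention
# step plus an SE-style channel gate. Names, shapes, and the masking convention
# are assumptions, not the paper's released code.
import torch
import torch.nn.functional as F

def unidirectional_gat_layer(h, adj, allow_mask, W, a):
    """h: (N, F_in) features of word/sentence/global nodes;
    adj: (N, N) adjacency of the heterogeneous text graph;
    allow_mask: (N, N) with 1 where attention is permitted (e.g. edges among
    global nodes zeroed out to stop noisy message passing between them);
    W: (F_in, F_out) shared projection; a: (2*F_out,) attention vector."""
    z = h @ W                                             # shared projection
    f = z.size(1)
    # e_ij = LeakyReLU(a^T [z_i || z_j]), computed via the usual split of a
    e = F.leaky_relu(
        (z @ a[:f]).unsqueeze(1) + (z @ a[f:]).unsqueeze(0), negative_slope=0.2
    )
    mask = (adj * allow_mask) > 0                         # existing AND allowed
    e = e.masked_fill(~mask, float("-inf"))
    alpha = torch.nan_to_num(torch.softmax(e, dim=1))     # attention weights
    return F.elu(alpha @ z)                               # aggregated features

def se_gate(x, W1, W2):
    """Squeeze-and-Excitation-style gate: squeeze by averaging over nodes,
    excite through a bottleneck, then reweight the feature channels."""
    s = x.mean(dim=0)                                     # squeeze: (F_out,)
    g = torch.sigmoid(W2 @ torch.relu(W1 @ s))            # excitation: (F_out,)
    return x * g

# Toy usage: 4 local nodes + 1 global node; block the global node's self edge.
N, F_in, F_out, r = 5, 8, 16, 4
h = torch.randn(N, F_in)
adj = torch.ones(N, N)
allow = torch.ones(N, N)
allow[4, 4] = 0                                           # mask global->global
out = unidirectional_gat_layer(h, adj, allow,
                               torch.randn(F_in, F_out), torch.randn(2 * F_out))
out = se_gate(out, torch.randn(F_out // r, F_out), torch.randn(F_out, F_out // r))
```

The point of the sketch is that the directionality can be enforced entirely through the attention mask: global nodes still aggregate multilevel semantics from word- and sentence-level nodes, while the edges that would spread information among the global nodes themselves are suppressed.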
Pages: 16