Gapformer: Graph Transformer with Graph Pooling for Node Classification

Citations: 0
Authors
Liu, Chuang [1 ]
Zhan, Yibing [2 ]
Ma, Xueqi [3 ]
Ding, Liang [2 ]
Tao, Dapeng [4 ,5 ]
Wu, Jia [6 ]
Hu, Wenbin [1 ]
Affiliations
[1] Wuhan Univ, Sch Comp Sci, Wuhan, Peoples R China
[2] JD Com, JD Explore Acad, Beijing, Peoples R China
[3] Univ Melbourne, Sch Comp & Informat Syst, Melbourne, Australia
[4] Yunnan Univ, Sch Comp Sci, Kunming, Peoples R China
[5] Yunnan Key Lab Media Convergence, Kunming, Peoples R China
[6] Macquarie Univ, Sch Comp, Sydney, Australia
Keywords: (none listed)
DOI: not available
CLC classification: TP18 [Artificial Intelligence Theory]
Discipline codes: 081104; 0812; 0835; 1405
Abstract
Graph Transformers (GTs) have proved their advantage in graph-level tasks. However, existing GTs still perform unsatisfactorily on the node classification task due to 1) the overwhelming unrelated information obtained from a vast number of irrelevant distant nodes and 2) the quadratic complexity regarding the number of nodes via the fully connected attention mechanism. In this paper, we present Gapformer, a method for node classification that deeply incorporates Graph Transformer with Graph Pooling. More specifically, Gapformer coarsens the large-scale nodes of a graph into a smaller number of pooling nodes via local or global graph pooling methods, and then computes the attention solely with the pooling nodes rather than all other nodes. In such a manner, the negative influence of the overwhelming unrelated nodes is mitigated while maintaining the long-range information, and the quadratic complexity is reduced to linear complexity with respect to the fixed number of pooling nodes. Extensive experiments on 13 node classification datasets, including homophilic and heterophilic graph datasets, demonstrate the competitive performance of Gapformer over existing Graph Neural Networks and GTs.
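The core idea in the abstract — let every node attend to a small, fixed set of k pooling nodes instead of all N nodes, so attention costs O(Nk) rather than O(N^2) — can be sketched as follows. This is a minimal NumPy illustration, not the authors' implementation: the soft-assignment matrix `S` stands in for whichever local or global pooling method produces the coarsened nodes, and the weight matrices `Wq`, `Wk`, `Wv` are ordinary attention projections.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def pooled_attention(X, Wq, Wk, Wv, S):
    """Attention over k pooling nodes instead of all N nodes.

    X: (N, d) node features
    S: (N, k) scores assigning the N nodes to k pooling nodes
       (a stand-in for any local/global graph pooling method)
    Cost is O(N * k * d) rather than O(N^2 * d).
    """
    # Coarsen: soft-assign the N nodes into k pooling nodes.
    pooled = softmax(S, axis=0).T @ X            # (k, d)
    # Queries come from every node; keys/values only from pooling nodes.
    Q, K, V = X @ Wq, pooled @ Wk, pooled @ Wv
    attn = softmax(Q @ K.T / np.sqrt(X.shape[1]), axis=-1)  # (N, k)
    return attn @ V                               # (N, d)

rng = np.random.default_rng(0)
N, k, d = 100, 8, 16
X = rng.standard_normal((N, d))
S = rng.standard_normal((N, k))
Wq, Wk, Wv = (rng.standard_normal((d, d)) for _ in range(3))
out = pooled_attention(X, Wq, Wk, Wv, S)
print(out.shape)  # (100, 16)
```

Because k is fixed, the attention map is (N, k) instead of (N, N), which is the source of the linear complexity claim; long-range information survives because the pooling nodes summarize the whole graph.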
Pages: 2196-2205 (10 pages)