Improved Modeling and Generalization Capabilities of Graph Neural Networks With Legendre Polynomials

Cited: 1
Authors
Chen, Jiali [1 ]
Xu, Liwen [1 ]
Affiliations
[1] North China Univ Technol, Coll Sci, Beijing 100144, Peoples R China
Source
IEEE ACCESS | 2023, Vol. 11
Keywords
LegendreNet; graph neural networks; high-order dependencies; Legendre polynomials; robustness; spectral filtering;
DOI
10.1109/ACCESS.2023.3289002
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology];
Subject Classification Code
0812;
Abstract
LegendreNet is a novel graph neural network (GNN) model that addresses the stability issues of traditional GNN models such as ChebNet while capturing higher-order dependencies in graph data more effectively. Compared to traditional GNN models such as GCN, LegendreNet is better equipped to handle large-scale graph data and demonstrates superior performance on such datasets. Furthermore, the Legendre polynomials form a complete set of orthogonal polynomials and can approximate any continuous function on a bounded interval to arbitrary precision. When applied to graph neural networks, they therefore provide a more precise and stable means of fitting spectral filters to graph data. This enables LegendreNet to capture graph features more accurately on complex graph data and to exhibit greater robustness under adversarial attacks. Compared to traditional GNN methods, LegendreNet offers improved modeling and generalization capabilities, making it a more effective solution across a variety of graph-data applications. Our experiments demonstrate that the model outperforms state-of-the-art methods on large-scale graph datasets. The code for LegendreNet is available at https://github.com/12chen20/LegendreNet.
Pages: 63442-63450
Number of pages: 9
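
The abstract above describes replacing ChebNet's Chebyshev basis with Legendre polynomials when building spectral graph filters. The following is a minimal sketch of that idea, not the authors' released implementation (see the GitHub link in the abstract for the official code): the layer name LegendreConv, the per-order weight matrices, and the assumption that the normalized Laplacian is rescaled to the interval [-1, 1] (as in ChebNet) are all illustrative choices.

# Sketch of a Legendre-polynomial spectral graph convolution (PyTorch).
# Illustrative only; uses the three-term recurrence
#   (k + 1) P_{k+1}(x) = (2k + 1) x P_k(x) - k P_{k-1}(x)
# applied to a rescaled graph Laplacian instead of a scalar x.
import torch
import torch.nn as nn


class LegendreConv(nn.Module):  # hypothetical layer name, not from the paper's code
    def __init__(self, in_dim: int, out_dim: int, K: int = 3):
        super().__init__()
        self.K = K
        # One weight matrix per polynomial order, mirroring ChebNet-style filters.
        self.weights = nn.ParameterList(
            [nn.Parameter(torch.randn(in_dim, out_dim) * 0.01) for _ in range(K + 1)]
        )

    def forward(self, x: torch.Tensor, l_tilde: torch.Tensor) -> torch.Tensor:
        # x:       [N, in_dim] node features
        # l_tilde: [N, N] Laplacian rescaled so its spectrum lies in [-1, 1],
        #          e.g. 2 * L / lambda_max - I (assumption borrowed from ChebNet).
        z_prev, z_curr = x, l_tilde @ x  # P_0(L~) x and P_1(L~) x
        out = z_prev @ self.weights[0]
        if self.K >= 1:
            out = out + z_curr @ self.weights[1]
        for k in range(1, self.K):
            # Legendre recurrence applied to the propagated node features.
            z_next = ((2 * k + 1) * (l_tilde @ z_curr) - k * z_prev) / (k + 1)
            out = out + z_next @ self.weights[k + 1]
            z_prev, z_curr = z_curr, z_next
        return out


if __name__ == "__main__":
    # Toy usage on a random 5-node graph.
    n, d_in, d_out = 5, 8, 4
    adj = (torch.rand(n, n) > 0.5).float()
    adj = ((adj + adj.t()) > 0).float()
    adj.fill_diagonal_(0)
    deg = adj.sum(dim=1)
    d_inv_sqrt = torch.diag(deg.clamp(min=1).pow(-0.5))
    lap = torch.eye(n) - d_inv_sqrt @ adj @ d_inv_sqrt  # normalized Laplacian, spectrum in [0, 2]
    l_tilde = lap - torch.eye(n)                        # crude rescale to [-1, 1], assuming lambda_max ~= 2
    layer = LegendreConv(d_in, d_out, K=3)
    print(layer(torch.randn(n, d_in), l_tilde).shape)   # torch.Size([5, 4])

The design choice sketched here, a separate learnable weight matrix per polynomial order, is one common way to parameterize polynomial spectral filters; the paper's actual parameterization may differ.
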
Related Papers
50 records in total
  • [21] Finding core labels for maximizing generalization of graph neural networks
    Fu, Sichao
    Ma, Xueqi
    Zhan, Yibing
    You, Fanyu
    Peng, Qinmu
    Liu, Tongliang
    Bailey, James
    Mandic, Danilo
    NEURAL NETWORKS, 2024, 180
  • [22] An Algebraic Generalization for Graph and Tensor-Based Neural Networks
    Jackson, Ethan C.
    Hughes, James Alexander
    Daley, Mark
    Winter, Michael
    2017 IEEE CONFERENCE ON COMPUTATIONAL INTELLIGENCE IN BIOINFORMATICS AND COMPUTATIONAL BIOLOGY (CIBCB), 2017, : 182 - 189
  • [23] Improved generalization performance of convolutional neural networks with LossDA
    Liu, Juncheng
    Zhao, Yili
    APPLIED INTELLIGENCE, 2023, 53 (11) : 13852 - 13866
  • [24] PRUNING RECURRENT NEURAL NETWORKS FOR IMPROVED GENERALIZATION PERFORMANCE
    GILES, CL
    OMLIN, CW
    IEEE TRANSACTIONS ON NEURAL NETWORKS, 1994, 5 (05): : 848 - 851
  • [26] Timing Macro Modeling with Graph Neural Networks
    Chang, Kevin Kai-Chun
    Chiang, Chun-Yao
    Lee, Pei-Yu
    Jiang, Iris Hui-Ru
    PROCEEDINGS OF THE 59TH ACM/IEEE DESIGN AUTOMATION CONFERENCE, DAC 2022, 2022, : 1219 - 1224
  • [27] Modeling IoT Equipment With Graph Neural Networks
    Zhang, Weishan
    Zhang, Yafei
    Xu, Liang
    Zhou, Jiehan
    Liu, Yan
    Guis, Mu
    Liu, Xin
    Yang, Su
    IEEE ACCESS, 2019, 7 : 32754 - 32764
  • [28] Topological Data Mapping for Improved Generalization Capabilities using Counter Propagation Networks
    Madokoro, H.
    Sato, K.
    JOURNAL OF COMPUTERS, 2012, 7 (11) : 2655 - 2662
  • [29] Interaction of Generalization and Out-of-Distribution Detection Capabilities in Deep Neural Networks
    Aboitiz, Francisco Javier Klaiber
    Legenstein, Robert
    Oezdenizci, Ozan
    ARTIFICIAL NEURAL NETWORKS AND MACHINE LEARNING, ICANN 2023, PART X, 2023, 14263 : 248 - 259
  • [30] Learning Invariant Representations of Graph Neural Networks via Cluster Generalization
    Xia, Donglin
    Wang, Xiao
    Liu, Nian
    Shi, Chuan
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 36 (NEURIPS 2023), 2023,