Improved Modeling and Generalization Capabilities of Graph Neural Networks With Legendre Polynomials

Cited by: 1
Authors
Chen, Jiali [1]
Xu, Liwen [1]
Affiliations
[1] North China Univ Technol, Coll Sci, Beijing 100144, Peoples R China
Source
IEEE ACCESS | 2023, Vol. 11
Keywords
LegendreNet; graph neural networks; high-order dependencies; Legendre polynomials; robustness; spectral filtering;
DOI
10.1109/ACCESS.2023.3289002
CLC Number
TP [Automation Technology, Computer Technology];
Subject Classification Number
0812;
Abstract
LegendreNet is a novel graph neural network (GNN) model that addresses stability issues present in traditional GNN models such as ChebNet, while also capturing higher-order dependencies within graph data more effectively. Compared to traditional GNN models such as GCN, LegendreNet is better equipped to handle large-scale graph data, demonstrating superior performance on such datasets. Furthermore, Legendre polynomials form a complete set of orthogonal polynomials and can approximate any function to arbitrary precision on a bounded interval. When applied to graph neural networks, they therefore provide a more precise and stable means of fitting spectral filters to graph data. This enables LegendreNet to capture graph features more accurately on complex graph data and to exhibit greater robustness under adversarial attacks. Compared to traditional GNN methods, LegendreNet offers improved modeling and generalization capabilities, making it a more effective solution across a variety of graph data applications. Our experiments demonstrate that the model outperforms state-of-the-art methods on large-scale graph datasets. The code for LegendreNet is available at https://github.com/12chen20/LegendreNet.
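To make the spectral-filtering idea in the abstract concrete, the sketch below shows one way a Legendre-polynomial graph filter could be written in PyTorch, using the three-term Legendre recurrence on a Laplacian rescaled so its eigenvalues lie in [-1, 1] (the same rescaling ChebNet applies for Chebyshev polynomials). This is only an illustrative assumption about the layer structure: the class name LegendreConv, the order parameter K, the dense lap_hat argument, and the per-order weight matrices are introduced here for exposition and are not taken from the paper or the linked repository.

```python
import torch
import torch.nn as nn


class LegendreConv(nn.Module):
    """Illustrative graph convolution with a Legendre-polynomial spectral filter.

    Computes sum_k P_k(L_hat) @ X @ W_k, where P_k are Legendre polynomials
    and L_hat = 2 L / lambda_max - I has eigenvalues in [-1, 1].
    """

    def __init__(self, in_dim, out_dim, K=3):
        super().__init__()
        self.K = K
        # One weight matrix per polynomial order, mirroring ChebNet-style filters.
        self.weights = nn.ParameterList(
            [nn.Parameter(torch.empty(in_dim, out_dim)) for _ in range(K + 1)]
        )
        for w in self.weights:
            nn.init.xavier_uniform_(w)

    def forward(self, x, lap_hat):
        # x: (N, in_dim) node features; lap_hat: (N, N) rescaled Laplacian.
        p_prev, p_curr = x, lap_hat @ x          # P_0(L_hat) x and P_1(L_hat) x
        out = p_prev @ self.weights[0]
        if self.K >= 1:
            out = out + p_curr @ self.weights[1]
        for n in range(1, self.K):
            # Legendre recurrence: (n + 1) P_{n+1} = (2n + 1) x P_n - n P_{n-1}
            p_next = ((2 * n + 1) * (lap_hat @ p_curr) - n * p_prev) / (n + 1)
            out = out + p_next @ self.weights[n + 1]
            p_prev, p_curr = p_curr, p_next
        return out


# Hypothetical usage: the identity matrix stands in for a real rescaled Laplacian.
conv = LegendreConv(in_dim=16, out_dim=32, K=3)
h = conv(torch.randn(100, 16), torch.eye(100))
```

In practice a sparse Laplacian, normalization, and nonlinearities would be added; the authors' repository at the URL above contains the actual implementation.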
Pages: 63442-63450
Number of Pages: 9
Related Papers
50 records in total
  • [31] Training feedforward neural networks: An algorithm giving improved generalization
    Lee, CW
    NEURAL NETWORKS, 1997, 10 (01) : 61 - 68
  • [32] CMAC neural network with improved generalization property for system modeling
    Horváth, G
    Szabó, T
    IMTC 2002: PROCEEDINGS OF THE 19TH IEEE INSTRUMENTATION AND MEASUREMENT TECHNOLOGY CONFERENCE, VOLS 1 & 2, 2002, : 1603 - 1608
  • [33] GENERALIZATION AND APPROXIMATION CAPABILITIES OF MULTILAYER NETWORKS
    TAKAHASHI, Y
    NEURAL COMPUTATION, 1993, 5 (01) : 132 - 139
  • [34] Recommendation system based on improved graph neural networks
    Chen, Jiawen
    Cai, Chao
    Cai, Yong
    Yan, Fangbin
    Li, Jiayi
    INTERNATIONAL JOURNAL OF WIRELESS AND MOBILE COMPUTING, 2024, 27 (03) : 290 - 296
  • [35] Modeling gait transitions of quadrupeds and their generalization with CMAC neural networks
    Lin, JN
    Song, SM
    IEEE TRANSACTIONS ON SYSTEMS MAN AND CYBERNETICS PART C-APPLICATIONS AND REVIEWS, 2002, 32 (03): : 177 - 189
  • [36] Modeling TCP Performance using Graph Neural Networks
    Jaeger, Benedikt
    Helm, Max
    Schwegmann, Lars
    Carle, Georg
    PROCEEDINGS OF THE 1ST INTERNATIONAL WORKSHOP ON GRAPH NEURAL NETWORKING, GNNET 2022, 2022, : 18 - 23
  • [37] GENERALIZATION BY NEURAL NETWORKS
    SHEKHAR, S
    AMIN, MB
    IEEE TRANSACTIONS ON KNOWLEDGE AND DATA ENGINEERING, 1992, 4 (02) : 177 - 185
  • [38] On generalization by neural networks
    Kak, SC
    INFORMATION SCIENCES, 1998, 111 (1-4) : 293 - 302
  • [39] xNet: Modeling Network Performance With Graph Neural Networks
    Huang, Sijiang
    Wei, Yunze
    Peng, Lingfeng
    Wang, Mowei
    Hui, Linbo
    Liu, Peng
    Du, Zongpeng
    Liu, Zhenhua
    Cui, Yong
    IEEE-ACM TRANSACTIONS ON NETWORKING, 2024, 32 (02) : 1753 - 1767
  • [40] Wasserstein Barycenter Matching for Graph Size Generalization of Message Passing Neural Networks
    Chu, Xu
    Jin, Yujie
    Wang, Xin
    Zhang, Shanghang
    Wang, Yasha
    Zhu, Wenwu
    Mei, Hong
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 202, 2023, 202