T-HyperGNNs: Hypergraph Neural Networks via Tensor Representations

Cited by: 8
Authors
Wang, Fuli [1 ]
Pena-Pena, Karelia [2 ]
Qian, Wei [3 ]
Arce, Gonzalo R. [2 ]
Affiliations
[1] Univ Delaware, Inst Financial Serv Analyt, Newark, DE 19716 USA
[2] Univ Delaware, Dept Elect & Comp Engn, Newark, DE 19716 USA
[3] Univ Delaware, Dept Appl Econ & Stat, Newark, DE 19716 USA
Funding
U.S. National Science Foundation
Keywords
Convolution; hypergraphs; message passing; neural networks; tensors;
DOI
10.1109/TNNLS.2024.3371382
CLC classification
TP18 [Artificial Intelligence Theory]
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Hypergraph neural networks (HyperGNNs) are a family of deep neural networks designed to perform inference on hypergraphs. HyperGNNs follow either a spectral or a spatial approach, in which a convolution or message-passing operation is conducted based on a hypergraph algebraic descriptor. While many HyperGNNs have been proposed and achieved state-of-the-art performance on broad applications, there have been limited attempts at exploring high-dimensional hypergraph descriptors (tensors) and joint node interactions carried by hyperedges. In this article, we depart from hypergraph matrix representations and present a new tensor-HyperGNN (T-HyperGNN) framework with cross-node interactions (CNIs). The T-HyperGNN framework consists of T-spectral convolution, T-spatial convolution, and T-message-passing HyperGNNs (T-MPHN). The T-spectral convolution HyperGNN is defined under the t-product algebra that closely connects to the spectral space. To improve computational efficiency for large hypergraphs, we localize the T-spectral convolution approach to formulate the T-spatial convolution and further devise a novel tensor-message-passing algorithm for practical implementation by studying a compressed adjacency tensor representation. Compared to the state-of-the-art approaches, our T-HyperGNNs preserve intrinsic high-order network structures without any hypergraph reduction and model the joint effects of nodes through a CNI layer. These advantages of our T-HyperGNNs are demonstrated in a wide range of real-world hypergraph datasets. The implementation code is available at https://github.com/wangfuli/T-HyperGNNs.git.
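As a rough illustration of the t-product algebra the abstract builds on (a minimal NumPy sketch, not the authors' implementation; see the linked repository for that), the tensor-tensor t-product multiplies two third-order tensors facewise in the Fourier domain along the third mode, which is why it connects naturally to a spectral convolution:

```python
import numpy as np

def t_product(A, B):
    """t-product of third-order tensors via FFT along the third mode.

    A: (n1, n2, n3), B: (n2, n4, n3) -> C: (n1, n4, n3).
    Equivalent to block-circulant matrix multiplication; the FFT
    diagonalizes the circulant structure, so the product reduces
    to independent matrix products on each frontal slice.
    """
    n3 = A.shape[2]
    A_hat = np.fft.fft(A, axis=2)   # transform tubes to the spectral domain
    B_hat = np.fft.fft(B, axis=2)
    C_hat = np.empty((A.shape[0], B.shape[1], n3), dtype=complex)
    for k in range(n3):             # facewise matrix products
        C_hat[:, :, k] = A_hat[:, :, k] @ B_hat[:, :, k]
    return np.real(np.fft.ifft(C_hat, axis=2))
```

With `n3 = 1` the FFT is the identity and the t-product reduces to an ordinary matrix product, which is one way to sanity-check an implementation.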
Pages: 1-15 (15 pages)
Related papers
50 records in total
  • [31] Learning Invariant Representations of Graph Neural Networks via Cluster Generalization
    Xia, Donglin
    Wang, Xiao
    Liu, Nian
    Shi, Chuan
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 36 (NEURIPS 2023), 2023,
  • [32] Efficient and Compact Representations of Deep Neural Networks via Entropy Coding
    Marino, Giosue Cataldo
    Furia, Flavio
    Malchiodi, Dario
    Frasca, Marco
    IEEE ACCESS, 2023, 11 : 106103 - 106125
  • [33] Stress Representations for Tensor Basis Neural Networks: Alternative Formulations to Finger-Rivlin-Ericksen
    Fuhg, Jan N.
    Bouklas, Nikolaos
    Jones, Reese E.
    JOURNAL OF COMPUTING AND INFORMATION SCIENCE IN ENGINEERING, 2024, 24 (11)
  • [34] Dynamic hypergraph neural networks based on key hyperedges
    Kang, Xiaojun
    Li, Xinchuan
    Yao, Hong
    Li, Dan
    Jiang, Bo
    Peng, Xiaoyue
    Wu, Tiejun
    Qi, Shihua
    Dong, Lijun
    INFORMATION SCIENCES, 2022, 616 : 37 - 51
  • [35] Android Malware Detection Based on Hypergraph Neural Networks
    Zhang, Dehua
    Wu, Xiangbo
    He, Erlu
    Guo, Xiaobo
    Yang, Xiaopeng
    Li, Ruibo
    Li, Hao
    Vaccaro, Ugo
    APPLIED SCIENCES-BASEL, 2023, 13 (23)
  • [36] TENSOR DECOMPOSITION VIA CORE TENSOR NETWORKS
    Zhang, Jianfu
    Tao, Zerui
    Zhang, Liqing
    Zhao, Qibin
    2021 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP 2021), 2021, : 2130 - 2134
  • [37] Betweenness Approximation for Edge Computing with Hypergraph Neural Networks
    Guo, Yaguang
    Xie, Wenxin
    Wang, Qingren
    Yan, Dengcheng
    Zhang, Yiwen
    TSINGHUA SCIENCE AND TECHNOLOGY, 2025, 30 (01): 331 - 344
  • [38] UniGNN: a Unified Framework for Graph and Hypergraph Neural Networks
    Huang, Jing
    Yang, Jie
    PROCEEDINGS OF THE THIRTIETH INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, IJCAI 2021, 2021, : 2563 - 2569
  • [39] Parallel Hypergraph Convolutional Neural Networks for Image Annotation
    Wang, Mengke
    Liu, Weifeng
    Yuan, Xinan
    Li, Wei
    Liu, Baodi
    2022 41ST CHINESE CONTROL CONFERENCE (CCC), 2022, : 6582 - 6587
  • [40] QGTC: Accelerating Quantized Graph Neural Networks via GPU Tensor Core
    Wang, Yuke
    Feng, Boyuan
    Ding, Yufei
    PPOPP'22: PROCEEDINGS OF THE 27TH ACM SIGPLAN SYMPOSIUM ON PRINCIPLES AND PRACTICE OF PARALLEL PROGRAMMING, 2022, : 107 - 119