T-HyperGNNs: Hypergraph Neural Networks via Tensor Representations

Cited by: 8
Authors
Wang, Fuli [1 ]
Pena-Pena, Karelia [2 ]
Qian, Wei [3 ]
Arce, Gonzalo R. [2 ]
Affiliations
[1] Univ Delaware, Inst Financial Serv Analyt, Newark, DE 19716 USA
[2] Univ Delaware, Dept Elect & Comp Engn, Newark, DE 19716 USA
[3] Univ Delaware, Dept Appl Econ & Stat, Newark, DE 19716 USA
Funding
U.S. National Science Foundation (NSF)
Keywords
Convolution; hypergraphs; message passing; neural networks; tensors;
DOI
10.1109/TNNLS.2024.3371382
CLC Number
TP18 [Theory of Artificial Intelligence];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Hypergraph neural networks (HyperGNNs) are a family of deep neural networks designed to perform inference on hypergraphs. HyperGNNs follow either a spectral or a spatial approach, in which a convolution or message-passing operation is conducted based on a hypergraph algebraic descriptor. While many HyperGNNs have been proposed and achieved state-of-the-art performance on broad applications, there have been limited attempts at exploring high-dimensional hypergraph descriptors (tensors) and joint node interactions carried by hyperedges. In this article, we depart from hypergraph matrix representations and present a new tensor-HyperGNN (T-HyperGNN) framework with cross-node interactions (CNIs). The T-HyperGNN framework consists of T-spectral convolution, T-spatial convolution, and T-message-passing HyperGNNs (T-MPHN). The T-spectral convolution HyperGNN is defined under the t-product algebra that closely connects to the spectral space. To improve computational efficiency for large hypergraphs, we localize the T-spectral convolution approach to formulate the T-spatial convolution and further devise a novel tensor-message-passing algorithm for practical implementation by studying a compressed adjacency tensor representation. Compared to the state-of-the-art approaches, our T-HyperGNNs preserve intrinsic high-order network structures without any hypergraph reduction and model the joint effects of nodes through a CNI layer. These advantages of our T-HyperGNNs are demonstrated in a wide range of real-world hypergraph datasets. The implementation code is available at https://github.com/wangfuli/T-HyperGNNs.git.
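The T-spectral convolution described in the abstract is defined under the t-product algebra for third-order tensors. As a point of reference only, and not the authors' released implementation (see the GitHub link above for that), the sketch below illustrates the standard t-product: an FFT along the tube (third) mode, frontal-slice matrix products in the transform domain, and an inverse FFT. All tensor shapes and variable names here are hypothetical toy choices.

import numpy as np

def t_product(A: np.ndarray, B: np.ndarray) -> np.ndarray:
    """t-product of A (n1 x n2 x n3) with B (n2 x n4 x n3), returning an n1 x n4 x n3 tensor."""
    n1, n2, n3 = A.shape
    _, n4, _ = B.shape
    A_hat = np.fft.fft(A, axis=2)            # move tubes into the transform (spectral) domain
    B_hat = np.fft.fft(B, axis=2)
    C_hat = np.empty((n1, n4, n3), dtype=complex)
    for k in range(n3):                       # ordinary matrix product on each frontal slice
        C_hat[:, :, k] = A_hat[:, :, k] @ B_hat[:, :, k]
    return np.real(np.fft.ifft(C_hat, axis=2))  # back to the original domain

# Toy usage: a small (hypothetical) hypergraph descriptor times a node-feature tensor,
# loosely analogous to one tensor-based propagation step.
A = np.random.rand(5, 5, 3)   # placeholder third-order descriptor
X = np.random.rand(5, 2, 3)   # node features arranged as a third-order tensor
Y = t_product(A, X)
print(Y.shape)                # (5, 2, 3)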
Pages: 1-15
Number of pages: 15