T-HyperGNNs: Hypergraph Neural Networks via Tensor Representations

Cited by: 8
Authors
Wang, Fuli [1 ]
Pena-Pena, Karelia [2 ]
Qian, Wei [3 ]
Arce, Gonzalo R. [2 ]
Affiliations
[1] Univ Delaware, Inst Financial Serv Analyt, Newark, DE 19716 USA
[2] Univ Delaware, Dept Elect & Comp Engn, Newark, DE 19716 USA
[3] Univ Delaware, Dept Appl Econ & Stat, Newark, DE 19716 USA
Funding
U.S. National Science Foundation;
Keywords
Convolution; hypergraphs; message passing; neural networks; tensors;
DOI
10.1109/TNNLS.2024.3371382
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Hypergraph neural networks (HyperGNNs) are a family of deep neural networks designed to perform inference on hypergraphs. HyperGNNs follow either a spectral or a spatial approach, in which a convolution or message-passing operation is conducted based on a hypergraph algebraic descriptor. While many HyperGNNs have been proposed and have achieved state-of-the-art performance across a broad range of applications, there have been limited attempts to explore high-dimensional hypergraph descriptors (tensors) and the joint node interactions carried by hyperedges. In this article, we depart from hypergraph matrix representations and present a new tensor-HyperGNN (T-HyperGNN) framework with cross-node interactions (CNIs). The T-HyperGNN framework consists of T-spectral convolution, T-spatial convolution, and T-message-passing HyperGNNs (T-MPHN). The T-spectral convolution HyperGNN is defined under the t-product algebra, which connects closely to the spectral domain. To improve computational efficiency for large hypergraphs, we localize the T-spectral convolution to formulate the T-spatial convolution, and we further devise a novel tensor message-passing algorithm for practical implementation by studying a compressed adjacency tensor representation. Compared with state-of-the-art approaches, our T-HyperGNNs preserve intrinsic high-order network structures without any hypergraph reduction and model the joint effects of nodes through a CNI layer. These advantages are demonstrated on a wide range of real-world hypergraph datasets. The implementation code is available at https://github.com/wangfuli/T-HyperGNNs.git.
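The abstract grounds the T-spectral convolution in the t-product algebra. As a rough illustration of that underlying operation (not the paper's implementation; see the repository above for the authors' code), the sketch below computes the t-product of two third-order tensors via an FFT along the tube axis, slice-wise matrix products in the Fourier domain, and an inverse FFT. The axis orientation, shapes, and the toy "shift tensor" S are assumptions chosen for illustration only.

```python
# Minimal sketch of the t-product (Kilmer-Martin style), assuming frontal
# slices are stacked along axis 0. Illustrative only; the paper's tensors,
# orientations, and convolution layers may differ.
import numpy as np

def t_product(A: np.ndarray, B: np.ndarray) -> np.ndarray:
    """t-product of A (k x n x p) and B (k x p x m), slices on axis 0.

    Equivalent to block-circulant matrix multiplication, computed as:
    FFT along the tube axis -> per-slice matmul -> inverse FFT.
    """
    assert A.shape[0] == B.shape[0] and A.shape[2] == B.shape[1]
    A_hat = np.fft.fft(A, axis=0)                    # tubes -> Fourier domain
    B_hat = np.fft.fft(B, axis=0)
    C_hat = np.einsum('kij,kjl->kil', A_hat, B_hat)  # independent slice products
    return np.fft.ifft(C_hat, axis=0).real          # back to spatial domain

# Toy usage: a hypothetical 3rd-order "shift" tensor acting on node features.
rng = np.random.default_rng(0)
S = rng.standard_normal((4, 5, 5))   # assumed shift/adjacency-like tensor
X = rng.standard_normal((4, 5, 2))   # node features, one tube per node
Y = t_product(S, X)                  # one tensor "convolution" step
print(Y.shape)                       # (4, 5, 2)
```

Because the FFT diagonalizes the block-circulant structure, this formulation is what ties the spatial (slice-wise) and spectral (Fourier-domain) views of the convolution together, which is consistent with the abstract's claim that the t-product "connects closely to the spectral domain."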
Pages: 1-15
Page count: 15
Related Papers
Showing items 21-30 of 50
  • [21] Molecular contrastive learning of representations via graph neural networks
    Wang, Yuyang
    Wang, Jianren
    Cao, Zhonglin
    Farimani, Amir Barati
    NATURE MACHINE INTELLIGENCE, 2022, 4 (03) : 279 - 287
  • [22] Learning Deep Graph Representations via Convolutional Neural Networks
    Ye, Wei
    Askarisichani, Omid
    Jones, Alex
    Singh, Ambuj
    IEEE TRANSACTIONS ON KNOWLEDGE AND DATA ENGINEERING, 2022, 34 (05) : 2268 - 2279
  • [24] Learning Hypergraphs Tensor Representations From Data via t-HGSP
    Pena-Pena, Karelia
    Taipe, Lucas
    Wang, Fuli
    Lau, Daniel L.
    Arce, Gonzalo R.
    IEEE TRANSACTIONS ON SIGNAL AND INFORMATION PROCESSING OVER NETWORKS, 2024, 10 : 17 - 31
  • [25] HGNN+: General Hypergraph Neural Networks
    Gao, Yue
    Feng, Yifan
    Ji, Shuyi
    Ji, Rongrong
    IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2023, 45 (03) : 3181 - 3199
  • [26] TQCompressor: improving tensor decomposition methods in neural networks via permutations
    Abronin, Vadim
    Naumov, Aleksei
    Mazur, Denis
    Bystrov, Dmitriy
    Tsarova, Katerina
    Melnikov, Artem
    Dolgov, Sergey
    Brasher, Reuben
    Perelshein, Michael
    2024 IEEE 7TH INTERNATIONAL CONFERENCE ON MULTIMEDIA INFORMATION PROCESSING AND RETRIEVAL, MIPR 2024, 2024, : 503 - 506
  • [27] Edge-based Tensor prediction via graph neural networks
    Zhong, Yang
    Yu, Hongyu
    Gong, Xingao
    Xiang, Hongjun
arXiv, 2022
  • [28] Defending adversarial attacks in Graph Neural Networks via tensor enhancement
    Zhang, Jianfu
    Hong, Yan
    Cheng, Dawei
    Zhang, Liqing
    Zhao, Qibin
    PATTERN RECOGNITION, 2025, 158
  • [29] Learning Polynomial Neural Networks via Low Rank Tensor Recovery
    Hucumenoglu, Mehmet Can
    Pal, Piya
    2020 54TH ASILOMAR CONFERENCE ON SIGNALS, SYSTEMS, AND COMPUTERS, 2020, : 361 - 365
  • [30] Compact Neural Architecture Designs by Tensor Representations
    Su, Jiahao
    Li, Jingling
    Liu, Xiaoyu
    Ranadive, Teresa
    Coley, Christopher
    Tuan, Tai-Ching
    Huang, Furong
    FRONTIERS IN ARTIFICIAL INTELLIGENCE, 2022, 5