Hypergraph Attention Networks

Cited by: 11
Authors
Chen, Chaofan [1 ]
Cheng, Zelei [2 ]
Li, Zuotian [3 ]
Wang, Manyi [4 ]
Affiliations
[1] Univ Sci & Technol China, Sch Informat Sci & Technol, Hefei, Peoples R China
[2] Purdue Univ, Dept Comp & Informat Technol, W Lafayette, IN 47907 USA
[3] Carnegie Mellon Univ, Integrated Innovat Inst, Mountain View, CA USA
[4] Beijing Univ Posts & Telecommun, Int Sch, Beijing, Peoples R China
Keywords
Hypergraph; Graph Neural Networks; Attention Module; Object Recognition
DOI
10.1109/TrustCom50675.2020.00215
CLC Classification Number
TP3 [Computing Technology, Computer Technology]
Subject Classification Code
0812
Abstract
Graph neural networks have recently achieved great success in representation learning on graph-structured data. However, these networks consider only pairwise connections between nodes and therefore cannot model the more complicated relations found in real-world data, which has led researchers to turn to hypergraph modeling. In recent years, several hypergraph neural networks have been proposed that aggregate information over a hypergraph for representation learning. In this paper, we present hypergraph attention networks (HGATs) to encode the high-order data relations in a hypergraph. Specifically, the proposed HGATs consist of two modules: an attentive vertex aggregation module and an attentive hyperedge aggregation module. These modules implicitly assign different aggregation weights to the connected hyperedges/vertices to characterize the complex relations among data. We stack these modules to pass messages between hyperedges and vertices and thereby refine the vertex/hyperedge features. Experimental results on the ModelNet40 and NTU2012 datasets show that the proposed HGATs achieve superior performance on visual object recognition tasks. Furthermore, we employ HGATs for multi-view representation learning and obtain better object classification results.
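The two-stage message passing described in the abstract (attentive vertex aggregation into hyperedges, then attentive hyperedge aggregation back into vertices) can be illustrated with a minimal NumPy sketch. The shared projection `W`, the attention vector `a`, and the use of the member-mean as each hyperedge's query are illustrative assumptions, not the paper's exact parameterization.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def attentive_aggregate(query, neighbors, W, a):
    # Score each neighbor against the query, softmax-normalize the scores,
    # then return the attention-weighted sum of projected neighbor features.
    scores = np.array([a @ np.concatenate([W @ query, W @ n]) for n in neighbors])
    alpha = softmax(scores)
    return sum(w * (W @ n) for w, n in zip(alpha, neighbors))

d = 3
X = rng.normal(size=(4, d))         # vertex features (4 vertices)
edges = {0: [0, 1, 2], 1: [2, 3]}   # hyperedge -> member vertex ids
W = np.eye(d)                       # shared projection (identity for this sketch)
a = np.ones(2 * d)                  # attention vector (assumed, untrained)

# Attentive vertex aggregation: each hyperedge pools its member vertices,
# using the members' mean as the query (an assumption of this sketch).
E = np.stack([attentive_aggregate(X[vs].mean(axis=0), X[vs], W, a)
              for vs in edges.values()])

# Attentive hyperedge aggregation: each vertex pools its incident hyperedges.
incident = {v: [e for e, vs in edges.items() if v in vs] for v in range(len(X))}
X_new = np.stack([attentive_aggregate(X[v], E[incident[v]], W, a)
                  for v in range(len(X))])

print(E.shape, X_new.shape)   # (2, 3) (4, 3)
```

Stacking these two aggregation steps, as the paper proposes, alternates between refining hyperedge features and refining vertex features; in a trained network `W` and `a` would be learned parameters rather than fixed arrays.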
Pages: 1560 - 1565
Page count: 6
Related Papers (50 total)
  • [1] Hypergraph convolution and hypergraph attention
    Bai, Song
    Zhang, Feihu
    Torr, Philip H. S.
    PATTERN RECOGNITION, 2021, 110
  • [2] Metro Flow Prediction with Hierarchical Hypergraph Attention Networks
    Wang J.
    Zhang Y.
    Hu Y.
    Yin B.
    IEEE TRANSACTIONS ON ARTIFICIAL INTELLIGENCE, 2024, 5 (06) : 3012 - 3021
  • [3] Be More with Less: Hypergraph Attention Networks for Inductive Text Classification
    Ding, Kaize
    Wang, Jianling
    Li, Jundong
    Li, Dingcheng
    Liu, Huan
    PROCEEDINGS OF THE 2020 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING (EMNLP), 2020, : 4927 - 4936
  • [4] Multi-head Attention Induced Dynamic Hypergraph Convolutional Networks
    Peng, Xu
    Lin, Wei
    Jin, Taisong
    PATTERN RECOGNITION AND COMPUTER VISION, PRCV 2023, PT IX, 2024, 14433 : 256 - 268
  • [5] Hypergraph contrastive attention networks for hyperedge prediction with negative samples evaluation
    Wang, Junbo
    Chen, Jianrui
    Wang, Zhihui
    Gong, Maoguo
    NEURAL NETWORKS, 2025, 181
  • [6] Hypergraph Neural Networks for Hypergraph Matching
    Liao, Xiaowei
    Xu, Yong
    Ling, Haibin
    2021 IEEE/CVF INTERNATIONAL CONFERENCE ON COMPUTER VISION (ICCV 2021), 2021, : 1246 - 1255
  • [7] Stock trend prediction based on industry relationships driven hypergraph attention networks
    Han, Haodong
    Xie, Liang
    Chen, Shengshuang
    Xu, Haijiao
    APPLIED INTELLIGENCE, 2023, 53 (23) : 29448 - 29464
  • [8] Attention based adaptive spatial-temporal hypergraph convolutional networks for stock trend
    Su, Hongyang
    Wang, Xiaolong
    Qin, Yang
    Chen, Qingcai
    EXPERT SYSTEMS WITH APPLICATIONS, 2024, 238
  • [9] Hypergraph Position Attention Convolution Networks for 3D Point Cloud Segmentation
    Rong, Yanpeng
    Nong, Liping
    Liang, Zichen
    Huang, Zhuocheng
    Peng, Jie
    Huang, Yiping
    APPLIED SCIENCES-BASEL, 2024, 14 (08)