Hypergraph Attention Networks

Cited by: 11
Authors
Chen, Chaofan [1 ]
Cheng, Zelei [2 ]
Li, Zuotian [3 ]
Wang, Manyi [4 ]
Affiliations
[1] Univ Sci & Technol China, Sch Informat Sci & Technol, Hefei, Peoples R China
[2] Purdue Univ, Dept Comp & Informat Technol, W Lafayette, IN 47907 USA
[3] Carnegie Mellon Univ, Integrated Innovat Inst, Mountain View, CA USA
[4] Beijing Univ Posts & Telecommun, Int Sch, Beijing, Peoples R China
Keywords
Hypergraph; Graph Neural Networks; Attention Module; Object Recognition
DOI
10.1109/TrustCom50675.2020.00215
Chinese Library Classification (CLC)
TP3 [Computing technology; computer technology]
Discipline code
0812
Abstract
Recently, graph neural networks have achieved great success in representation learning on graph-structured data. However, these networks consider only pairwise connections between nodes, which cannot model the more complicated relations among real-world data. Researchers have therefore begun to pay attention to hypergraph modeling. In recent years, several hypergraph neural networks have been proposed to aggregate hypergraph information for representation learning. In this paper, we present hypergraph attention networks (HGATs) to encode high-order data relations in a hypergraph. Specifically, the proposed HGATs consist of two modules: an attentive vertex aggregation module and an attentive hyperedge aggregation module. These two modules implicitly assign different aggregation weights to the different connected hyperedges/vertices to characterize the complex relations among data. We stack these modules to pass messages between hyperedges and vertices and thereby refine the vertex/hyperedge features. Experimental results on the ModelNet40 and NTU2012 datasets show that the proposed HGATs achieve superior performance on visual object recognition tasks. Furthermore, we employ HGATs for multi-view representation learning and obtain better object classification results.
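The two-module message passing described in the abstract (hyperedges attend over their member vertices, then vertices attend over their incident hyperedges) can be illustrated with a minimal NumPy sketch. This is a hypothetical reconstruction, not the paper's exact formulation: the single shared attention vector `a`, the dot-product attention logits, the shared projection `W`, and the ReLU output are all assumptions made for illustration.

```python
import numpy as np

def softmax(x):
    # numerically stable softmax over a 1-D score vector
    e = np.exp(x - x.max())
    return e / e.sum()

def hgat_layer(X, H, W, a):
    """One hypothetical HGAT round: attentive hyperedge aggregation,
    then attentive vertex aggregation.
    X: (n_v, d) vertex features; H: (n_v, n_e) incidence matrix
    (H[v, e] = 1 iff vertex v belongs to hyperedge e);
    W: (d, d') shared projection; a: (d',) shared attention vector (assumption)."""
    n_v, n_e = H.shape
    Xp = X @ W                                   # project vertex features
    # --- attentive hyperedge aggregation: hyperedge <- member vertices ---
    E = np.zeros((n_e, Xp.shape[1]))
    for e in range(n_e):
        members = np.where(H[:, e] > 0)[0]
        alpha = softmax(Xp[members] @ a)         # weights over member vertices
        E[e] = alpha @ Xp[members]
    # --- attentive vertex aggregation: vertex <- incident hyperedges ---
    Z = np.zeros_like(Xp)
    for v in range(n_v):
        incident = np.where(H[v] > 0)[0]
        beta = softmax(E[incident] @ a)          # weights over incident hyperedges
        Z[v] = beta @ E[incident]
    return np.maximum(Z, 0.0)                    # ReLU nonlinearity

# toy hypergraph: 4 vertices, 2 hyperedges
H = np.array([[1, 0], [1, 1], [0, 1], [1, 0]], dtype=float)
X = np.arange(12.0).reshape(4, 3)                # vertex features
W = 0.1 * np.ones((3, 2))                        # hypothetical projection weights
a = np.array([0.5, -0.2])                        # hypothetical attention vector
Z = hgat_layer(X, H, W, a)                       # refined vertex features, shape (4, 2)
```

Stacking such layers alternates vertex-to-hyperedge and hyperedge-to-vertex messages, which is the refinement loop the abstract describes; in practice `W` and `a` would be learned per layer.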
Pages: 1560-1565
Number of pages: 6
Related papers
50 records
  • [31] Stability and Generalization of Hypergraph Collaborative Networks
    Ng, Michael K.
    Wu, Hanrui
    Yip, Andy
    MACHINE INTELLIGENCE RESEARCH, 2024, 21 (01) : 184 - 196
  • [32] Representations of hypergraph states with neural networks*
    Yang, Ying
    Cao, Huaixin
    COMMUNICATIONS IN THEORETICAL PHYSICS, 2021, 73 (10)
  • [34] Sparse and Local Networks for Hypergraph Reasoning
    Xiao, Guangxuan
    Kaelbling, Leslie Pack
    Wu, Jiajun
    Mao, Jiayuan
    LEARNING ON GRAPHS CONFERENCE, VOL 198, 2022, 198
  • [36] Disintegrate hypergraph networks by attacking hyperedge
    Peng, Hao
    Qian, Cheng
    Zhao, Dandan
    Zhong, Ming
    Ling, Xianwen
    Wang, Wei
    JOURNAL OF KING SAUD UNIVERSITY-COMPUTER AND INFORMATION SCIENCES, 2022, 34 (07) : 4679 - 4685
  • [37] Hypergraph Neural Networks with Logic Clauses
    Gandarela de Souza, Joao Pedro
    Zaverucha, Gerson
    Garcez, Artur S. d'Avila
    2024 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS, IJCNN 2024, 2024,
  • [38] HYPERGRAPH CENTRALITY METRICS FOR SOCIAL NETWORKS
    Gopalakrishnan, Sathyanarayanan
    Ravi, Vignesh
    Venkatraman, Swaminathan
    TWMS JOURNAL OF APPLIED AND ENGINEERING MATHEMATICS, 2023, 13 : 445 - 455
  • [39] A hypergraph model of social tagging networks
    Zhang, Zi-Ke
    Liu, Chuang
    JOURNAL OF STATISTICAL MECHANICS-THEORY AND EXPERIMENT, 2010
  • [40] Context-embedded hypergraph attention network and self-attention for session recommendation
    Zhang, Zhigao
    Zhang, Hongmei
    Zhang, Zhifeng
    Wang, Bin
    SCIENTIFIC REPORTS, 2024, 14 (01)