Efficient and Effective Graph Convolution Networks

Cited by: 8
Authors
Liu, Siwu [1 ]
Park, Ji Hwan [2 ]
Yoo, Shinjae [2 ]
Affiliations
[1] SUNY Stony Brook, Stony Brook, NY 11790 USA
[2] Brookhaven Natl Lab, Upton, NY 11973 USA
Keywords
Graph Embedding; Neural Networks
DOI
10.1137/1.9781611976236.44
CLC Number
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
Graph convolution generalizes the convolution operation from structured grid data to unstructured graph data. Because any type of data can be represented on a feature graph, graph convolution has become a powerful tool for modeling various types of data. However, such flexibility comes at a price: expensive time and space complexity. Even with state-of-the-art scalable graph convolution algorithms, it remains challenging to scale graph convolution to practical applications. Hence, we propose using Diverse Power Iteration Embeddings (DPIE) to construct scalable graph convolution neural networks. DPIE is an approximated spectral embedding that is orders of magnitude faster and incurs no additional space complexity, resulting in efficient and effective graph convolution approximation. DPIE-based graph convolution avoids the expensive convolution operation, a matrix-vector multiplication, by using a lower-dimensional embedding. At the same time, DPIE generates graphs implicitly, which dramatically reduces the space cost of building graphs from unstructured data. The method is tested on various types of data, and we extend graph convolution to extreme-scale data never before studied in the graph convolution field. Experimental results show the scalability and effectiveness of DPIE-based graph convolution.
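The core idea in the abstract, replacing an expensive n-by-n graph convolution with a low-dimensional embedding obtained by power iteration, can be illustrated with a minimal sketch. This is a hypothetical simplification for intuition only, not the authors' actual DPIE algorithm: it runs power iteration from several random starting vectors on a symmetrically normalized adjacency matrix and uses the resulting rank-k embedding to approximate the propagation step. All function names here are illustrative.

```python
import numpy as np

def power_iteration_embedding(A, k=4, iters=50, seed=0):
    """Illustrative sketch (not the authors' exact DPIE): approximate a
    spectral embedding of a graph by running power iteration from k
    independent random starting vectors."""
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    # Symmetrically normalized adjacency, as in standard spectral methods.
    d = A.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(np.maximum(d, 1e-12)))
    S = D_inv_sqrt @ A @ D_inv_sqrt
    # k power-iteration runs give k "diverse" embedding columns;
    # each column is renormalized every step to avoid over/underflow.
    V = rng.standard_normal((n, k))
    for _ in range(iters):
        V = S @ V
        V /= np.linalg.norm(V, axis=0, keepdims=True)
    return V

def approx_graph_convolution(A, X, k=4):
    """Approximate propagating features X over the graph by projecting
    onto the span of the k-dimensional embedding V, so the repeated
    n x n matrix-vector products are replaced by rank-k operations."""
    V = power_iteration_embedding(A, k=k)
    return V @ (V.T @ X)
```

Note the design point the abstract hinges on: once V is computed, each convolution costs O(nk) per feature column instead of O(n^2), which is what makes the embedding-based approximation attractive at scale.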
Pages: 388-396
Page count: 9
Related Papers
50 records in total
  • [1] Graph matching as a graph convolution operator for graph neural networks
    Martineau, Chloé
    Raveaux, Romain
    Conte, Donatello
    Venturini, Gilles
PATTERN RECOGNITION LETTERS, 2021, 149 : 59 - 66
  • [3] ScaleGCN: Efficient and Effective Graph Convolution via Channel-Wise Scale Transformation
    Zhang, Tianqi
    Wu, Qitian
    Yan, Junchi
    Zhao, Yunan
    Han, Bing
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2024, 35 (04) : 4478 - 4490
  • [4] Clustering using graph convolution networks
    Al Jreidy, Maria
    Constantin, Joseph
    Dornaika, Fadi
    Hamad, Denis
    PROGRESS IN ARTIFICIAL INTELLIGENCE, 2024,
  • [5] Residual Hyperbolic Graph Convolution Networks
    Xue, Yangkai
    Dai, Jindou
    Lu, Zhipeng
    Wu, Yuwei
    Jia, Yunde
    THIRTY-EIGHTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, VOL 38 NO 15, 2024, : 16247 - 16254
  • [6] Graph Convolution Networks for Cell Segmentation
    Bahade, Sachin
    Edwards, Michael
    Xie, Xianghua
    PROCEEDINGS OF THE 10TH INTERNATIONAL CONFERENCE ON PATTERN RECOGNITION APPLICATIONS AND METHODS (ICPRAM), 2021, : 620 - 627
  • [7] Combining Graph and Recurrent Networks for Efficient and Effective Segment Tagging
    Montero, David
    Javier Yebes, J.
    LEARNING ON GRAPHS CONFERENCE, VOL 198, 2022, 198
  • [8] Dating Documents using Graph Convolution Networks
    Vashishth, Shikhar
    Dasgupta, Shib Sankar
    Ray, Swayambhu Nath
    Talukdar, Partha
    PROCEEDINGS OF THE 56TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL), VOL 1, 2018, : 1605 - 1615
  • [9] Lorentzian Graph Convolution Networks for Collaborative Filtering
    Zhu, Zihong
    Zhang, Weiyu
    Guo, Xinchao
    Qiao, Xinxiao
    2023 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS, IJCNN, 2023,
  • [10] Hierarchical Graph Convolution Networks for Traffic Forecasting
    Guo, Kan
    Hu, Yongli
    Sun, Yanfeng
    Qian, Sean
    Gao, Junbin
    Yin, Baocai
    THIRTY-FIFTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, THIRTY-THIRD CONFERENCE ON INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE AND THE ELEVENTH SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2021, 35 : 151 - 159