MDP: Privacy-Preserving GNN Based on Matrix Decomposition and Differential Privacy

Cited by: 1
Authors
Xu, Wanghan [1 ]
Shi, Bin [1 ]
Zhang, Jiqiang [1 ]
Feng, Zhiyuan [2 ]
Pan, Tianze [3 ]
Dong, Bo [4 ]
Affiliations
[1] Xi An Jiao Tong Univ, Sch Comp Sci & Technol, Xian, Peoples R China
[2] Xi An Jiao Tong Univ, Qian Xuesen Coll, Xian, Peoples R China
[3] Xi An Jiao Tong Univ, Sch Phys, Xian, Peoples R China
[4] Xi An Jiao Tong Univ, Sch Distance Educ, Xian, Peoples R China
Funding
China Postdoctoral Science Foundation; National Science Foundation (USA);
Keywords
privacy-preserving; topological secret sharing; matrix decomposition; distributed machine learning;
DOI
10.1109/JCC59055.2023.00011
CLC Classification Number
TP [Automation Technology, Computer Technology];
Discipline Classification Code
0812;
Abstract
In recent years, graph neural networks (GNNs) have developed rapidly across many fields, but the high computational cost of model training often discourages graph owners who want to train GNN models but lack computing power. These data owners therefore often cooperate with external calculators during model training, which raises serious privacy concerns. Protecting private information in graphs, however, is difficult because the graph structure combines node features and edges. To address this problem, we propose MDP, a new privacy-preserving GNN based on matrix decomposition and differential privacy (DP), which allows external calculators to train GNN models without knowing the original data. Specifically, we first introduce the concept of topological secret sharing (TSS) and, following TSS, design a novel matrix decomposition method named eigenvalue selection (ES), which preserves the message-passing ability of the adjacency matrix while hiding edge information. We evaluate the feasibility and performance of our model through extensive experiments, which demonstrate that the MDP model achieves accuracy comparable to the original model with practically affordable overhead.
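The abstract does not spell out how eigenvalue selection or the DP perturbation are implemented, so the following minimal Python sketch only illustrates the general idea under stated assumptions: a symmetric adjacency matrix is approximated by its top-k spectral components (so that one propagation step A·X is roughly preserved without exposing the exact edge pattern), and node features are perturbed with the standard Laplace mechanism. The function names eigenvalue_selection and laplace_mechanism are hypothetical and do not reproduce the authors' implementation.

```python
import numpy as np

def eigenvalue_selection(adj, k):
    """Sketch of an eigenvalue-selection style decomposition (assumption):
    keep the k spectral components of largest magnitude of a symmetric
    adjacency matrix, so that message passing (A @ X) is approximately
    preserved without handing over the exact 0/1 edge pattern."""
    eigvals, eigvecs = np.linalg.eigh(adj)      # A = V diag(w) V^T
    idx = np.argsort(-np.abs(eigvals))[:k]      # top-k by |eigenvalue|
    return eigvecs[:, idx], eigvals[idx]

def laplace_mechanism(x, epsilon, sensitivity=1.0):
    """Standard Laplace mechanism: epsilon-DP perturbation of node features."""
    return x + np.random.laplace(0.0, sensitivity / epsilon, size=x.shape)

if __name__ == "__main__":
    # Toy 4-node undirected graph (a cycle) and random node features.
    A = np.array([[0., 1., 0., 1.],
                  [1., 0., 1., 0.],
                  [0., 1., 0., 1.],
                  [1., 0., 1., 0.]])
    X = np.random.randn(4, 3)

    V, w = eigenvalue_selection(A, k=2)
    A_hat = V @ np.diag(w) @ V.T                # surrogate adjacency matrix
    X_dp = laplace_mechanism(X, epsilon=1.0)    # DP-perturbed node features

    H = A_hat @ X_dp                            # one GCN-style propagation step
    print("message-passing approximation error:",
          np.linalg.norm(A @ X - A_hat @ X))
```

In the MDP setting described above, factors of this kind would be what an external calculator receives instead of the raw adjacency matrix; how the TSS shares are actually constructed and how the privacy budget is calibrated are specified in the paper itself, not in this sketch.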
Pages: 38-45
Number of pages: 8
Related Papers (50 records in total)
  • [31] Privacy-Preserving Federated Singular Value Decomposition
    Liu, Bowen
    Pejo, Balazs
    Tang, Qiang
    APPLIED SCIENCES-BASEL, 2023, 13 (13):
  • [32] Privacy-Preserving Decentralised Singular Value Decomposition
    Liu, Bowen
    Tang, Qiang
    INFORMATION AND COMMUNICATIONS SECURITY (ICICS 2019), 2020, 11999 : 703 - 721
  • [33] A Privacy-Preserving Asynchronous Averaging Algorithm based on State Decomposition
    Calis, Metin
    Heusdens, Richard
    Hendriks, Richard C.
    28TH EUROPEAN SIGNAL PROCESSING CONFERENCE (EUSIPCO 2020), 2021, : 2115 - 2119
  • [34] Privacy-preserving mechanism for mixed data clustering with local differential privacy
    Yuan, Liujie
    Zhang, Shaobo
    Zhu, Gengming
    Alinani, Karim
    CONCURRENCY AND COMPUTATION-PRACTICE & EXPERIENCE, 2023, 35 (19):
  • [35] Privacy-Preserving Genomic Statistical Analysis Under Local Differential Privacy
    Yamamoto, Akito
    Shibuya, Tetsuo
    DATA AND APPLICATIONS SECURITY AND PRIVACY XXXVII, DBSEC 2023, 2023, 13942 : 40 - 48
  • [36] A Framework for Privacy-Preserving in IoV Using Federated Learning With Differential Privacy
    Adnan, Muhammad
    Syed, Madiha Haider
    Anjum, Adeel
    Rehman, Semeen
    IEEE ACCESS, 2025, 13 : 13507 - 13521
  • [37] A Pragmatic Privacy-Preserving Deep Learning Framework Satisfying Differential Privacy
    Dang T.K.
    Tran-Truong P.T.
    SN Computer Science, 5 (1)
  • [38] Differential Privacy in Privacy-Preserving Big Data and Learning: Challenge and Opportunity
    Jiang, Honglu
    Gao, Yifeng
    Sarwar, S. M.
    GarzaPerez, Luis
    Robin, Mahmudul
    SILICON VALLEY CYBERSECURITY CONFERENCE, SVCC 2021, 2022, 1536 : 33 - 44
  • [39] Efficient privacy-preserving classification construction model with differential privacy technology
    Lin Zhang
    Yan Liu
    Ruchuan Wang
    Xiong Fu
    Qiaomin Lin
    Journal of Systems Engineering and Electronics, 2017, 28 (01) : 170 - 178
  • [40] PPeFL: Privacy-Preserving Edge Federated Learning With Local Differential Privacy
    Wang, Baocang
    Chen, Yange
    Jiang, Hang
    Zhao, Zhen
    IEEE INTERNET OF THINGS JOURNAL, 2023, 10 (17) : 15488 - 15500