Distributed Consensus Reduced Support Vector Machine

Citations: 0
Authors
Chen, Hsiang-Hsuan [1 ]
Lee, Yuh-Jye [2 ]
Affiliations
[1] Natl Chiao Tung Univ, Dept Appl Math, Hsinchu, Taiwan
[2] Acad Sinica, Res Ctr Informat Technol Innovat, Taipei, Taiwan
Keywords
Distributed Machine Learning; Privacy Preserving; Large-Scale Machine Learning
DOI
Not available
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Nowadays, machine learning performs astonishingly well in many different fields, and in general, the more data we have, the better our machine learning methods perform. However, in many situations the data owners may not want to, or may not be allowed to, share their data because of legal issues or privacy concerns, even though pooling all the data together as training data for the machine learning task would give a better result. In another situation, we encounter an extremely large dataset that is difficult to store on a single machine, and we may want to utilize more computing units to handle it. To deal with these two problems, we propose the distributed consensus reduced support vector machine (DCRSVM), a nonlinear model for binary classification. We apply the Alternating Direction Method of Multipliers (ADMM) to solve the DCRSVM. In each iteration, each local worker updates its model by incorporating the information shared by the master; the local workers share only their models, never their data. The master fuses the local models reported by the workers and, at the end, generates a consensus model that is almost identical to the model obtained by pooling all the data together, which is not allowed in many real-world applications.
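To illustrate the master-worker exchange pattern the abstract describes (workers share models, never data; the master fuses them into a consensus model), the following is a minimal consensus-ADMM sketch. It is not the authors' DCRSVM: the linear model, squared-hinge loss, gradient inner solver, and the names local_update and consensus_admm are illustrative assumptions, whereas the paper's method is nonlinear and based on a reduced kernel.

```python
# Minimal sketch of consensus ADMM for a distributed SVM-style classifier.
# NOT the DCRSVM of the paper: loss, solver, and names are assumptions for illustration.
import numpy as np

def local_update(X, y, z, u, rho, lam, steps=200, lr=0.01):
    """Worker step: minimize local squared-hinge loss + (rho/2)||w - z + u||^2.
    Only the resulting model w (never the data X, y) is reported to the master."""
    w = z.copy()
    n = len(y)
    for _ in range(steps):
        margin = 1.0 - y * (X @ w)
        active = margin > 0
        grad = -(X[active] * y[active, None]).T @ margin[active] / n
        grad += lam * w + rho * (w - z + u)
        w -= lr * grad
    return w

def consensus_admm(partitions, dim, rho=1.0, lam=0.1, iters=50):
    """Master loop: broadcast consensus z, collect worker models, fuse them."""
    z = np.zeros(dim)
    us = [np.zeros(dim) for _ in partitions]
    for _ in range(iters):
        # each worker updates its model using the consensus z shared by the master
        ws = [local_update(X, y, z, u, rho, lam)
              for (X, y), u in zip(partitions, us)]
        # master fuses the reported models (simple averaging of w_i + u_i)
        z = np.mean([w + u for w, u in zip(ws, us)], axis=0)
        # dual updates pull every worker toward the consensus model
        us = [u + w - z for u, w in zip(us, ws)]
    return z

# Toy usage: two workers, each holding a private shard of linearly separable data.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = np.sign(X @ np.array([1.0, -2.0, 0.5, 0.0, 1.5]))
shards = [(X[:100], y[:100]), (X[100:], y[100:])]
print("consensus model:", np.round(consensus_admm(shards, dim=5), 3))
```

Under these assumptions, the consensus model returned by the master closely tracks the model one would obtain by training on the pooled data, while each shard stays on its own worker.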
Pages: 5718-5727
Page count: 10