Distributed Consensus Reduced Support Vector Machine

Cited by: 0
Authors
Chen, Hsiang-Hsuan [1 ]
Lee, Yuh-Jye [2 ]
Affiliations
[1] Natl Chiao Tung Univ, Dept Appl Math, Hsinchu, Taiwan
[2] Acad Sinica, Res Ctr Informat Technol Innovat, Taipei, Taiwan
Source
2019 IEEE INTERNATIONAL CONFERENCE ON BIG DATA (BIG DATA) | 2019
Keywords
Distributed Machine Learning; Privacy Preserving; Large-Scale Machine Learning;
DOI
Not available
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Nowadays, machine learning performs astonishingly well in many different fields. In general, the more data we have, the better the results our machine learning methods will achieve. However, in many situations the data owners may not want to, or may not be allowed to, share their data because of legal issues or privacy concerns, even though pooling all the data together as the training data for the machine learning task would yield a better result. In another situation, we encounter an extremely large dataset that is difficult to store on a single machine, so we may want to utilize multiple computing units to handle it. To deal with these two problems, we propose the distributed consensus reduced support vector machine (DCRSVM), a nonlinear model for binary classification. We apply the Alternating Direction Method of Multipliers (ADMM) to solve the DCRSVM. In each iteration, the local workers update their models by incorporating the information shared by the master; they share only their models and never share their data. The master fuses the local models reported by the local workers. In the end, the master generates a consensus model that is almost identical to the model obtained by pooling all the data together, which is not allowed in many real-world applications.
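
Illustration (added): the consensus-ADMM training loop described in the abstract can be sketched as below. This is a minimal sketch only; it assumes a simple least-squares (LS-SVM style) surrogate for each worker's local loss rather than the paper's reduced-kernel nonlinear SVM objective, and all function names, hyperparameters (C, lam, rho), and synthetic data are illustrative assumptions, not the authors' implementation.

    # Sketch of consensus ADMM: each worker solves a local subproblem on its own
    # data and shares only its model; the master fuses the models into a
    # consensus iterate. Local loss is a least-squares surrogate (assumption),
    # not the paper's reduced-kernel smooth SVM objective.
    import numpy as np

    def worker_update(A, y, z, u, C=1.0, lam=0.1, rho=1.0, m=1):
        # argmin_w  C/2 ||A w - y||^2 + lam/(2m) ||w||^2 + rho/2 ||w - z + u||^2
        d = A.shape[1]
        H = C * A.T @ A + (lam / m + rho) * np.eye(d)
        g = C * A.T @ y + rho * (z - u)
        return np.linalg.solve(H, g)

    def consensus_admm(parts, n_iter=50, C=1.0, lam=0.1, rho=1.0):
        """parts: list of (A_i, y_i) local datasets, labels y_i in {-1, +1}."""
        m = len(parts)
        d = parts[0][0].shape[1]
        z = np.zeros(d)                      # consensus model kept by the master
        u = [np.zeros(d) for _ in range(m)]  # scaled dual variables
        for _ in range(n_iter):
            # each worker updates its model using only its own data plus (z, u_i)
            w = [worker_update(A, y, z, u_i, C, lam, rho, m)
                 for (A, y), u_i in zip(parts, u)]
            # the master fuses the local models into the consensus model
            z = np.mean([w_i + u_i for w_i, u_i in zip(w, u)], axis=0)
            # dual updates push the local models toward agreement w_i = z
            u = [u_i + w_i - z for w_i, u_i in zip(w, u)]
        return z

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        w_true = rng.normal(size=5)
        def make_part(n):
            A = rng.normal(size=(n, 5))
            return A, np.sign(A @ w_true)
        parts = [make_part(200) for _ in range(4)]   # four "local workers"
        z = consensus_admm(parts)
        A_all = np.vstack([A for A, _ in parts])
        y_all = np.concatenate([y for _, y in parts])
        print("consensus accuracy:", np.mean(np.sign(A_all @ z) == y_all))

Because the master only averages (w_i + u_i), it never sees any worker's raw data, which mirrors the privacy-preserving communication pattern described in the abstract.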
Pages: 5718-5727
Number of pages: 10