Convergence of Distributed Asynchronous Learning Vector Quantization Algorithms

Cited by: 0
Authors
Patra, Benoit [1,2]
Affiliations
[1] Univ Paris 06, LSTA, F-75252 Paris 05, France
[2] LOKAD SAS, F-75017 Paris, France
Keywords
k-means; vector quantization; distributed; asynchronous; stochastic optimization; scalability; distributed consensus; training distortion; consistency; theorem; rates
DOI
Not available
Chinese Library Classification (CLC)
TP [Automation technology; Computer technology]
Subject Classification Code
0812
Abstract
Motivated by the problem of effectively executing clustering algorithms on very large data sets, we address a model for large-scale distributed clustering methods. To this end, we briefly recall some standard results on the quantization problem and on the almost sure convergence of the competitive learning vector quantization (CLVQ) procedure. A general model for linear distributed asynchronous algorithms, well adapted to several parallel computing architectures, is also discussed. Our approach brings together this scalable model and the CLVQ algorithm; we call the resulting technique the distributed asynchronous learning vector quantization (DALVQ) algorithm. An in-depth analysis of the almost sure convergence of DALVQ is carried out. A striking result is that the multiple versions of the quantizers, distributed among the processors of the parallel architecture, asymptotically reach a consensus almost surely. Furthermore, these versions converge almost surely towards the same nearly optimal value of the quantization criterion.
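To make the two ingredients of DALVQ concrete, the sketch below simulates, in a single process, the online CLVQ update (move the prototype nearest to each incoming sample towards it with a decreasing step size) together with a periodic averaging step across workers that stands in for the asynchronous consensus mechanism. This is a minimal illustration under stated assumptions: the function names, the synchronous uniform averaging schedule (mix_every), and the step size 1/(t+1) are choices made for the example and do not reproduce the paper's exact asynchronous model.

import numpy as np

def clvq_step(prototypes, x, step):
    """One CLVQ (online k-means) step: move the nearest prototype towards x."""
    winner = np.argmin(np.linalg.norm(prototypes - x, axis=1))
    prototypes[winner] += step * (x - prototypes[winner])
    return prototypes

def dalvq_simulation(data_per_worker, n_prototypes, n_steps, mix_every=10, seed=0):
    """Toy single-process simulation of a DALVQ-like scheme.

    Each worker keeps its own version of the prototypes, performs local CLVQ
    steps on its own data stream, and periodically averages its version with
    the other workers' versions (the consensus step). The schedule and the
    uniform averaging weights are illustrative assumptions.
    """
    rng = np.random.default_rng(seed)
    n_workers = len(data_per_worker)
    dim = data_per_worker[0].shape[1]
    # All workers start from the same random initial prototypes.
    init = rng.standard_normal((n_prototypes, dim))
    versions = [init.copy() for _ in range(n_workers)]

    for t in range(1, n_steps + 1):
        step = 1.0 / (t + 1.0)           # decreasing step size (assumption)
        for w in range(n_workers):
            x = data_per_worker[w][rng.integers(len(data_per_worker[w]))]
            clvq_step(versions[w], x, step)
        if t % mix_every == 0:           # consensus: average the local versions
            avg = np.mean(versions, axis=0)
            versions = [avg.copy() for _ in range(n_workers)]
    return versions

# Example: two workers with different 2-D Gaussian data, 3 prototypes each.
if __name__ == "__main__":
    rng = np.random.default_rng(1)
    worker_data = [rng.standard_normal((500, 2)) + c for c in ([0, 0], [5, 5])]
    final_versions = dalvq_simulation(worker_data, n_prototypes=3, n_steps=2000)
    # After the final averaging step, the versions coincide, mirroring
    # (in a much simpler setting) the consensus result of the paper.
    print(np.allclose(final_versions[0], final_versions[1]))

In the asynchronous setting analyzed in the paper, the processors combine delayed versions of each other's quantizers rather than synchronizing exactly as above; the almost sure consensus and convergence results concern that more general scheme.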
Pages: 3431-3466
Number of pages: 36