Adaptive Riemannian stochastic gradient descent and reparameterization for Gaussian mixture model fitting

Cited by: 0
Authors
Ji, Chunlin [1]
Fu, Yuhao [1,2]
He, Ping [3]
Affiliations
[1] Kuang Chi Inst Adv Technol, Shenzhen, Peoples R China
[2] Origin Artificial Intelligence Technol Co, Shenzhen, Peoples R China
[3] HeGuangLiangZi Tech, Shenzhen, Peoples R China
Source
ASIAN CONFERENCE ON MACHINE LEARNING, 2023, Vol. 222
Funding
National Key R&D Program of China
Keywords
Gaussian mixture model; Reparameterization; Symmetric positive definite matrix manifold; Riemannian stochastic gradient descent; Riemannian Adam algorithm; EM algorithm
DOI
Not available
CLC classification
TP18 [Artificial Intelligence Theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
Manifold optimization for the Gaussian mixture model (GMM) has recently attracted increasing interest. In this work, instead of directly performing manifold optimization on the covariance matrices of the GMM, we treat GMM fitting as optimization of the density function over a statistical manifold and use the natural gradient to speed up the optimization. We present an upper bound for the Kullback-Leibler (KL) divergence between two GMMs and obtain simple closed-form expressions for the natural gradients. With these natural gradients, we apply the Riemannian stochastic gradient descent (RSGD) algorithm to optimize the covariance matrices on the symmetric positive definite (SPD) matrix manifold. We further propose a Riemannian Adam (RAdam) algorithm that extends the momentum method and adaptive learning rates from Euclidean space to the SPD manifold. Extensive simulations show that the proposed algorithms scale well to high-dimensional, large-scale datasets and outperform expectation-maximization (EM) algorithms in fitted log-likelihood.
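For concreteness, below is a minimal sketch (not the authors' code) of the kind of update the abstract describes: one RSGD step for a single covariance matrix on the SPD manifold. It assumes the affine-invariant metric, under which the Riemannian gradient is Sigma * sym(G) * Sigma for a Euclidean gradient G, and an exponential-map retraction; the paper's KL upper bound, natural-gradient expressions, and RAdam variant are not reproduced, and all function and variable names are illustrative.

import numpy as np
from scipy.linalg import expm

def spd_sqrt_inv_sqrt(a):
    # Square root and inverse square root of an SPD matrix via eigendecomposition.
    w, v = np.linalg.eigh(a)
    sw = np.sqrt(w)
    return (v * sw) @ v.T, (v / sw) @ v.T

def rsgd_spd_step(sigma, euclid_grad, lr=0.01):
    # One RSGD step on the SPD manifold under the affine-invariant metric.
    # Riemannian gradient: Sigma * sym(G) * Sigma; retraction: exponential map
    # exp_Sigma(xi) = S expm(S^{-1} xi S^{-1}) S, where S = Sigma^{1/2}.
    g = 0.5 * (euclid_grad + euclid_grad.T)      # symmetrize the Euclidean gradient
    riem_grad = sigma @ g @ sigma
    s, s_inv = spd_sqrt_inv_sqrt(sigma)
    return s @ expm(-lr * (s_inv @ riem_grad @ s_inv)) @ s

# Toy check on f(Sigma) = tr(Sigma) + tr(Sigma^{-1} A), whose minimizer is A^{1/2};
# the iterate stays SPD by construction of the exponential map.
rng = np.random.default_rng(0)
d = 4
b = rng.standard_normal((d, d))
a = b @ b.T + d * np.eye(d)                      # SPD "target" matrix
sigma = np.eye(d)
for _ in range(300):
    sigma_inv = np.linalg.inv(sigma)
    euclid_grad = np.eye(d) - sigma_inv @ a @ sigma_inv
    sigma = rsgd_spd_step(sigma, euclid_grad, lr=0.05)
print(np.linalg.norm(sigma @ sigma - a))         # residual should be small

An RAdam-style variant would additionally keep momentum and second-moment statistics in the tangent space and transport them between successive iterates (e.g. by parallel transport under the same metric) before each update; that machinery, and the natural gradients derived from the KL upper bound, are beyond this sketch.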
Pages: 16