Convergence Rates of Gradient Descent and MM Algorithms for Bradley-Terry Models

Cited by: 0
Authors
Vojnovic, Milan [1 ]
Yun, Se-Young [2 ]
Zhou, Kaifang [1 ]
Affiliations
[1] LSE, London, England
[2] Korea Adv Inst Sci & Technol, Daejeon, South Korea
Funding
National Research Foundation of Singapore
Keywords
INCOMPLETE BLOCK DESIGNS; RANK AGGREGATION; OPTIMIZATION
DOI
Not available
CLC Classification Number
TP18 [Artificial Intelligence Theory]
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
We present tight convergence rate bounds for gradient descent and MM algorithms for maximum likelihood (ML) estimation and maximum a posteriori probability (MAP) estimation, a popular Bayesian inference method, for Bradley-Terry models of ranking data. Our results show that MM algorithms have the same convergence rate, up to a constant factor, as gradient descent algorithms with optimal constant step size. For the ML estimation objective, convergence is linear, with the rate crucially determined by the algebraic connectivity of the matrix of item-pair co-occurrences in the observed comparison data. For the MAP estimation objective, we show that the convergence rate is also linear, with the rate determined by a parameter of the prior distribution in a way that can make convergence arbitrarily slow for small values of this parameter; the limit of small values of this parameter corresponds to a flat, non-informative prior distribution.
Pages: 10
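As a concrete illustration of the MM iteration whose convergence rate is analyzed in the abstract, the following is a minimal sketch of the classic Bradley-Terry MM update for ML estimation (in the style of Hunter's MM algorithm). The function name bradley_terry_mm, the wins-matrix input format, the iteration cap, and the stopping tolerance are illustrative assumptions, not details taken from the paper.

import numpy as np

def bradley_terry_mm(wins, num_iters=1000, tol=1e-10):
    """Illustrative MM iteration for Bradley-Terry maximum likelihood.

    wins[i, j] = number of times item i beat item j (assumed input format).
    Assumes the comparison graph is connected and every item has at least
    one win, so the classic MM update is well defined.
    """
    n = wins.shape[0]
    n_pairs = wins + wins.T            # n_ij: total comparisons of pair (i, j)
    total_wins = wins.sum(axis=1)      # W_i: total wins of item i
    w = np.ones(n)                     # initial strength parameters
    for _ in range(num_iters):
        # MM update: w_i <- W_i / sum_{j != i} n_ij / (w_i + w_j)
        denom = n_pairs / (w[:, None] + w[None, :])
        np.fill_diagonal(denom, 0.0)
        w_new = total_wins / denom.sum(axis=1)
        w_new /= w_new.sum()           # strengths are identifiable only up to scale
        if np.max(np.abs(w_new - w)) < tol:
            return w_new
        w = w_new
    return w

# Tiny synthetic example with three items.
wins = np.array([[0., 3., 2.],
                 [1., 0., 4.],
                 [2., 1., 0.]])
print(bradley_terry_mm(wins))

A gradient descent counterpart would instead take constant-step-size gradient steps on the same log-likelihood; per the abstract, with an optimally chosen constant step size its convergence rate matches this MM iteration up to a constant factor.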
Related Papers (50 in total)
  • [41] Convergence rates of an inertial gradient descent algorithm under growth and flatness conditions
    Apidopoulos, Vassilis
    Aujol, Jean-Francois
    Dossal, Charles
    Rondepierre, Aude
    MATHEMATICAL PROGRAMMING, 2021, 187 (1-2) : 151 - 193
  • [43] Impact of Mathematical Norms on Convergence of Gradient Descent Algorithms for Deep Neural Networks Learning
    Cai, Linzhe
    Yu, Xinghuo
    Li, Chaojie
    Eberhard, Andrew
    Lien Thuy Nguyen
    Chuong Thai Doan
    AI 2022: ADVANCES IN ARTIFICIAL INTELLIGENCE, 2022, 13728 : 131 - 144
  • [44] Convergence Rates for the Stochastic Gradient Descent Method for Non-Convex Objective Functions
    Fehrman, Benjamin
    Gess, Benjamin
    Jentzen, Arnulf
    JOURNAL OF MACHINE LEARNING RESEARCH, 2020, 21
  • [46] Tight Nonparametric Convergence Rates for Stochastic Gradient Descent under the Noiseless Linear Model
    Berthier, Raphael
    Bach, Francis
    Gaillard, Pierre
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 33, NEURIPS 2020, 2020, 33
  • [48] Convergence rates of accelerated proximal gradient algorithms under independent noise
    Sun, Tao
    Barrio, Roberto
    Jiang, Hao
    Cheng, Lizhi
    NUMERICAL ALGORITHMS, 2019, 81 (02) : 631 - 654
  • [50] Forecasting with imperfect models, dynamically constrained inverse problems, and gradient descent algorithms
    Judd, Kevin
    PHYSICA D-NONLINEAR PHENOMENA, 2008, 237 (02) : 216 - 232