Tightening Mutual Information Based Bounds on Generalization Error

Cited: 0
Authors
Bu, Yuheng [1 ]
Zou, Shaofeng [2 ]
Veeravalli, Venugopal V. [1 ]
Affiliations
[1] Univ Illinois, Urbana, IL 61801 USA
[2] SUNY Buffalo, Buffalo, NY USA
Keywords
STABILITY;
DOI
10.1109/isit.2019.8849590
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology]
Discipline Code
0812
Abstract
A mutual information based upper bound on the generalization error of a supervised learning algorithm is derived in this paper. The bound is constructed in terms of the mutual information between each individual training sample and the output of the learning algorithm; it requires weaker conditions on the loss function, yet provides a tighter characterization of the generalization error than existing studies. Examples are provided to demonstrate that the derived bound is both tighter and applicable in a broader range of settings. An application to noisy, iterative algorithms, e.g., stochastic gradient Langevin dynamics (SGLD), is also studied, where the constructed bound again provides a tighter characterization of the generalization error than existing results.
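To make the abstract's central quantity concrete, the individual-sample bound can be sketched as follows. This is a sketch under assumed notation (the record itself states no formulas): W denotes the algorithm output, S = (Z_1, ..., Z_n) the training set, L_mu the population risk, and L_S the empirical risk, with the loss assumed sigma-sub-Gaussian.

```latex
% Individual-sample mutual information bound (sketch; assumes the loss
% \ell(w, Z) is \sigma-sub-Gaussian under Z \sim \mu for every fixed w):
\left| \mathbb{E}\!\left[ L_{\mu}(W) - L_{S}(W) \right] \right|
  \;\le\; \frac{1}{n} \sum_{i=1}^{n} \sqrt{2\sigma^{2}\, I(W; Z_i)}
```

Here I(W; Z_i) is the mutual information between the output and the i-th training sample. Earlier full-dataset bounds in this line of work instead involve I(W; S); since each per-sample term I(W; Z_i) can remain finite even when I(W; S) is large or infinite, the per-sample form can be both tighter and more broadly applicable, which is the improvement the abstract describes.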
Pages: 587-591 (5 pages)