The information bottleneck problem and its applications in machine learning

Cited by: 82
Authors
Goldfeld Z. [1]
Polyanskiy Y. [2]
Affiliations
[1] The Electrical and Computer Engineering Department, Cornell University, Ithaca, NY 14850
[2] The Department of Electrical Engineering and Computer Science, Massachusetts Institute of Technology, Cambridge, MA 02139
Keywords
Deep learning; Information bottleneck; Machine learning; Mutual information; Neural networks
DOI: 10.1109/JSAIT.2020.2991561
Abstract
Inference capabilities of machine learning (ML) systems have skyrocketed in recent years, and ML now plays a pivotal role in various aspects of society. The goal in statistical learning is to use data to obtain simple algorithms for predicting a random variable Y from a correlated observation X. Since the dimension of X is typically huge, computationally feasible solutions should summarize it into a lower-dimensional feature vector T, from which Y is predicted. The algorithm will make successful predictions if T is a good proxy of Y, despite this dimensionality reduction. A myriad of ML algorithms (mostly employing deep learning (DL)) for finding such representations T from real-world data are now available. While these methods are effective in practice, their success is hindered by the lack of a comprehensive theory to explain it. The information bottleneck (IB) theory recently emerged as a bold information-theoretic paradigm for analyzing DL systems. Adopting mutual information as the figure of merit, it suggests that the best representation T should be maximally informative about Y while minimizing its mutual information with X. In this tutorial we survey the information-theoretic origins of this abstract principle and its recent impact on DL. For the latter, we cover implications of the IB problem for DL theory, as well as practical algorithms inspired by it. Our goal is to provide a unified and cohesive description. A clear view of current knowledge is important for further leveraging IB and other information-theoretic ideas to study DL models. © 2020 IEEE.
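The trade-off stated in the abstract — a representation T maximally informative about Y while minimizing mutual information with X — is conventionally written as the IB Lagrangian over stochastic encoders, with a trade-off parameter β > 0 balancing compression against relevance:

```latex
% Information bottleneck Lagrangian (Tishby, Pereira & Bialek).
% T is produced from X by a stochastic encoder P_{T|X},
% so the variables form the Markov chain Y -- X -- T.
\min_{P_{T|X}} \; I(X;T) \;-\; \beta \, I(T;Y), \qquad \beta > 0
```

Small β favors aggressive compression of X; large β favors retaining information about Y.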
Pages: 19-38 (19 pages)