Mechanism for feature learning in neural networks and backpropagation-free machine learning models

Cited by: 12
Authors:
Radhakrishnan, Adityanarayanan [1 ,2 ]
Beaglehole, Daniel [3 ]
Pandit, Parthe [4 ,5 ]
Belkin, Mikhail [3 ,5 ]
Affiliations:
[1] Harvard Sch Engn & Appl Sci, Cambridge, MA 02138 USA
[2] Broad Inst MIT & Harvard, Cambridge, MA 02142 USA
[3] Univ Calif San Diego, Comp Sci & Engn, La Jolla, CA 92093 USA
[4] Indian Inst Technol, Ctr Machine Intelligence & Data Sci, Mumbai 400076, India
[5] Univ Calif San Diego, Halicioglu Data Sci Inst, La Jolla, CA 92093 USA
Funding:
U.S. National Science Foundation
Keywords:
REGRESSION
DOI:
10.1126/science.adi5639
Chinese Library Classification (CLC):
O [Mathematical Sciences and Chemistry]; P [Astronomy and Earth Sciences]; Q [Biological Sciences]; N [General Natural Sciences]
Discipline classification codes:
07; 0710; 09
Abstract:
Understanding how neural networks learn features, or relevant patterns in data, for prediction is necessary for their reliable use in technological and scientific applications. In this work, we presented a unifying mathematical mechanism, known as average gradient outer product (AGOP), that characterized feature learning in neural networks. We provided empirical evidence that AGOP captured features learned by various neural network architectures, including transformer-based language models, convolutional networks, multilayer perceptrons, and recurrent neural networks. Moreover, we demonstrated that AGOP, which is backpropagation-free, enabled feature learning in machine learning models, such as kernel machines, that a priori could not identify task-specific features. Overall, we established a fundamental mechanism that captured feature learning in neural networks and enabled feature learning in general machine learning models.
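As a rough illustration of the quantity named in the abstract, the sketch below computes an AGOP-style matrix for a toy predictor. It is a minimal sketch, assuming the usual definition of the average gradient outer product as the mean, over training inputs, of the outer product of the predictor's input gradient with itself; the toy predictor, the finite-difference gradients, and all function names are illustrative assumptions, not the authors' implementation.

```python
# Minimal illustrative sketch of an average gradient outer product (AGOP),
# assuming AGOP(f, X) = (1/n) * sum_i grad f(x_i) grad f(x_i)^T.
# The toy predictor and finite-difference gradients are illustrative choices,
# not the implementation from the paper.
import numpy as np

def toy_predictor(x):
    """Toy scalar predictor that depends only on the first two coordinates."""
    return np.sin(x[0]) + 0.5 * x[1] ** 2

def numerical_gradient(f, x, eps=1e-5):
    """Central finite-difference estimate of the gradient of f at x."""
    grad = np.zeros_like(x)
    for j in range(x.size):
        step = np.zeros_like(x)
        step[j] = eps
        grad[j] = (f(x + step) - f(x - step)) / (2 * eps)
    return grad

def agop(f, X):
    """Average of the outer products of input gradients over the rows of X."""
    d = X.shape[1]
    G = np.zeros((d, d))
    for x in X:
        g = numerical_gradient(f, x)
        G += np.outer(g, g)
    return G / X.shape[0]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 10))  # 200 samples in 10 dimensions
    G = agop(toy_predictor, X)
    # The diagonal of G is large only on the coordinates the predictor actually
    # uses, so the AGOP surfaces the input directions relevant to prediction.
    print(np.round(np.diag(G), 3))
```

In this toy setting the first two diagonal entries dominate, which mirrors the abstract's claim that the AGOP highlights the input directions, or features, that a predictor relies on.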
Pages: 1461-1467
Number of pages: 7
Related papers (50 total):
  • [1] Backpropagation-free Graph Neural Networks
    Pasa, Luca
    Navarin, Nicolo
    Erb, Wolfgang
    Sperduti, Alessandro
    2022 IEEE INTERNATIONAL CONFERENCE ON DATA MINING (ICDM), 2022, : 388 - 397
  • [2] BAFFLE: A Baseline of Backpropagation-Free Federated Learning
    Feng, Haozhe
    Pang, Tianyu
    Du, Chao
    Chen, Wei
    Yan, Shuicheng
    Lin, Min
    COMPUTER VISION - ECCV 2024, PT LXXV, 2025, 15133 : 89 - 109
  • [3] Backpropagation-free training of deep physical neural networks
    Momeni, Ali
    Rahmani, Babak
    Mallejac, Matthieu
    del Hougne, Philipp
    Fleury, Romain
    SCIENCE, 2023, 382 (6676) : 1297 - 1303
  • [4] Emergent Self-Adaptation in an Integrated Photonic Neural Network for Backpropagation-Free Learning
    Lugnan, Alessio
    Aggarwal, Samarth
    Brueckerhoff-Plueckelmann, Frank
    Wright, C. David
    Pernice, Wolfram H. P.
    Bhaskaran, Harish
    Bienstman, Peter
    ADVANCED SCIENCE, 2025, 12 (02)
  • [5] Monadic Pavlovian associative learning in a backpropagation-free photonic network
    Tan, James Y. S.
    Cheng, Zengguang
    Feldmann, Johannes
    Li, Xuan
    Youngblood, Nathan
    Ali, Utku E.
    Wright, C. David
    Pernice, Wolfram H. P.
    Bhaskaran, Harish
    OPTICA, 2022, 9 (07) : 792 - 802
  • [6] Backpropagation-Free Deep Learning with Recursive Local Representation Alignment
    Ororbia, Alexander G.
    Mali, Ankur
    Kifer, Daniel
    Giles, C. Lee
    THIRTY-SEVENTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, VOL 37 NO 8, 2023, : 9327 - 9335
  • [7] A unified framework for backpropagation-free soft and hard gated graph neural networks
    Pasa, Luca
    Navarin, Nicolo
    Erb, Wolfgang
    Sperduti, Alessandro
    KNOWLEDGE AND INFORMATION SYSTEMS, 2024, 66 (04) : 2393 - 2416
  • [8] Learning quantum-state feedback control with backpropagation-free stochastic optimization
    Evans, Ethan N.
    Wang, Ziyi
    Frim, Adam G.
    DeWeese, Michael R.
    Theodorou, Evangelos A.
    PHYSICAL REVIEW A, 2022, 106 (05)
  • [9] Quantum Neural Machine Learning: Backpropagation and Dynamics
    Goncalves, Carlos Pedro
    NEUROQUANTOLOGY, 2017, 15 (01) : 22 - 41