Information-Theoretical Foundations of Hebbian Learning

Cited: 0
Authors
Gros, Claudius [1]
Echeveste, Rodrigo [1,2]
Affiliations
[1] Goethe Univ Frankfurt, Inst Theoret Phys, Frankfurt, Germany
[2] Univ Cambridge, Dept Engn, Cambridge, England
Keywords
Information theory; Hebbian learning; Stationarity principle
DOI
Not available
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
Neural information processing includes the extraction of information present in the statistics of afferent signals. For this, the afferent synaptic weights w_j are continuously adapted, changing in turn the distribution p_θ(y) of the post-synaptic neural activity y; here θ denotes the relevant neural parameters. The functional form of p_θ(y) will hence continue to evolve as long as learning is ongoing, becoming stationary only when learning is completed. This stationarity principle can be captured by the Fisher information F_θ = ∫ p_θ(y) ( ∂/∂θ ln p_θ(y) )² dy, with ∂/∂θ ≡ Σ_j w_j ∂/∂w_j, of the neural activity with respect to the afferent synaptic weights w_j. It then follows that Hebbian learning rules may be derived by minimizing F_θ. The precise functional form of the learning rules then depends on the shape of the transfer function y = g(x) relating the membrane potential x to the activity y. The learning rules derived from the stationarity principle are self-limiting (runaway synaptic growth does not occur), performing a standard principal component analysis whenever a direction with large variance is present in the space of input activities. Generically, directions of input activities having a negative excess kurtosis are preferred, making the rules suitable for independent component analysis (ICA; see figure). Moreover, when only the exponential foot of g is considered (the low-activity regime), the standard Hebbian learning rule, without reversal, is recovered.
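The self-limiting, PCA-performing behaviour described in the abstract can be illustrated with Oja's rule, a standard textbook example of a self-limiting Hebbian rule; note this is a stand-in sketch, not the specific rules the paper derives from the Fisher-information stationarity principle. The input dimensions, variances, and learning rate below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic inputs with one high-variance direction (the first axis has
# standard deviation 3.0, the second only 0.5).
X = rng.normal(size=(5000, 2)) * np.array([3.0, 0.5])

w = rng.normal(size=2)  # afferent synaptic weights, random start
eta = 0.01              # learning rate (illustrative choice)

for x in X:
    y = w @ x                   # linear post-synaptic activity
    w += eta * y * (x - y * w)  # Hebbian term y*x with a decay term y^2 * w
                                # that keeps the weight norm bounded

# The decay term makes the rule self-limiting: |w| settles near 1 instead of
# growing without bound, and w aligns with the high-variance (principal) axis.
print(np.abs(w), np.linalg.norm(w))
```

The decay term `- eta * y**2 * w` is what prevents runaway synaptic growth; dropping it recovers plain Hebbian learning, whose weights diverge under repeated correlated input.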
Pages: 560 - 560 (1 page)