Information-Theoretical Foundations of Hebbian Learning

Times Cited: 0
Authors
Gros, Claudius [1]
Echeveste, Rodrigo [1,2]
Affiliations
[1] Goethe Univ Frankfurt, Inst Theoret Phys, Frankfurt, Germany
[2] Univ Cambridge, Dept Engn, Cambridge, England
Keywords
Information theory; Hebbian learning; Stationarity principle
DOI
Not available
CLC Classification
TP18 [Artificial Intelligence Theory]
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
Neural information processing includes the extraction of information present in the statistics of afferent signals. For this, the afferent synaptic weights $w_j$ are continuously adapted, changing in turn the distribution $p_\theta(y)$ of the post-synaptic neural activity $y$. Here $\theta$ denotes the relevant neural parameters. The functional form of $p_\theta(y)$ will hence continue to evolve as long as learning is ongoing, becoming stationary only when learning is completed. This stationarity principle can be captured by the Fisher information $F_\theta = \int p_\theta(y)\left(\frac{\partial}{\partial\theta}\ln p_\theta(y)\right)^2 dy$, with $\frac{\partial}{\partial\theta} \equiv \sum_j w_j \frac{\partial}{\partial w_j}$, of the neural activity with respect to the afferent synaptic weights $w_j$. It then follows that Hebbian learning rules may be derived by minimizing $F_\theta$. The precise functional form of the learning rules then depends on the shape of the transfer function $y = g(x)$ relating the membrane potential $x$ to the activity $y$. The learning rules derived from the stationarity principle are self-limiting (runaway synaptic growth does not occur) and perform a standard principal component analysis whenever a direction with large variance is present in the space of input activities. Generically, directions of input activities having a negative excess kurtosis are preferred, making the rules suitable for independent component analysis (ICA; see figure). Moreover, when only the exponential foot of $g$ is considered (low-activity regime), the standard Hebbian learning rule, without reversal, is recovered.
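As an illustrative stand-in for the self-limiting behaviour described above (not the rule derived in the paper, which follows from minimizing $F_\theta$ for a specific transfer function $g$), the sketch below uses Oja's rule, a standard self-limiting Hebbian update that likewise avoids runaway synaptic growth and converges to the principal component of the input statistics. All variable names, the linear transfer function, and the parameter values are assumptions made only for this demonstration.

import numpy as np

rng = np.random.default_rng(0)

# Synthetic afferent inputs with one large-variance direction (the principal component).
n_samples, n_inputs = 5000, 10
cov = np.eye(n_inputs)
cov[0, 0] = 9.0                             # large variance along the first input axis
X = rng.multivariate_normal(np.zeros(n_inputs), cov, size=n_samples)

w = rng.normal(scale=0.1, size=n_inputs)    # afferent weights w_j (assumed initialization)
eta = 1e-3                                  # learning rate (assumed value)

for x in X:
    y = w @ x                               # post-synaptic activity; g(x) = x for simplicity
    w += eta * y * (x - y * w)              # Oja's rule: Hebbian term y*x minus decay y^2*w

print("weight norm:", np.linalg.norm(w))    # stays close to 1: no runaway growth
print("overlap with large-variance direction:", abs(w[0]) / np.linalg.norm(w))  # approaches 1

Running the sketch shows the two properties the abstract attributes to the stationarity-derived rules, bounded weights and alignment with the high-variance input direction, in their simplest textbook form.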
Pages: 560 - 560
Number of pages: 1