Information-Theoretical Foundations of Hebbian Learning

Cited by: 0
Authors:
Gros, Claudius [1]
Echeveste, Rodrigo [1,2]
Affiliations:
[1] Goethe Univ Frankfurt, Inst Theoret Phys, Frankfurt, Germany
[2] Univ Cambridge, Dept Engn, Cambridge, England
Keywords:
Information theory; Hebbian learning; Stationarity principle
DOI: not available
CLC number: TP18 [Artificial intelligence theory]
Subject classification codes: 081104; 0812; 0835; 1405
Abstract:
Neural information processing includes the extraction of information present in the statistics of afferent signals. To this end, the afferent synaptic weights $w_j$ are continuously adapted, changing in turn the distribution $p_\theta(y)$ of the post-synaptic neural activity $y$; here $\theta$ denotes the relevant neural parameters. The functional form of $p_\theta(y)$ hence continues to evolve as long as learning is ongoing, becoming stationary only when learning is completed. This stationarity principle can be captured by the Fisher information $F_\theta = \int p_\theta(y)\,\big(\tfrac{\partial}{\partial\theta}\ln p_\theta(y)\big)^2\,dy$, with $\tfrac{\partial}{\partial\theta} \equiv \sum_j w_j \tfrac{\partial}{\partial w_j}$, of the neural activity with respect to the afferent synaptic weights $w_j$. It then follows that Hebbian learning rules may be derived by minimizing $F_\theta$. The precise functional form of the learning rules then depends on the shape of the transfer function $y = g(x)$ relating the membrane potential $x$ to the activity $y$. The learning rules derived from the stationarity principle are self-limiting (runaway synaptic growth does not occur) and perform a standard principal component analysis whenever a direction with large variance is present in the space of input activities. Generically, directions of input activities having a negative excess kurtosis are preferred, making the rules suitable for independent component analysis (ICA; see figure). Moreover, when only the exponential foot of $g$ is considered (the low-activity regime), the standard Hebbian learning rule, without reversal, is recovered.
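The behavior the abstract describes, a self-limiting Hebbian rule that avoids runaway synaptic growth and extracts the principal component of the input statistics, can be illustrated with a minimal sketch. Note the hedge: the update below is Oja's rule, a classic self-limiting Hebbian rule used here only as a stand-in; it is not the Fisher-information-derived rule of the paper, and the data setup is invented for illustration.

```python
import numpy as np

# Illustrative sketch only: Oja's rule as an example of a self-limiting
# Hebbian update that performs PCA, analogous in behavior (bounded weights,
# principal-component extraction) to the rules discussed in the abstract.
rng = np.random.default_rng(0)

# 2-D inputs with an injected high-variance direction along u = (1, 1)/sqrt(2)
u = np.array([1.0, 1.0]) / np.sqrt(2.0)
x = rng.normal(size=(20000, 2))
x += 3.0 * rng.normal(size=(20000, 1)) * u   # variance ~10 along u, ~1 across

w = rng.normal(size=2)           # afferent synaptic weights w_j
eta = 0.01                       # learning rate
for xi in x:
    y = w @ xi                   # linear neuron: y = sum_j w_j x_j
    w += eta * y * (xi - y * w)  # Hebbian term y*x_j minus self-limiting decay

print(np.linalg.norm(w))         # stays bounded near 1: no runaway growth
print(abs(w @ u))                # aligns with the high-variance direction
```

The decay term $-\eta\, y^2 w$ is what makes the rule self-limiting: it keeps $|w|$ near unity without any explicit normalization step, so the weight vector converges onto the leading principal direction of the inputs.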
Pages: 560-560 (1 page)