Structured information in sparse-code metric neural networks

Cited by: 9
Authors
Dominguez, David [1 ]
Gonzalez, Mario [1 ]
Rodriguez, Francisco B. [1 ]
Serrano, Eduardo [1 ]
Erichsen, R., Jr. [2 ]
Theumann, W. K. [2 ]
Affiliations
[1] Univ Autonoma Madrid, EPS, Dept Ingn Informat, E-28049 Madrid, Spain
[2] Univ Fed Rio Grande do Sul, Inst Fis, BR-91501970 Porto Alegre, RS, Brazil
Keywords
Associative memory; Network topology; Threshold dynamics; Structured information; Small world; Attractor; Neurons
DOI
10.1016/j.physa.2011.09.002
CLC classification
O4 [Physics];
Discipline code
0702;
Abstract
Sparse-code networks have retrieval abilities that depend strongly on the neurons' firing threshold. If the connections are spatially uniform, the macroscopic properties of the network can be measured by the overlap between neurons and learned patterns, and by the global activity. However, in nonuniform networks, for instance small-world networks, the neurons can retrieve fragments of patterns without performing global retrieval, so local overlaps are needed to describe the network. We characterize the structure type of the neural states using a parameter related to fluctuations of the local overlaps, distinguishing between bump and block phases. Simulation of the neural dynamics shows a competition between localized (bump), structured (block) and global retrieval. When the randomness of the network topology increases, the phase diagram shows a transition from local to global retrieval. Furthermore, the local phase splits into a bump phase at low activity and a block phase at high activity. A theoretical approach solves the asymptotic limit of the model and confirms the simulation results, which predict a change of stability from bumps to blocks as the storage ratio increases. (C) 2011 Elsevier B.V. All rights reserved.
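The order parameters mentioned in the abstract (global activity, global overlap, local overlaps, and a fluctuation-based structure parameter) can be illustrated with a minimal sketch. This is not the paper's model; it assumes a simple binary sparse-code network with one stored pattern, a hand-built fragment-retrieval state, and illustrative names (`xi`, `sigma`, `m_local`, window width `W`).

```python
import numpy as np

rng = np.random.default_rng(0)

N = 1000   # number of neurons
a = 0.1    # sparse coding level (fraction of active units in a pattern)
W = 50     # half-width of the local averaging window (assumed, for illustration)

# One stored binary pattern with mean activity a, and a network state that
# retrieves only a contiguous fragment of it (a localized, "bump"-like state).
xi = (rng.random(N) < a).astype(float)
sigma = np.zeros(N)
sigma[:N // 4] = xi[:N // 4]                                 # retrieved fragment
sigma[N // 4:] = (rng.random(3 * N // 4) < a).astype(float)  # noise elsewhere

# Global activity and global overlap (standard sparse-code normalization).
q = sigma.mean()
m_global = ((xi - a) * sigma).sum() / (a * (1 - a) * N)

# Local overlaps: the same quantity restricted to a window around each site i.
m_local = np.array([
    ((xi[max(0, i - W):i + W] - a) * sigma[max(0, i - W):i + W]).sum()
    / (a * (1 - a) * (min(N, i + W) - max(0, i - W)))
    for i in range(N)
])

# A structure parameter built from fluctuations of the local overlaps:
# large when retrieval is spatially localized, near zero when uniform.
s = m_local.std()
print(f"global overlap {m_global:.2f}, activity {q:.2f}, structure {s:.2f}")
```

In this toy state the local overlap is close to 1 inside the retrieved fragment and near 0 elsewhere, so the fluctuation parameter `s` is large even though the global overlap is modest, which is exactly why local overlaps are needed to detect fragment retrieval.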
Pages: 799-808
Page count: 10
Related papers
50 records in total
  • [1] Sparse-code Muscle Representation for Human Walking
    Iyer, Rahul
    Ballard, Dana
    2010 3RD IEEE RAS AND EMBS INTERNATIONAL CONFERENCE ON BIOMEDICAL ROBOTICS AND BIOMECHATRONICS, 2010, : 491 - 497
  • [2] Interleaved Structured Sparse Convolutional Neural Networks
    Xie, Guotian
    Wang, Jingdong
    Zhang, Ting
    Lai, Jianhuang
    Hong, Richang
    Qi, Guo-Jun
    2018 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR), 2018, : 8847 - 8856
  • [3] A LARGE-SCALE EXTENSION OF SPARSE-CODE MULTIPLE-ACCESS SYSTEM
    Yang, Chao
    Jing, Shusen
    Liang, Xiao
    Zhang, Zaichen
    You, Xiaohu
    Zhang, Chuan
    2018 IEEE GLOBAL CONFERENCE ON SIGNAL AND INFORMATION PROCESSING (GLOBALSIP 2018), 2018, : 848 - 852
  • [4] SPARSE CODING AND INFORMATION IN HEBBIAN NEURAL NETWORKS
    Perez-Vicente, CJ
    EUROPHYSICS LETTERS, 1989, 10 (07): : 621 - 625
  • [5] Sparse bursts optimize information transmission in a multiplexed neural code
    Naud, Richard
    Sprekeler, Henning
    PROCEEDINGS OF THE NATIONAL ACADEMY OF SCIENCES OF THE UNITED STATES OF AMERICA, 2018, 115 (27) : E6329 - E6338
  • [6] Compression of Deep Neural Networks with Structured Sparse Ternary Coding
    Boo, Yoonho
    Sung, Wonyong
    JOURNAL OF SIGNAL PROCESSING SYSTEMS FOR SIGNAL IMAGE AND VIDEO TECHNOLOGY, 2019, 91 (09): : 1009 - 1019
  • [7] Compression of Deep Neural Networks with Structured Sparse Ternary Coding
    Yoonho Boo
    Wonyong Sung
    Journal of Signal Processing Systems, 2019, 91 : 1009 - 1019
  • [8] A Model for Structured Information Representation in Neural Networks of the Brain
    Muller, Michael G.
    Papadimitriou, Christos H.
    Maass, Wolfgang
    Legenstein, Robert
    ENEURO, 2020, 7 (03)
  • [9] MARTINGALE APPROACH TO NEURAL NETWORKS WITH HIERARCHICALLY STRUCTURED INFORMATION
    Bos, S
    Kuhn, R
    van Hemmen, JL
    ZEITSCHRIFT FUR PHYSIK B-CONDENSED MATTER, 1988, 71 (02): : 261 - 271
  • [10] Structured information in small-world neural networks
    Dominguez, David
    Gonzalez, Mario
    Serrano, Eduardo
    Rodriguez, Francisco B.
    PHYSICAL REVIEW E, 2009, 79 (02):