Energy-Efficient Neural Information Processing in Individual Neurons and Neuronal Networks

Cited by: 71
Authors
Yu, Lianchun [1 ]
Yu, Yuguo [2 ,3 ,4 ]
Affiliations
[1] Lanzhou Univ, Inst Theoret Phys, Key Lab Magnetism & Magnet Mat, Minist Educ, Lanzhou, Peoples R China
[2] Fudan Univ, Sch Life Sci, Shanghai 200433, Peoples R China
[3] Fudan Univ, State Key Lab Med Neurobiol, Inst Brain Sci, Shanghai 200433, Peoples R China
[4] Fudan Univ, Collaborat Innovat Ctr Brain Sci, Shanghai 200433, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
energy efficiency; information processing; metabolic energy cost; evolution; excitation/inhibition balance; sparse coding; SMALL-WORLD NETWORKS; CORTICAL NETWORKS; ACTION-POTENTIALS; CHANNEL NOISE; BRAIN; INHIBITION; COMPUTATION; DYNAMICS; TEMPERATURE; EXCITATION;
DOI
10.1002/jnr.24131
CLC number
Q189 [Neuroscience];
Discipline code
071006;
Abstract
Brains are composed of networks of an enormous number of neurons interconnected with synapses. Neural information is carried by electrical signals within neurons and chemical signals among neurons. Generating these electrical and chemical signals is metabolically expensive. The fundamental issue raised here is whether brains have evolved efficient ways of developing an energy-efficient neural code from the molecular level to the circuit level. Here, we summarize the factors and biophysical mechanisms that could contribute to an energy-efficient neural code for processing input signals. These factors range from ion channel kinetics, body temperature, axonal propagation of action potentials, low-probability release of synaptic neurotransmitters, optimal input and noise, the size of neurons and neuronal clusters, excitation/inhibition balance, coding strategy, and cortical wiring to the organization of functional connectivity. Both experimental and computational evidence suggests that neural systems may use these factors to maximize the efficiency of energy consumption in processing neural signals. Studies indicate that efficient energy utilization may be universal in neuronal systems as an evolutionary consequence of the pressure of limited energy. As a result, neuronal connections may be wired in a highly economical manner to lower energy and space costs. Individual neurons within a network may encode independent stimulus components, allowing a minimal number of neurons to represent whole stimulus characteristics efficiently. This basic principle may fundamentally change our view of how billions of neurons organize themselves into complex circuits to operate and generate the most powerful intelligent cognition in nature. (C) 2017 Wiley Periodicals, Inc.
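The sparse-coding argument in the abstract can be illustrated with a toy calculation (a minimal sketch, not taken from the paper; the binary-symbol cost model, the overhead parameter `r`, and all function names below are illustrative assumptions in the spirit of energy-efficient binary neural codes): a neuron that spikes with probability p per time bin conveys at most H(p) bits per symbol, at a metabolic cost of a fixed resting overhead r plus an expected spike cost p. Maximizing bits per unit cost then favors firing probabilities well below the entropy-maximizing p = 0.5 whenever spikes are expensive relative to rest.

```python
import math

def entropy_bits(p):
    """Shannon entropy (bits) of one binary spike/no-spike symbol."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * math.log2(p) - (1.0 - p) * math.log2(1.0 - p)

def efficiency(p, r):
    """Bits transmitted per unit metabolic cost.

    Assumed cost model: each symbol costs r (resting overhead,
    in units of one spike's cost) plus p (expected spike cost).
    """
    return entropy_bits(p) / (r + p)

def optimal_firing_prob(r, grid=10000):
    """Grid-search the firing probability that maximizes efficiency."""
    best_p, best_e = 0.0, 0.0
    for i in range(1, grid):
        p = i / grid
        e = efficiency(p, r)
        if e > best_e:
            best_p, best_e = p, e
    return best_p

# Cheaper resting overhead (small r) favors sparser codes:
for r in (0.1, 1.0, 10.0):
    print(f"r = {r:>4}: optimal p = {optimal_firing_prob(r):.3f}")
```

When the resting overhead dominates (large r), the optimum approaches the entropy-maximizing p = 0.5; when spike costs dominate (small r), sparse firing wins — consistent with the abstract's point that a minimal number of active neurons can represent stimuli efficiently under energy pressure.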
Pages: 2253 - 2266
Number of pages: 14