Energy-Efficient Neural Information Processing in Individual Neurons and Neuronal Networks

Cited by: 71
Authors
Yu, Lianchun [1 ]
Yu, Yuguo [2 ,3 ,4 ]
Affiliations
[1] Lanzhou Univ, Inst Theoret Phys, Key Lab Magnetism & Magnet Mat, Minist Educ, Lanzhou, Peoples R China
[2] Fudan Univ, Sch Life Sci, Shanghai 200433, Peoples R China
[3] Fudan Univ, State Key Lab Med Neurobiol, Inst Brain Sci, Shanghai 200433, Peoples R China
[4] Fudan Univ, Collaborat Innovat Ctr Brain Sci, Shanghai 200433, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
energy efficiency; information processing; metabolic energy cost; evolution; excitation/inhibition balance; sparse coding; SMALL-WORLD NETWORKS; CORTICAL NETWORKS; ACTION-POTENTIALS; CHANNEL NOISE; BRAIN; INHIBITION; COMPUTATION; DYNAMICS; TEMPERATURE; EXCITATION;
DOI
10.1002/jnr.24131
Chinese Library Classification
Q189 [Neuroscience]
Discipline code
071006
Abstract
Brains are composed of networks of an enormous number of neurons interconnected by synapses. Neural information is carried by electrical signals within neurons and chemical signals between neurons. Generating these electrical and chemical signals is metabolically expensive. The fundamental issue raised here is whether brains have evolved efficient ways of developing an energy-efficient neural code from the molecular level to the circuit level. Here, we summarize the factors and biophysical mechanisms that could contribute to an energy-efficient neural code for processing input signals. These factors include ion channel kinetics, body temperature, axonal propagation of action potentials, low-probability release of synaptic neurotransmitters, optimal input and noise levels, the size of neurons and neuronal clusters, excitation/inhibition balance, coding strategy, cortical wiring, and the organization of functional connectivity. Both experimental and computational evidence suggests that neural systems may use these factors to maximize the efficiency of energy consumption when processing neural signals. Studies indicate that efficient energy utilization may be universal in neuronal systems, an evolutionary consequence of the pressure of limited energy supply. As a result, neuronal connections may be wired in a highly economical manner to minimize energy and space costs. Individual neurons within a network may encode independent stimulus components, allowing a minimal number of neurons to represent whole stimulus characteristics efficiently. This basic principle may fundamentally change our view of how billions of neurons organize themselves into complex circuits to operate and generate the most powerful intelligent cognition in nature. (C) 2017 Wiley Periodicals, Inc.
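The sparse-coding argument in the abstract can be illustrated with a minimal toy model (not from the paper; the cost values are illustrative assumptions in the spirit of Levy-and-Baxter-style energy-efficient coding). A neuron is treated as a binary spike/no-spike channel: when a spike costs far more energy than silence, the firing probability that maximizes information transmitted per unit energy falls well below 0.5, i.e., the energy-optimal code is sparse:

```python
import math

def entropy_bits(p):
    """Shannon entropy (bits) of a binary spike/no-spike code with firing probability p."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bits_per_energy(p, resting_cost=1.0, spike_cost=10.0):
    """Information per unit metabolic energy.

    Energy per coding interval = fixed resting cost + expected spiking cost.
    The 10:1 spike-to-resting cost ratio is an illustrative assumption.
    """
    return entropy_bits(p) / (resting_cost + p * spike_cost)

# Sweep firing probabilities; the energy-optimal code turns out to be sparse (p* < 0.5),
# even though p = 0.5 would maximize raw information.
ps = [i / 1000 for i in range(1, 1000)]
best_p = max(ps, key=bits_per_energy)
print(f"energy-optimal firing probability: {best_p:.3f}")
```

With the assumed 10:1 cost ratio the optimum lies near p ≈ 0.16; raising the relative spike cost pushes it lower still, which is the intuition behind sparse coding as an energy-saving strategy.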
Pages: 2253-2266 (14 pages)
Related papers
50 records
  • [1] Information Transfer by Energy-Efficient Neurons
    Berger, Toby
    Levy, William B.
    2009 IEEE INTERNATIONAL SYMPOSIUM ON INFORMATION THEORY, VOLS 1-4, 2009: 1584+
  • [2] E-PUR: An Energy-Efficient Processing Unit for Recurrent Neural Networks
    Silfa, Franyell
    Dot, Gem
    Arnau, Jose-Maria
    Gonzalez, Antonio
    27TH INTERNATIONAL CONFERENCE ON PARALLEL ARCHITECTURES AND COMPILATION TECHNIQUES (PACT 2018), 2018
  • [3] Energy-Efficient Image Processing Using Binary Neural Networks with Hadamard Transform
    Park, Jaeyoon
    Lee, Sunggu
    COMPUTER VISION - ACCV 2022, PT V, 2023, 13845: 512-526
  • [4] Energy-Efficient Convolutional Neural Networks with Deterministic Bit-Stream Processing
    Faraji, S. Rasoul
    Najafi, M. Hassan
    Li, Bingzhe
    Lilja, David J.
    Bazargan, Kia
    2019 DESIGN, AUTOMATION & TEST IN EUROPE CONFERENCE & EXHIBITION (DATE), 2019: 1757-1762
  • [5] Neural Dynamics Pruning for Energy-Efficient Spiking Neural Networks
    Huang, Haoyu
    He, Linxuan
    Liu, Faqiang
    Zhao, Rong
    Shi, Luping
    2024 IEEE INTERNATIONAL CONFERENCE ON MULTIMEDIA AND EXPO, ICME 2024, 2024
  • [6] BitSNNs: Revisiting Energy-Efficient Spiking Neural Networks
    Hu, Yangfan
    Zheng, Qian
    Pan, Gang
    IEEE TRANSACTIONS ON COGNITIVE AND DEVELOPMENTAL SYSTEMS, 2024, 16 (05): 1736-1747
  • [7] AutoSNN: Towards Energy-Efficient Spiking Neural Networks
    Na, Byunggook
    Mok, Jisoo
    Park, Seongsik
    Lee, Dongjin
    Choe, Hyeokjun
    Yoon, Sungroh
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 162, 2022
  • [8] DRMap: A Generic DRAM Data Mapping Policy for Energy-Efficient Processing of Convolutional Neural Networks
    Putra, Rachmad Vidya Wicaksana
    Hanif, Muhammad Abdullah
    Shafique, Muhammad
    PROCEEDINGS OF THE 2020 57TH ACM/EDAC/IEEE DESIGN AUTOMATION CONFERENCE (DAC), 2020
  • [9] Beyond-local neural information processing in neuronal networks
    Balkenhol, Johannes
    Haendel, Barbara
    Biswas, Sounak
    Grohmann, Johannes
    von Kistowski, Joakim
    Prada, Juan
    Bosman, Conrado A.
    Ehrenreich, Hannelore
    Wojcik, Sonja M.
    Kounev, Samuel
    Blum, Robert
    Dandekar, Thomas
    COMPUTATIONAL AND STRUCTURAL BIOTECHNOLOGY JOURNAL, 2024, 23: 4288-4305
  • [10] Energy-Efficient Design of Processing Element for Convolutional Neural Network
    Choi, Yeongjae
    Bae, Dongmyung
    Sim, Jaehyeong
    Choi, Seungkyu
    Kim, Minhye
    Kim, Lee-Sup
    IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS II-EXPRESS BRIEFS, 2017, 64 (11): 1332-1336