Learning to Act through Evolution of Neural Diversity in Random Neural Networks

Cited by: 2
Authors
Pedersen, Joachim Winther [1]
Risi, Sebastian [1]
Affiliations
[1] IT Univ Copenhagen, Copenhagen, Denmark
Keywords
DYNAMICS; PLASTICITY;
DOI
10.1145/3583131.3590460
Chinese Library Classification (CLC)
TP18 [Artificial intelligence theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
Biological nervous systems consist of networks of diverse, sophisticated information processors in the form of neurons of different classes. In most artificial neural networks (ANNs), neural computation is abstracted to an activation function that is usually shared between all neurons within a layer or even the whole network; training of ANNs focuses on synaptic optimization. In this paper, we propose optimizing neuro-centric parameters to attain a set of diverse neurons that can perform complex computations. Demonstrating the promise of the approach, we show that evolving neural parameters alone allows agents to solve various reinforcement learning tasks without optimizing any synaptic weights. While our approach does not aim to be an accurate biological model, parameterizing neurons to a greater degree than is currently common practice allows us to ask questions about the computational abilities afforded by neural diversity in random neural networks. The presented results open up interesting future research directions, such as combining evolved neural diversity with activity-dependent plasticity.
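To make the mechanism described in the abstract concrete, the following is a minimal sketch (not the authors' code) of evolving neuro-centric parameters over a network whose synaptic weights are sampled once at random and never trained. The per-neuron parameterization used here (gain, bias, and a tanh/identity blend), the stand-in regression fitness, and the simple (1+lambda)-style evolution strategy are illustrative assumptions; the paper scores genomes by episodic return on reinforcement learning tasks, and its exact neuron model and optimizer may differ.

# Minimal sketch: frozen random weights, only per-neuron parameters are evolved.
# Neuron parameterization, fitness task, and optimizer are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

LAYER_SIZES = [4, 32, 2]                        # toy observation -> hidden -> action sizes
WEIGHTS = [rng.normal(0.0, 1.0 / np.sqrt(a), size=(a, b))
           for a, b in zip(LAYER_SIZES[:-1], LAYER_SIZES[1:])]   # sampled once, never updated

N_NEURONS = sum(LAYER_SIZES[1:])                # every non-input neuron has its own parameters
PARAMS_PER_NEURON = 3                           # gain, bias, tanh/identity blend
GENOME_SIZE = N_NEURONS * PARAMS_PER_NEURON

def forward(obs, genome):
    """Propagate obs through the random-weight network; only the genome changes behaviour."""
    p = genome.reshape(N_NEURONS, PARAMS_PER_NEURON)
    x, idx = obs, 0
    for w in WEIGHTS:
        n = w.shape[1]
        gain = p[idx:idx + n, 0]
        bias = p[idx:idx + n, 1]
        blend = 1.0 / (1.0 + np.exp(-p[idx:idx + n, 2]))   # squash blend into [0, 1]
        pre = gain * (x @ w) + bias
        x = blend * np.tanh(pre) + (1.0 - blend) * pre     # per-neuron mix of saturating/linear response
        idx += n
    return x

# Stand-in task: match a fixed random target mapping. In the paper this would be
# the episodic return of an agent acting in a reinforcement learning environment.
EVAL_X = rng.normal(size=(128, LAYER_SIZES[0]))
EVAL_Y = np.tanh(EVAL_X @ rng.normal(size=(LAYER_SIZES[0], LAYER_SIZES[-1])))

def fitness(genome):
    return -float(np.mean((forward(EVAL_X, genome) - EVAL_Y) ** 2))   # higher is better

# Plain (1+lambda)-style evolution of the neuron parameters; weights stay frozen throughout.
parent = rng.normal(0.0, 0.1, size=GENOME_SIZE)
best = fitness(parent)
for gen in range(200):
    offspring = parent + 0.05 * rng.normal(size=(16, GENOME_SIZE))
    scores = np.array([fitness(child) for child in offspring])
    if scores.max() > best:
        best, parent = float(scores.max()), offspring[scores.argmax()]
    if gen % 50 == 0:
        print(f"gen {gen:3d}  best fitness {best:.4f}")

The point the sketch preserves is that the genome contains only per-neuron parameters (here 3 per neuron), while the synaptic weight matrices are drawn once and left untouched, so any improvement in fitness comes from neural diversity alone.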
Pages: 1248-1256
Page count: 9