Orthogonal Transforms in Neural Networks Amount to Effective Regularization

Cited by: 0
Authors
Zajac, Krzysztof [1 ]
Sopot, Wojciech [1 ]
Wachel, Pawel [1 ]
Affiliations
[1] Wroclaw Univ Sci & Technol, Fac Informat & Commun Technol, Wroclaw, Poland
Keywords
Neural Networks; Nonlinear Dynamics; Orthogonal Transform; System Identification; Fast Fourier Transform; Identification; User
DOI
10.1007/978-3-031-61857-4_33
Chinese Library Classification (CLC)
TP18 [Theory of Artificial Intelligence]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
We consider applications of neural networks to nonlinear system identification and hypothesize that adapting a general network structure by incorporating frequency information, or another known orthogonal transform, should yield an efficient neural network that retains its universal properties. We show that such a structure is a universal approximator and that using any orthogonal transform in the proposed way implies regularization during training, by adjusting the learning rate of each parameter individually. In particular, we show empirically that such a structure, using the Fourier transform, outperforms equivalent models without orthogonality support.
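The abstract describes networks in which a fixed, known orthogonal transform is inserted before a trainable layer. Below is a minimal NumPy sketch of that idea, not the paper's actual architecture: a fixed orthonormal DCT-II matrix (chosen here as a stand-in for the Fourier transform, to keep the arithmetic real-valued) is composed with a trainable linear map, and the orthogonality of the transform is checked explicitly. All names and shapes are illustrative assumptions.

```python
import numpy as np

def dct_matrix(n):
    """Orthonormal DCT-II matrix; rows are cosine basis vectors.

    A real-valued stand-in for an orthogonal frequency transform;
    illustrative only, not the construction used in the paper.
    """
    k = np.arange(n)[:, None]   # frequency index (rows)
    m = np.arange(n)[None, :]   # sample index (columns)
    q = np.sqrt(2.0 / n) * np.cos(np.pi * (m + 0.5) * k / n)
    q[0] /= np.sqrt(2.0)        # scale the DC row for orthonormality
    return q

n = 8
Q = dct_matrix(n)
# Orthogonality check: Q Q^T must be the identity.
assert np.allclose(Q @ Q.T, np.eye(n), atol=1e-10)

# "Transform-augmented" linear layer: y = W (Q x), with Q fixed
# and only W trainable.
rng = np.random.default_rng(0)
W = rng.normal(size=(4, n)) * 0.1
x = rng.normal(size=(n,))
y = W @ (Q @ x)

# Since Q is orthogonal and fixed, this layer computes the same
# function as an ordinary linear layer with effective weights W Q;
# training W amounts to learning in the transformed coordinates.
assert np.allclose(y, (W @ Q) @ x)
```

Because gradients with respect to `W` live in the transform domain, per-entry effects of an optimizer on `W` differ from those on the effective weights `W @ Q`, which is one way to read the abstract's claim that the transform acts like a per-parameter learning-rate adjustment.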
Pages: 337-348
Page count: 12
Related Papers
50 items in total
  • [31] The Coupling Effect of Lipschitz Regularization in Neural Networks
    Couellan N.
    SN Computer Science, 2021, 2 (2)
  • [32] Noisin: Unbiased Regularization for Recurrent Neural Networks
    Dieng, Adji B.
    Ranganath, Rajesh
    Altosaar, Jaan
    Blei, David M.
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 80, 2018, 80
  • [33] DropWeak: A novel regularization method of neural networks
    El Korchi, Anas
    Ghanou, Youssf
    PROCEEDINGS OF THE FIRST INTERNATIONAL CONFERENCE ON INTELLIGENT COMPUTING IN DATA SCIENCES (ICDS2017), 2018, 127 : 102 - 108
  • [34] Regularization of deep neural networks with spectral dropout
    Khan, Salman H.
    Hayat, Munawar
    Porikli, Fatih
    NEURAL NETWORKS, 2019, 110 : 82 - 90
  • [35] Sparse synthesis regularization with deep neural networks
    Obmann, Daniel
    Schwab, Johannes
    Haltmeier, Markus
    2019 13TH INTERNATIONAL CONFERENCE ON SAMPLING THEORY AND APPLICATIONS (SAMPTA), 2019,
  • [36] Regularization parameter estimation for feedforward neural networks
    Guo, P
    Lyu, MR
    Chen, CLP
    IEEE TRANSACTIONS ON SYSTEMS MAN AND CYBERNETICS PART B-CYBERNETICS, 2003, 33 (01): : 35 - 44
  • [37] Group sparse regularization for deep neural networks
    Scardapane, Simone
    Comminiello, Danilo
    Hussain, Amir
    Uncini, Aurelio
    NEUROCOMPUTING, 2017, 241 : 81 - 89
  • [38] Rethinking Graph Regularization for Graph Neural Networks
    Yang, Han
    Ma, Kaili
    Cheng, James
    THIRTY-FIFTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, THIRTY-THIRD CONFERENCE ON INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE AND THE ELEVENTH SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2021, 35 : 4573 - 4581
  • [39] LMix: regularization strategy for convolutional neural networks
    Yan, Linyu
    Zheng, Kunpeng
    Xia, Jinyao
    Li, Ke
    Ling, Hefei
    SIGNAL IMAGE AND VIDEO PROCESSING, 2023, 17 (04) : 1245 - 1253
  • [40] A Comparison of Regularization Techniques in Deep Neural Networks
    Nusrat, Ismoilov
    Jang, Sung-Bong
    SYMMETRY-BASEL, 2018, 10 (11):