Spectral Batch Normalization: Normalization in the Frequency Domain

Cited by: 0
Authors: Cakaj, Rinor [1,2]; Mehnert, Jens [1]; Yang, Bin [3]
Affiliations:
[1] Robert Bosch GmbH, Signal Processing, D-71229 Leonberg, Germany
[2] Univ Stuttgart, D-71229 Leonberg, Germany
[3] Univ Stuttgart, ISS, D-70550 Stuttgart, Germany
Source: 2023 International Joint Conference on Neural Networks (IJCNN), 2023
Keywords: NEURAL-NETWORKS
DOI: 10.1109/IJCNN54540.2023.10191931
CLC classification: TP18 [Artificial Intelligence Theory]
Discipline codes: 081104; 0812; 0835; 1405
Abstract
Regularization is a set of techniques used to improve the generalization ability of deep neural networks. In this paper, we introduce spectral batch normalization (SBN), a novel and effective method that improves generalization by normalizing feature maps in the frequency (spectral) domain. At initialization, the activations of residual networks without batch normalization (BN) tend to grow exponentially with network depth, leading to extremely large feature map norms even though the parameters are relatively small. These explosive dynamics can be very detrimental to learning. BN makes weight decay regularization on the scaling and shifting parameters γ and β approximately equivalent to an additive penalty on the norm of the feature maps, which limits extremely large feature map norms to a certain degree. It was previously shown that preventing explosive growth at the final layer of ResNets, both at initialization and during training, can recover a large part of BN's generalization boost. However, we show experimentally that, despite the approximate additive penalty of BN, feature maps in deep neural networks (DNNs) tend to explode at the beginning of training and contain large values throughout training. This phenomenon also occurs, in a weakened form, in non-residual networks. Intuitively, large values in feature maps are undesirable because they exert a disproportionate influence on the prediction compared with the rest of the feature map. SBN counteracts large feature map values by normalizing the feature maps in the frequency domain. In our experiments, we show empirically that SBN prevents exploding feature maps at initialization and large feature map values during training. Moreover, normalizing feature maps in the frequency domain leads to more uniformly distributed frequency components, which discourages DNNs from relying on single frequency components of feature maps. These effects, together with others (e.g., noise injection and the scaling and shifting of the feature maps), regularize the training of residual and non-residual networks. We show experimentally that using SBN in addition to standard regularization methods improves the performance of DNNs by a notable margin, e.g., ResNet50 on CIFAR-100 by 2.31%, on ImageNet by 0.71% (from 76.80% to 77.51%), and VGG19 on CIFAR-100 by 0.66%.
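To make the mechanism concrete, here is a minimal PyTorch sketch of normalizing feature maps in the frequency domain. This is an illustrative reading of the abstract, not the authors' reference implementation: it assumes each frequency component of a channel's 2-D DFT is normalized with its own batch statistics (training mode only; running statistics are omitted), followed by a learnable per-channel scale and shift before transforming back. The module name SpectralBatchNorm2d and all implementation details are hypothetical.

    import torch
    import torch.nn as nn

    class SpectralBatchNorm2d(nn.Module):
        """Hypothetical sketch of spectral batch normalization (SBN):
        normalize feature maps per frequency component over the batch."""

        def __init__(self, num_features, eps=1e-5):
            super().__init__()
            self.eps = eps
            # learnable per-channel scale (gamma) and shift (beta),
            # applied to the normalized frequency components
            self.gamma = nn.Parameter(torch.ones(num_features, 1, 1))
            self.beta = nn.Parameter(torch.zeros(num_features, 1, 1))

        def forward(self, x):  # x: (N, C, H, W) real-valued feature maps
            f = torch.fft.rfft2(x, norm="ortho")  # complex spectrum, (N, C, H, W//2+1)
            # batch statistics for every (channel, frequency) position
            mean = f.mean(dim=0, keepdim=True)
            var = (f - mean).abs().pow(2).mean(dim=0, keepdim=True)
            f_hat = (f - mean) / torch.sqrt(var + self.eps)  # normalize
            f_hat = self.gamma * f_hat + self.beta           # scale and shift
            # back to the spatial domain; output has the same shape as x
            return torch.fft.irfft2(f_hat, s=x.shape[-2:], norm="ortho")

    # Usage: apply to a batch of feature maps
    x = torch.randn(8, 16, 32, 32)
    y = SpectralBatchNorm2d(16)(x)  # y.shape == (8, 16, 32, 32)

Per the abstract, SBN is used in addition to standard regularization methods (including BN), so such a layer would complement, not replace, existing normalization.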
Pages: 10
Related papers (50 total)
  • [31] Lemmerling, P.; Vanhamme, L.; Romano, R.; Van Huffel, S. A subspace time-domain algorithm for automated NMR spectral normalization. Journal of Magnetic Resonance, 2002, 157(02): 190-199.
  • [32] Li, Ruilin; Hu, Minghui; Gao, Ruobin; Wang, Lipo; Suganthan, P. N.; Sourina, Olga. TFormer: A time-frequency Transformer with batch normalization for driver fatigue recognition. Advanced Engineering Informatics, 2024, 62.
  • [33] Huang, Zhiyong; Sheng, Kekai; Li, Ke; Liang, Jian; Yao, Taiping; Dong, Weiming; Zhou, Dengwen; Sun, Xing. Reciprocal normalization for domain adaptation. Pattern Recognition, 2023, 140.
  • [34] Lee, Sangrok; Bae, Jongseong; Kim, Ha Young. Decompose, Adjust, Compose: Effective Normalization by Playing with Frequency for Domain Generalization. 2023 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 2023: 11776-11785.
  • [35] Fang, Jinwei; Zhou, Hui; Li, Yunyue Elita; Shi, Ying. Time-Domain Elastic Full Waveform Inversion With Frequency Normalization. IEEE Transactions on Geoscience and Remote Sensing, 2023, 61.
  • [36] Gupta, Sharut; Chang, Ken; Qu, Liangqiong; Rana, Aakanksha; Ahmed, Syed Rakin; Aggarwal, Mehak; Arun, Nishanth; Vaswani, Ashwin; Raghavan, Shruti; Agarwal, Vibha; Gidwani, Mishka; Hoebel, Katharina; Patel, Jay; Lu, Charles; Bridge, Christopher P.; Rubin, Daniel L.; Kalpathy-Cramer, Jayashree; Singh, Praveer. Addressing Catastrophic Forgetting by Modulating Global Batch Normalization Statistics for Medical Domain Expansion. Artificial Intelligence in Pancreatic Disease Detection and Diagnosis, and Personalized Incremental Learning in Medicine (AIPAD 2024, PILM 2024), 2025, 15197: 57-72.
  • [37] Liang, Senwei; Huang, Zhongzhan; Liang, Mingfu; Yang, Haizhao. Instance Enhancement Batch Normalization: An Adaptive Regulator of Batch Noise. Thirty-Fourth AAAI Conference on Artificial Intelligence (AAAI), 2020, 34: 4819-4827.
  • [38] Robledo-Arnuncio, Enrique; Sawada, Hiroshi; Makino, Shoji. Frequency domain blind source separation of a reduced amount of data using frequency normalization. 2006 IEEE International Conference on Acoustics, Speech and Signal Processing, Vols 1-13, 2006: 5695-5698.
  • [39] Made, A.; Neubert, S. Normalization of frequency distributions of temperature. Zeitschrift für Meteorologie, 1975, 25(01): 17-20.
  • [40] Huang, Lei; Zhou, Yi; Wang, Tian; Luo, Jie; Liu, Xianglong. Delving into the Estimation Shift of Batch Normalization in a Network. 2022 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 2022: 753-762.