Deep Neural Networks with Efficient Guaranteed Invariances

Cited by: 0
Authors
Rath, Matthias [1 ,2 ]
Condurache, Alexandru Paul [1 ,2 ]
Affiliations
[1] Robert Bosch GmbH, Cross-Domain Computing Solutions, Stuttgart, Germany
[2] University of Lübeck, Institute for Signal Processing, Lübeck, Germany
Source
International Conference on Artificial Intelligence and Statistics (AISTATS), 2023, Vol. 206
DOI
Not available
Chinese Library Classification
TP18 [Theory of Artificial Intelligence]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
We address the problem of improving the performance and, in particular, the sample complexity of deep neural networks by enforcing and guaranteeing invariances to symmetry transformations rather than learning them from data. Group-equivariant convolutions are a popular approach to obtain equivariant representations. The desired corresponding invariance is then imposed using pooling operations. For rotations, it has been shown that using invariant integration instead of pooling further improves the sample complexity. In this contribution, we first expand invariant integration beyond rotations to flips and scale transformations. We then address the problem of incorporating multiple desired invariances into a single network. For this purpose, we propose a multi-stream architecture, where each stream is invariant to a different transformation such that the network can simultaneously benefit from multiple invariances. We demonstrate our approach with successful experiments on Scaled-MNIST, SVHN, CIFAR-10, and STL-10.
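To make the guaranteed-invariance idea concrete, below is a minimal PyTorch sketch of a multi-stream network in which each stream is rendered exactly invariant to one finite transformation group by averaging backbone features over the group orbit. This is the simple group-average special case of invariant integration, not the monomial-based formulation used in the paper; the groups (90-degree rotations and horizontal flips), layer sizes, and input shape are illustrative assumptions.

```python
# A minimal, illustrative sketch (not the authors' implementation): each
# stream guarantees invariance to one finite group by averaging backbone
# features over the group orbit -- the group-average special case of
# invariant integration. Group choices, layer sizes, and input shape are
# assumptions made for this example.
import torch
import torch.nn as nn


class OrbitInvariantStream(nn.Module):
    """Apply a shared backbone to every group-transformed copy of the input
    and average the features; the result is exactly invariant to the chosen
    finite group (up to floating-point error)."""

    def __init__(self, group_transforms):
        super().__init__()
        self.group_transforms = group_transforms  # callables forming a finite group
        self.backbone = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )

    def forward(self, x):
        feats = [self.backbone(t(x)) for t in self.group_transforms]
        return torch.stack(feats).mean(dim=0)  # integrate over the group


# Finite symmetry groups used purely for illustration.
rotations = [lambda x, k=k: torch.rot90(x, k, dims=(-2, -1)) for k in range(4)]
flips = [lambda x: x, lambda x: torch.flip(x, dims=(-1,))]


class MultiStreamNet(nn.Module):
    """Concatenate streams carrying different guaranteed invariances, then
    classify, so the network benefits from several invariances at once."""

    def __init__(self, num_classes=10):
        super().__init__()
        self.streams = nn.ModuleList(
            [OrbitInvariantStream(rotations), OrbitInvariantStream(flips)]
        )
        self.head = nn.Linear(16 * len(self.streams), num_classes)

    def forward(self, x):
        return self.head(torch.cat([s(x) for s in self.streams], dim=-1))


if __name__ == "__main__":
    stream = OrbitInvariantStream(rotations)
    x = torch.randn(2, 1, 28, 28)
    # Rotating the input by 90 degrees leaves the stream's output unchanged.
    print(torch.allclose(stream(x), stream(torch.rot90(x, 1, dims=(-2, -1))), atol=1e-5))
    print(MultiStreamNet()(x).shape)  # torch.Size([2, 10])
```

Because each stream averages over its entire group, its invariance holds by construction rather than being learned from augmented data; the final head then combines streams with different invariances, mirroring the multi-stream design described in the abstract.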
Pages: 21