Orthogonal Transforms For Learning Invariant Representations In Equivariant Neural Networks

Cited by: 1
Authors
Singh, Jaspreet [1 ]
Singh, Chandan [1 ]
Rana, Ankur [1 ]
Affiliations
[1] Punjabi Univ, Patiala, Punjab, India
DOI
10.1109/WACV56688.2023.00157
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
The convolutional layers of standard convolutional neural networks (CNNs) are equivariant to translation. Recently, a new class of CNNs has been introduced that is equivariant to other affine geometric transformations, such as rotation and reflection, obtained by replacing the standard convolutional layer with a group convolutional layer or by using steerable filters in the convolutional layer. We propose to embed a 2D positional encoding that is invariant to rotation, reflection and translation, computed using orthogonal polar harmonic transforms (PHTs), before flattening the feature maps for the fully-connected (classification) layer of the equivariant CNN architecture. Among several invariant transforms, we select the PHTs because they are highly efficient in both performance and speed. The proposed 2D positional encoding scheme, placed between the convolutional and fully-connected layers of the equivariant networks, is shown to provide significant improvements in performance on the rotated MNIST, CIFAR-10 and CIFAR-100 datasets.
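The encoding described in the abstract can be pictured with a minimal PyTorch sketch, assuming the polar complex exponential transform (PCET, one member of the PHT family) and fusion of its moment magnitudes with globally pooled activations before the classification layer. The names (`pcet_basis`, `PHTInvariantHead`) and the concatenation-based fusion are illustrative assumptions for exposition, not the authors' published implementation.

```python
# Hypothetical sketch: rotation/reflection/translation-invariant positional
# features from PCET moment magnitudes, appended between the convolutional
# trunk and the classifier of an equivariant CNN. Assumed design, not the
# paper's exact architecture.
import math
import torch
import torch.nn as nn


def pcet_basis(size: int, max_order: int) -> torch.Tensor:
    """Complex PCET basis functions sampled on a size x size grid.

    Returns a tensor of shape (K, size, size) with K = (max_order + 1)^2,
    zeroed outside the unit disk.
    """
    ys, xs = torch.meshgrid(
        torch.linspace(-1.0, 1.0, size),
        torch.linspace(-1.0, 1.0, size),
        indexing="ij",
    )
    r = torch.sqrt(xs ** 2 + ys ** 2)
    theta = torch.atan2(ys, xs)
    mask = (r <= 1.0).to(torch.cfloat)          # restrict support to the unit disk
    basis = []
    for n in range(max_order + 1):
        radial = torch.exp(2j * math.pi * n * r.to(torch.cfloat) ** 2)  # PCET radial kernel
        for m in range(max_order + 1):
            angular = torch.exp(-1j * m * theta.to(torch.cfloat))       # angular harmonic
            basis.append(radial * angular * mask)
    return torch.stack(basis)                    # (K, H, W)


class PHTInvariantHead(nn.Module):
    """Classifier head that appends |PCET moment| features to pooled activations."""

    def __init__(self, channels: int, fmap_size: int, num_classes: int, max_order: int = 3):
        super().__init__()
        self.register_buffer("basis", pcet_basis(fmap_size, max_order))
        k = self.basis.shape[0]
        self.fc = nn.Linear(channels + channels * k, num_classes)

    def forward(self, fmap: torch.Tensor) -> torch.Tensor:
        # fmap: (B, C, H, W) feature maps from the (group-)convolutional trunk.
        moments = torch.einsum("bchw,khw->bck", fmap.to(torch.cfloat), self.basis)
        invariant = moments.abs().flatten(1)      # rotation/reflection-invariant magnitudes
        pooled = fmap.mean(dim=(2, 3))            # translation-robust global pooling
        return self.fc(torch.cat([pooled, invariant], dim=1))


if __name__ == "__main__":
    head = PHTInvariantHead(channels=16, fmap_size=8, num_classes=10)
    print(head(torch.randn(4, 16, 8, 8)).shape)   # torch.Size([4, 10])
```

The invariance argument is the standard one for PHT moments: rotating the input by an angle multiplies each moment by a unit-modulus phase depending only on the angular order, so taking magnitudes discards the orientation-dependent phase while the orthogonality of the basis keeps the features compact.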
Pages: 1523-1530
Number of pages: 8