Orthogonal Transforms For Learning Invariant Representations In Equivariant Neural Networks

Cited by: 1
Authors
Singh, Jaspreet [1 ]
Singh, Chandan [1 ]
Rana, Ankur [1 ]
Affiliations
[1] Punjabi Univ, Patiala, Punjab, India
DOI
10.1109/WACV56688.2023.00157
CLC number
TP18 [Artificial Intelligence Theory]
Discipline classification codes
081104; 0812; 0835; 1405
Abstract
The convolutional layers of standard convolutional neural networks (CNNs) are equivariant to translation. Recently, a new class of CNNs has been introduced that is equivariant to other affine geometric transformations, such as rotation and reflection, by replacing the standard convolutional layer with a group convolutional layer or by using steerable filters in the convolutional layer. We propose to embed a 2D positional encoding that is invariant to rotation, reflection, and translation, computed using orthogonal polar harmonic transforms (PHTs), before flattening the feature maps for the fully-connected (classification) layer in an equivariant CNN architecture. We select PHTs from among several invariant transforms because they are highly efficient in both performance and speed. The proposed 2D positional encoding scheme, inserted between the convolutional and fully-connected layers of equivariant networks, is shown to provide significant performance improvements on the rotated MNIST, CIFAR-10, and CIFAR-100 datasets.
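The kind of invariant encoding the abstract describes can be illustrated with a minimal sketch of Polar Complex Exponential Transform (PCET) moments, one member of the PHT family: the magnitudes of the moments are invariant to rotation of the input map about its centre. This is not the authors' implementation; the function name, moment orders, and disk normalization below are illustrative assumptions.

```python
import numpy as np

def pcet_invariants(img, max_n=3, max_l=3):
    """Rotation-invariant features from PCET moment magnitudes.

    A hedged sketch: pixels are mapped onto the unit disk centred on
    the image, projected onto the orthogonal PCET basis
    H_nl(r, theta) = exp(i*2*pi*n*r^2) * exp(i*l*theta),
    and the moment magnitudes |M_nl| are returned as invariants.
    """
    h, w = img.shape
    y, x = np.mgrid[0:h, 0:w]
    # map the pixel grid to [-1, 1]^2, centred on the image
    xc = (2 * x - w + 1) / w
    yc = (2 * y - h + 1) / h
    r = np.sqrt(xc**2 + yc**2)
    theta = np.arctan2(yc, xc)
    inside = r <= 1.0                      # keep only the unit disk
    feats = []
    for n in range(max_n + 1):
        for l in range(max_l + 1):
            basis = np.exp(1j * 2 * np.pi * n * r**2) * np.exp(1j * l * theta)
            # inner product of the image with the conjugate basis
            m = np.sum(img[inside] * np.conj(basis[inside])) / inside.sum()
            feats.append(np.abs(m))        # magnitude drops the phase,
                                           # which is what rotation changes
    return np.array(feats)
```

A rotation of the input shifts theta by a constant, which multiplies each moment by a unit-magnitude phase factor exp(-i*l*delta); taking magnitudes removes exactly that factor, so the feature vector is unchanged under rotation (e.g., `pcet_invariants(a)` and `pcet_invariants(np.rot90(a))` agree up to floating-point error).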
Pages: 1523-1530 (8 pages)