Orthogonal Transforms For Learning Invariant Representations In Equivariant Neural Networks

Cited by: 1
Authors: Singh, Jaspreet [1]; Singh, Chandan [1]; Rana, Ankur [1]
Affiliation: [1] Punjabi Univ, Patiala, Punjab, India
DOI: 10.1109/WACV56688.2023.00157
Chinese Library Classification: TP18 [Artificial Intelligence Theory]
Subject Classification Codes: 081104; 0812; 0835; 1405
Abstract
The convolutional layers of standard convolutional neural networks (CNNs) are equivariant to translation. Recently, a new class of CNNs has been introduced that is equivariant to other affine geometric transformations, such as rotation and reflection, either by replacing the standard convolutional layer with a group convolutional layer or by using steerable filters in the convolutional layer. We propose to embed a 2D positional encoding that is invariant to rotation, reflection and translation, computed using orthogonal polar harmonic transforms (PHTs), before flattening the feature maps for the fully-connected or classification layer in the equivariant CNN architecture. We select PHTs from among several invariant transforms because they are highly efficient in both performance and speed. The proposed 2D positional encoding scheme, inserted between the convolutional and fully-connected layers of equivariant networks, is shown to provide a significant improvement in performance on the rotated MNIST, CIFAR-10 and CIFAR-100 datasets.
Pages: 1523-1530 (8 pages)
Related Papers (50 total)
  • [31] A NEW ALGORITHM FOR LEARNING REPRESENTATIONS IN BOOLEAN NEURAL NETWORKS
    BISWAS, NN
    KUMAR, R
    CURRENT SCIENCE, 1990, 59 (12): : 595 - 600
  • [32] Rep the Set: Neural Networks for Learning Set Representations
    Skianis, Konstantinos
    Nikolentzos, Giannis
    Limnios, Stratis
    Vazirgiannis, Michalis
    INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 108, 2020, 108
  • [33] Learning Transferable Feature Representations Using Neural Networks
    Bhatt, Himanshu S.
    Roy, Shourya
    Rajkumar, Arun
    Ramakrishnan, Sriranjani
    57TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2019), 2019, : 4124 - 4134
  • [34] Residual Recurrent Neural Networks for Learning Sequential Representations
    Yue, Boxuan
    Fu, Junwei
    Liang, Jun
    INFORMATION, 2018, 9 (03)
  • [35] Learning Reliable Neural Networks with Distributed Architecture Representations
    Li, Yinqiao
    Cao, Runzhe
    He, Qiaozhi
    Xiao, Tong
    Zhu, Jingbo
    ACM TRANSACTIONS ON ASIAN AND LOW-RESOURCE LANGUAGE INFORMATION PROCESSING, 2023, 22 (04)
  • [36] Learning Word Representations with Deep Neural Networks for Turkish
    Dundar, Enes Burak
    Alpaydin, Ethem
    2019 27TH SIGNAL PROCESSING AND COMMUNICATIONS APPLICATIONS CONFERENCE (SIU), 2019,
  • [37] Learning representations in Bayesian Confidence Propagation neural networks
    Ravichandran, Naresh Balaji
    Lansner, Anders
    Herman, Pawel
    2020 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2020,
  • [38] Unsupervised learning of invariant representations
    Anselmi, Fabio
    Leibo, Joel Z.
    Rosasco, Lorenzo
    Mutch, Jim
    Tacchetti, Andrea
    Poggio, Tomaso
    THEORETICAL COMPUTER SCIENCE, 2016, 633 : 112 - 121
  • [39] Clifford Group Equivariant Neural Networks
    Ruhe, David
    Brandstetter, Johannes
    Forre, Patrick
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 36 (NEURIPS 2023), 2023,
  • [40] Equivariant neural networks for inverse problems
    Celledoni, Elena
    Ehrhardt, Matthias J.
    Etmann, Christian
    Owren, Brynjulf
    Schonlieb, Carola-Bibiane
    Sherry, Ferdia
    INVERSE PROBLEMS, 2021, 37 (08)