LDS-Inspired Residual Networks

Cited by: 13
Authors
Dimou, Anastasios [1 ,2 ]
Ataloglou, Dimitrios [2 ]
Dimitropoulos, Kosmas [2 ]
Alvarez, Federico [1 ]
Daras, Petros [2 ]
Affiliations
[1] Univ Politecn Madrid, Senales Sistemas & Radiocomunicac, Madrid 28040, Spain
[2] Ctr Res & Technol Hellas, Informat Technol Inst, Thessaloniki 57001, Greece
Funding
European Union Horizon 2020
Keywords
ResNet; linear dynamical systems; convolutional neural networks; image classification; object detection; video
DOI
10.1109/TCSVT.2018.2869680
CLC numbers
TM [Electrical Technology]; TN [Electronic Technology, Communication Technology]
Discipline codes
0808; 0809
Abstract
Residual networks (ResNets) marked a milestone for the deep learning community due to their outstanding performance in diverse applications. They enable efficient training of increasingly deep networks, reducing the training difficulty and error. The main intuition behind them is that, instead of mapping the input information directly, they map a residual part of it. Since the original work, many extensions have been proposed to improve information mapping. In this paper, a novel extension of the residual block is proposed, inspired by linear dynamical systems (LDSs) and called LDS-ResNet. Specifically, a new module is presented that improves the mapping of residual information by transforming it into a hidden state and then mapping it back to the desired feature space using convolutional layers. The proposed module is used to construct multi-branch residual blocks for convolutional neural networks. An exploration of possible architectural choices is presented and evaluated. Experimental results show that LDS-ResNet outperforms the original ResNet in image classification and object detection tasks on public datasets such as CIFAR-10/100, ImageNet, VOC, and MOT2017. Moreover, its performance boost is complementary to other extensions of the original network, such as pre-activation and bottleneck blocks, as well as stochastic training and Squeeze-and-Excitation.
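The abstract describes the module only at a high level, so the following is a minimal PyTorch sketch of what a residual block with an LDS-style hidden-state transform could look like. It is an illustrative assumption, not the paper's exact architecture: the class name LDSResidualBlock, the 1x1 "transition"/"observation" convolutions, the hidden-state width, and the three-way sum in forward() are all choices made here for concreteness.

# Minimal sketch of an LDS-inspired residual block, based only on the
# abstract: the residual signal is transformed into a hidden state and then
# mapped back to the feature space with convolutions. All layer widths,
# kernel sizes, and the two-branch layout are illustrative assumptions.
import torch
import torch.nn as nn


class LDSResidualBlock(nn.Module):
    def __init__(self, channels, hidden_channels=None):
        super().__init__()
        hidden_channels = hidden_channels or channels
        # Standard residual branch, as in the original ResNet block.
        self.residual = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1, bias=False),
            nn.BatchNorm2d(channels),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, 3, padding=1, bias=False),
            nn.BatchNorm2d(channels),
        )
        # LDS-inspired branch: a "transition" conv maps the residual into a
        # hidden state; an "observation" conv maps it back to feature space.
        self.to_hidden = nn.Sequential(
            nn.Conv2d(channels, hidden_channels, 1, bias=False),
            nn.BatchNorm2d(hidden_channels),
            nn.ReLU(inplace=True),
        )
        self.from_hidden = nn.Sequential(
            nn.Conv2d(hidden_channels, channels, 1, bias=False),
            nn.BatchNorm2d(channels),
        )
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x):
        res = self.residual(x)
        lds = self.from_hidden(self.to_hidden(res))
        # Identity shortcut plus both branches, in the multi-branch spirit
        # described in the abstract.
        return self.relu(x + res + lds)


if __name__ == "__main__":
    block = LDSResidualBlock(64)
    out = block(torch.randn(2, 64, 32, 32))
    print(out.shape)  # torch.Size([2, 64, 32, 32])

A drop-in replacement like this keeps the tensor shape unchanged, which is consistent with the abstract's claim that the module composes with other ResNet extensions (pre-activation, bottleneck, Squeeze-and-Excitation).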
Pages: 2363-2375
Number of pages: 13