Capsule Networks With Residual Pose Routing

Cited by: 29
Authors
Liu, Yi [1 ,2 ,3 ,4 ]
Cheng, De [5 ]
Zhang, Dingwen [6 ]
Xu, Shoukun [1 ,2 ,3 ]
Han, Jungong [7 ]
Affiliations
[1] Changzhou Univ, Sch Comp Sci & Artificial Intelligence, Changzhou 213164, Jiangsu, Peoples R China
[2] Changzhou Univ, Aliyun Sch Big Data, Changzhou 213164, Jiangsu, Peoples R China
[3] Changzhou Univ, Sch Software, Changzhou 213164, Jiangsu, Peoples R China
[4] Changzhou Univ, CNPC CZU Innovat Alliance, Changzhou 213164, Jiangsu, Peoples R China
[5] Xidian Univ, Sch Telecommun Engn, Xian 710071, Shaanxi, Peoples R China
[6] Northwestern Polytech Univ, Sch Automat, Xian 710129, Shaanxi, Peoples R China
[7] Univ Sheffield, Dept Comp Sci, Sheffield S10 2TN, S Yorkshire, England
Funding
National Natural Science Foundation of China;
Keywords
3-D point cloud; capsule network (CapsNet); part-whole; residual routing; salient object detection;
DOI
10.1109/TNNLS.2023.3347722
Chinese Library Classification (CLC)
TP18 [Theory of Artificial Intelligence];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Capsule networks (CapsNets) are known to be difficult to scale to deeper architectures, which are desirable for high performance in the deep learning era, because of their complex capsule routing algorithms. In this article, we present a simple yet effective capsule routing algorithm, termed residual pose routing. Specifically, the higher-layer capsule pose is obtained by an identity mapping of the adjacent lower-layer capsule pose. This simple residual pose routing has two advantages: 1) it reduces the routing computation complexity and 2) it avoids gradient vanishing owing to its residual learning framework. On top of that, we explicitly reformulate the capsule layers by building a residual pose block. Stacking multiple such blocks yields a deep residual CapsNet (ResCaps) with a ResNet-like architecture. Results on MNIST, AffNIST, SmallNORB, and CIFAR-10/100 show the effectiveness of ResCaps for image classification. Furthermore, we successfully extend our residual pose routing to large-scale real-world applications, including 3-D object reconstruction and classification, and 2-D saliency dense prediction. The source code has been released at https://github.com/liuyi1989/ResCaps.
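To make the residual pose routing idea concrete, the following is a minimal PyTorch sketch of a residual pose block built only from the abstract's description (higher-layer pose = identity mapping of the lower-layer pose plus a learned refinement). The class name ResidualPoseBlock, the tensor shapes, and the two-layer residual transform are illustrative assumptions, not the released implementation at the GitHub link above.

import torch
import torch.nn as nn

class ResidualPoseBlock(nn.Module):
    """Illustrative residual pose block (an assumption, not the authors' code).

    Following the abstract, the higher-layer capsule pose is an identity
    mapping of the adjacent lower-layer pose plus a learned residual
    refinement: pose_out = pose_in + F(pose_in). The skip connection removes
    the need for iterative routing and keeps gradients flowing in deep stacks.
    """

    def __init__(self, pose_dim: int):
        super().__init__()
        # F(.): a small per-capsule transform that produces the residual pose.
        self.residual_transform = nn.Sequential(
            nn.Linear(pose_dim, pose_dim),
            nn.ReLU(inplace=True),
            nn.Linear(pose_dim, pose_dim),
        )

    def forward(self, pose: torch.Tensor) -> torch.Tensor:
        # pose: (batch, num_capsules, pose_dim)
        return pose + self.residual_transform(pose)

if __name__ == "__main__":
    block = ResidualPoseBlock(pose_dim=16)
    lower_pose = torch.randn(4, 32, 16)   # batch of 4, 32 capsules, 16-D pose
    higher_pose = block(lower_pose)
    print(higher_pose.shape)              # torch.Size([4, 32, 16])

Stacking several such blocks would give the ResNet-like deep CapsNet described in the abstract; the actual ResCaps implementation may differ in how capsule activations and pose matrices or vectors are handled.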
Pages: 2648-2661
Number of pages: 14
Related Papers
50 records in total
  • [1] Residual Vector Capsule: Improving Capsule by Pose Attention
    Xie, Ning
    Wan, Xiaoxia
    IEEE ACCESS, 2021, 9 : 129626 - 129634
  • [2] Capsule Networks with Routing Annealing
    Renzulli, Riccardo
    Tartaglione, Enzo
    Fiandrotti, Attilio
    Grangetto, Marco
    ARTIFICIAL NEURAL NETWORKS AND MACHINE LEARNING - ICANN 2021, PT I, 2021, 12891 : 529 - 540
  • [3] Self-Routing Capsule Networks
    Hahn, Taeyoung
    Pyeon, Myeongjang
    Kim, Gunhee
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 32 (NIPS 2019), 2019, 32
  • [4] Simplified Routing Mechanism for Capsule Networks
    Hollosi, Janos
    Ballagi, Aron
    Pozna, Claudiu Radu
    ALGORITHMS, 2023, 16 (07)
  • [5] Introducing Routing Uncertainty in Capsule Networks
    Ribeiro, Fabio De Sousa
    Leontidis, Georgios
    Kollias, Stefanos
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 33, NEURIPS 2020, 2020, 33
  • [6] Capsule Networks Need an Improved Routing Algorithm
    Paik, Inyoung
    Kwak, Taeyeong
    Kim, Injung
    ASIAN CONFERENCE ON MACHINE LEARNING, VOL 101, 2019, 101 : 489 - 502
  • [7] Capsule Graph Neural Networks with EM Routing
    Lei, Yu
    Zhang, Jing
    PROCEEDINGS OF THE 30TH ACM INTERNATIONAL CONFERENCE ON INFORMATION & KNOWLEDGE MANAGEMENT, CIKM 2021, 2021, : 3191 - 3195
  • [8] A novel capsule network based on deep routing and residual learning
    Zhang, Jian
    Xu, Qinghai
    Guo, Lili
    Ding, Ling
    Ding, Shifei
    SOFT COMPUTING, 2023, 27 (12) : 7895 - 7906
  • [9] Training Deep Capsule Networks with Residual Connections
    Gugglberger, Josef
    Peer, David
    Rodriguez-Sanchez, Antonio
    ARTIFICIAL NEURAL NETWORKS AND MACHINE LEARNING - ICANN 2021, PT I, 2021, 12891 : 541 - 552