CasOmniMVS: Cascade Omnidirectional Depth Estimation with Dynamic Spherical Sweeping

Cited by: 0
Authors
Wang, Pinzhi [1 ]
Li, Ming [1 ]
Cao, Jinghao [1 ]
Du, Sidan [1 ]
Li, Yang [1 ]
Affiliations
[1] Nanjing Univ, Sch Elect Sci & Engn, Nanjing 210046, Peoples R China
Source
APPLIED SCIENCES-BASEL | 2024, Vol. 14, Issue 02
Keywords
omnidirectional depth estimation; cascade architecture; dynamic spherical sweeping;
DOI
10.3390/app14020517
CLC Classification
O6 [Chemistry];
Subject Classification Code
0703;
Abstract
Estimating 360° depth from multiple cameras is a challenging problem. Existing methods often adopt a fixed-step spherical sweeping approach with densely sampled spheres and rely on numerous 3D convolutions in their networks, which limits their speed in practice. Moreover, obtaining high-precision depth maps of real scenes remains a challenge for existing algorithms. In this paper, we design a cascade architecture with a dynamic spherical sweeping method that progressively refines the depth estimation from coarse to fine over multiple stages. The proposed method adaptively adjusts the sweeping intervals and ranges based on the depth and uncertainty predicted at the previous stage, resulting in more efficient cost aggregation. The experimental results demonstrate that our method achieves state-of-the-art accuracy with reduced GPU memory usage and time consumption compared to other methods. Furthermore, our method achieves satisfactory performance on real-world data despite being trained only on synthetic data, indicating its generalization potential and practical applicability.
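To make the dynamic sweeping idea in the abstract concrete, the following Python/PyTorch sketch illustrates how a cascade stage could place its spherical sweep hypotheses per pixel around the previous stage's prediction, widening or narrowing the range with the predicted uncertainty. This is not the authors' code: the function name next_stage_hypotheses, the inverse-depth parameterization, and the scale factor k_sigma are illustrative assumptions.

# Minimal sketch (assumed names and parameterization, not the paper's implementation)
import torch

def next_stage_hypotheses(prev_inv_depth: torch.Tensor,
                          prev_uncertainty: torch.Tensor,
                          num_spheres: int,
                          k_sigma: float = 3.0,
                          min_inv_depth: float = 1e-3,
                          max_inv_depth: float = 1.0) -> torch.Tensor:
    """Build per-pixel inverse-depth sweep hypotheses for the next cascade stage.

    prev_inv_depth:   (B, H, W) inverse depth regressed at the coarser stage.
    prev_uncertainty: (B, H, W) per-pixel uncertainty (e.g., std of the coarse
                      probability volume along the sweep dimension).
    Returns:          (B, num_spheres, H, W) hypotheses, densest where the
                      previous stage was confident, wider where it was not.
    """
    # Dynamic range: +/- k_sigma uncertainties around the coarse estimate.
    low = (prev_inv_depth - k_sigma * prev_uncertainty).clamp(min_inv_depth, max_inv_depth)
    high = (prev_inv_depth + k_sigma * prev_uncertainty).clamp(min_inv_depth, max_inv_depth)

    # Uniformly place num_spheres hypotheses inside each per-pixel interval.
    steps = torch.linspace(0.0, 1.0, num_spheres, device=prev_inv_depth.device)
    steps = steps.view(1, num_spheres, 1, 1)
    return low.unsqueeze(1) + (high - low).unsqueeze(1) * steps

In a fixed-step baseline, the same dense, global set of sweep spheres is used for every pixel at every stage; in a cascade of this kind, the interval shrinks wherever the coarse stage is confident, so fewer hypotheses (and hence less 3D-convolution work on the cost volume) are needed at fine resolutions.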
Pages: 14
Related Papers
50 records in total
  • [31] Spherical View Synthesis for Self-Supervised 360° Depth Estimation
    Zioulis, Nikolaos
    Karakottas, Antonis
    Zarpalas, Dimitrios
    Alvarez, Federico
    Daras, Petros
    2019 INTERNATIONAL CONFERENCE ON 3D VISION (3DV 2019), 2019, : 690 - 699
  • [32] PATCH-SWEEPING WITH ROBUST PRIOR FOR HIGH PRECISION DEPTH ESTIMATION IN REAL-TIME SYSTEMS
    Waizenegger, Wolfgang
    Atzpadin, Nicole
    Schreer, Oliver
    Feldmann, Ingo
    2011 18TH IEEE INTERNATIONAL CONFERENCE ON IMAGE PROCESSING (ICIP), 2011, : 881 - 884
  • [33] Dynamic Fusion Network for Light Field Depth Estimation
    Zhang, Yukun
    Piao, Yongri
    Ji, Xinxin
    Zhang, Miao
    PATTERN RECOGNITION AND COMPUTER VISION, PRCV 2021, PT II, 2021, 13020 : 3 - 15
  • [34] Dense Monocular Depth Estimation in Complex Dynamic Scenes
    Ranftl, Rene
    Vineetl, Vibhav
    Chen, Qifeng
    Koltun, Vladlen
    2016 IEEE CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR), 2016, : 4058 - 4066
  • [35] Temporally Consistent Online Depth Estimation in Dynamic Scenes
    Li, Zhaoshuo
    Ye, Wei
    Wang, Dilin
    Creighton, Francis X.
    Taylor, Russell H.
    Venkatesh, Ganesh
    Unberath, Mathias
    2023 IEEE/CVF WINTER CONFERENCE ON APPLICATIONS OF COMPUTER VISION (WACV), 2023, : 3017 - 3026
  • [36] Visual attention-aware quality estimation framework for omnidirectional video using spherical Voronoi diagram
    Croci, Simone
    Ozcinar, Cagri
    Zerman, Emin
    Knorr, Sebastian
    Cabrera, Julián
    Smolic, Aljosa
    QUALITY AND USER EXPERIENCE, 2020, 5 (1)
  • [37] Depth Estimation with Cascade Occlusion Culling Filter for Light-field Cameras
    Zhou, Wenhui
    Lumsdaine, Andrew
    Lin, Lili
    2016 23RD INTERNATIONAL CONFERENCE ON PATTERN RECOGNITION (ICPR), 2016, : 1887 - 1892
  • [38] Monocular 360 Depth Estimation via Spherical Fully-Connected CRFs
    Cao, Zidong
    Wang, Lin
    IEEE ROBOTICS AND AUTOMATION LETTERS, 2025, 10 (02): 1409 - 1416
  • [39] Unsupervised monocular depth estimation with omnidirectional camera for 3D reconstruction of grape berries in the wild
    Tamura, Yasuto
    Utsumi, Yuzuko
    Miwa, Yuka
    Iwamura, Masakazu
    Kise, Koichi
    PLOS ONE, 2025, 20 (02):
  • [40] Upright and Stabilized Omnidirectional Depth Estimation for Wide-baseline Multi-camera Inertial Systems
    Won, Changhee
    Seok, Hochang
    Lim, Jongwoo
    2020 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION WORKSHOPS (CVPRW 2020), 2020, : 2689 - 2692