A review of monocular visual odometry

Cited by: 67
Authors
He, Ming [1]
Zhu, Chaozheng [1]
Huang, Qian [2,3]
Ren, Baosen [4]
Liu, Jintao [1]
Affiliations
[1] Army Engn Univ PLA, Coll Command & Control Engn, Nanjing, Peoples R China
[2] Hohai Univ, Coll Comp & Informat, Nanjing, Peoples R China
[3] Jilin Univ, Key Lab Symbol Computat & Knowledge Engn, Minist Educ, Changchun, Peoples R China
[4] State Grid Shandong Elect Power Maintenance Co, Linyi, Shandong, Peoples R China
Source
VISUAL COMPUTER | 2020, Vol. 36, Issue 05
Funding
National Key Research and Development Program of China;
Keywords
Visual odometry; Multi-sensor data fusion; Machine learning; Visual SLAM; INERTIAL ODOMETRY; SLAM; NAVIGATION; VERSATILE; ROBUST;
DOI
10.1007/s00371-019-01714-6
Chinese Library Classification (CLC)
TP31 [Computer Software];
Subject Classification Codes
081202; 0835;
Abstract
Monocular visual odometry provides more robust navigation and obstacle-avoidance capabilities for mobile robots than other visual odometry approaches, such as binocular visual odometry, RGB-D visual odometry and basic odometry. This paper describes the problem of visual odometry and clarifies the relationship between visual odometry and visual simultaneous localization and mapping (SLAM). The basic principle of visual odometry is expressed mathematically: the pose change between successive frames is solved incrementally, and the resulting trajectory is further refined through global optimization. After analyzing the three main ways of implementing visual odometry, the state-of-the-art monocular visual odometry systems, including ORB-SLAM2, DSO and SVO, are analyzed and compared in detail. The issues of robustness and real-time operation, which are of general interest in current visual odometry research, are discussed from the perspective of future development directions and trends. Furthermore, we present a novel framework for implementing next-generation visual odometry based on additional high-dimensional features, which has not yet been realized in the relevant applications.
Pages: 1053 - 1065
Number of pages: 13
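
The abstract summarizes the incremental principle of monocular visual odometry: the relative pose between consecutive frames is recovered from image correspondences, chained into a trajectory, and later refined by global optimization. The following is a minimal, illustrative sketch of such a feature-based front end, not the method of the reviewed paper; it assumes OpenCV, a known 3x3 intrinsic matrix K, ORB features and an essential-matrix decomposition, and it ignores scale recovery, keyframing and the optimization back end.

import cv2
import numpy as np

def relative_pose(img_prev, img_curr, K):
    # Estimate relative camera motion (R, t) between two frames from ORB
    # feature matches; the translation is only known up to scale, which is
    # the fundamental ambiguity of monocular visual odometry.
    orb = cv2.ORB_create(2000)
    kp1, des1 = orb.detectAndCompute(img_prev, None)
    kp2, des2 = orb.detectAndCompute(img_curr, None)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(des1, des2)
    pts1 = np.float32([kp1[m.queryIdx].pt for m in matches])
    pts2 = np.float32([kp2[m.trainIdx].pt for m in matches])
    # Essential matrix with RANSAC, then decomposition into R and t.
    E, mask = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC, threshold=1.0)
    _, R, t, _ = cv2.recoverPose(E, pts1, pts2, K, mask=mask)
    return R, t

def accumulate_trajectory(frames, K):
    # Chain the frame-to-frame poses incrementally: T_k = T_{k-1} * T_{k-1,k}.
    # A complete system would add keyframe selection, scale handling and a
    # global optimization (e.g. bundle adjustment) over this raw chain.
    T = np.eye(4)
    trajectory = [T.copy()]
    for prev, curr in zip(frames[:-1], frames[1:]):
        R, t = relative_pose(prev, curr, K)
        T_rel = np.eye(4)
        T_rel[:3, :3] = R
        T_rel[:3, 3] = t.ravel()
        T = T @ np.linalg.inv(T_rel)  # convert the point transform into camera motion
        trajectory.append(T.copy())
    return trajectory

Here frames would be a list of grayscale images. Because monocular translation is recovered only up to scale, a practical system must fix the scale from an external cue or a map, which is one of the robustness issues the review discusses.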
Related Papers
50 records in total
  • [31] Monocular Visual Odometry Using Fisheye Lens Cameras
    Aguiar, Andre
    Santos, Filipe
    Santos, Luis
    Sousa, Armando
    PROGRESS IN ARTIFICIAL INTELLIGENCE, PT II, 2019, 11805 : 319 - 330
  • [32] Inertial Monocular Visual Odometry Based on RUPF Algorithm
    Hou, Juanrou
    Wang, Zhanqing
    Zhang, Yanshun
    PROCEEDINGS OF THE 38TH CHINESE CONTROL CONFERENCE (CCC), 2019, : 3885 - 3891
  • [33] Improving Monocular Visual Odometry Using Learned Depth
    Sun, Libo
    Yin, Wei
    Xie, Enze
    Li, Zhengrong
    Sun, Changming
    Shen, Chunhua
    IEEE TRANSACTIONS ON ROBOTICS, 2022, 38 (05) : 3173 - 3186
  • [34] Monocular Visual Odometry based on Inverse Perspective Mapping
    Cao Yu
    Feng Ying
    Yang Yun-tao
    Chen Yun-jin
    Lei Bing
    Zhao Li-shuang
    INTERNATIONAL SYMPOSIUM ON PHOTOELECTRONIC DETECTION AND IMAGING 2011: ADVANCES IN IMAGING DETECTORS AND APPLICATIONS, 2011, 8194
  • [35] MONOCULAR VISUAL ODOMETRY FOR IN-PIPE INSPECTION ROBOT
    Kadir, Herdawatie Abdul
    Arshad, M. R.
    Aghdam, Hamed Habibi
    Zaman, Munir
    JURNAL TEKNOLOGI, 2015, 74 (09): 35 - 40
  • [36] Monocular Visual Odometry Initialization With Points and Line Segments
    Zhou, Hang
    Fan, Haiyan
    Peng, Keju
    Fan, Weihong
    Zhou, Dongxiang
    Liu, Yunhui
    IEEE ACCESS, 2019, 7 : 73120 - 73130
  • [37] Parallel, Real-Time Monocular Visual Odometry
    Song, Shiyu
    Chandraker, Manmohan
    Guest, Clark C.
    2013 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION (ICRA), 2013, : 4698 - 4705
  • [38] Fast and accurate visual odometry from a monocular camera
    Yang, Xin
    Xue, Tangli
    Luo, Hongcheng
    Guo, Jiabin
    FRONTIERS OF COMPUTER SCIENCE, 2019, 13 (06) : 1326 - 1336
  • [39] Monocular visual odometry based on improved Census transform
    Lin Z.-W.
    Li Q.-M.
    Wang X.-Y.
    Zhejiang Daxue Xuebao (Gongxue Ban)/Journal of Zhejiang University (Engineering Science), 2021, 55 (08): 1500 - 1509
  • [40] A monocular visual inertial odometry based on structural features
    Yan L.
    Li C.
    Xu B.
    Xia Y.
    Xiao B.
    Xi Tong Gong Cheng Yu Dian Zi Ji Shu/Systems Engineering and Electronics, 2023, 45 (10): 3207 - 3217