A Calibration Method for Mobile Omnidirectional Vision Based on Structured Light

Cited by: 7
Authors
Meng, Ling [1 ]
Li, Yuan [1 ]
Wang, Qing Lin [1 ]
Affiliations
[1] Beijing Inst Technol, State Key Lab Intelligent Control & Decis Complex, Sch Automat, Beijing 100081, Peoples R China
Keywords
Cameras; Calibration; Robot vision systems; Three-dimensional displays; Mirrors; Mobile omnidirectional vision; structured light vision; calibration; vanishing points; CAMERA CALIBRATION;
DOI
10.1109/JSEN.2020.3012178
CLC Number
TM [Electrical Engineering]; TN [Electronic Technology, Communication Technology];
Discipline Code
0808; 0809
Abstract
Mobile omnidirectional structured light vision is increasingly used in scene perception and robot navigation. The vision system acquires a wide range of scene information from a single image, and laser image features can be detected and extracted easily and quickly. In this paper, a novel calibration method for a mobile omnidirectional camera based on structured light is presented. First, a set of parallel laser planes is projected by the structured light onto the walls of a corridor as auxiliary targets, intersecting the walls orthogonally. Second, the constraint relationship between the vanishing points in the fisheye images and the intrinsic parameters of the imaging model is analyzed. Finally, the effects on the calibration results of the laser stripes' interval and of the angle between the laser-striped wall and the ground are evaluated. Compared with the Scaramuzza method, the proposed calibration method is superior in terms of both feasibility and efficiency. The method has the characteristic of self-calibration, since the planar target is replaced by actively projected laser stripes. The results show that the method is simple and feasible to operate while remaining effective and accurate, and that the calibration parameters are independent of the laser stripes' interval and of the angle between the wall and the ground. Therefore, the mobile omnidirectional structured light vision method presented in this paper can be applied in many areas.
Pages: 11451-11460
Number of pages: 10
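The abstract rests on a standard projective-geometry fact: parallel laser stripes on a wall image to lines that meet at a vanishing point, and vanishing points of orthogonal scene directions constrain the camera intrinsics. The sketch below is only a minimal illustration of that constraint under a plain pinhole model, not the paper's fisheye/omnidirectional imaging model or its actual algorithm; all numbers and function names are invented for the demo.

```python
import numpy as np

def vanishing_point(image_lines):
    """Least-squares intersection of homogeneous image lines (3-vectors).

    Parallel laser stripes on a wall project to image lines that meet in a
    single vanishing point; it is recovered as the null vector of the stack.
    """
    A = np.asarray(image_lines, dtype=float)
    _, _, vt = np.linalg.svd(A)
    v = vt[-1]
    return v / v[-1]  # normalise homogeneous coordinates

def orthogonality_residual(K, v1, v2):
    """Residual of v1^T K^-T K^-1 v2 = 0 for vanishing points of two
    orthogonal scene directions; a calibration routine would minimise
    such residuals over the intrinsic parameters in K."""
    K_inv = np.linalg.inv(K)
    omega = K_inv.T @ K_inv  # image of the absolute conic
    return float(v1 @ omega @ v2)

if __name__ == "__main__":
    # Synthetic demo with an assumed pinhole intrinsic matrix K.
    K = np.array([[800.0,   0.0, 320.0],
                  [  0.0, 800.0, 240.0],
                  [  0.0,   0.0,   1.0]])

    # Two orthogonal 3-D directions (e.g. along the stripes and along the
    # corridor); their vanishing points are K @ d in homogeneous coordinates.
    d1 = np.array([1.0, 0.0, 0.3])
    d2 = np.array([0.0, 1.0, -0.2])
    d2 -= d1 * (d1 @ d2) / (d1 @ d1)  # enforce exact orthogonality
    v1 = K @ d1; v1 /= v1[-1]
    v2 = K @ d2; v2 /= v2[-1]

    # Recover v1 from two image lines through it (a line through points p, q
    # is their cross product), then check the orthogonality constraint.
    l_a = np.cross(v1, np.array([100.0, 50.0, 1.0]))
    l_b = np.cross(v1, np.array([400.0, 300.0, 1.0]))
    v1_est = vanishing_point([l_a, l_b])
    print(orthogonality_residual(K, v1_est, v2))  # ~0 for the true K
```

In the paper's setting the analogous constraints are written for the fisheye/omnidirectional imaging model rather than the pinhole one, with the vanishing points supplied by the actively projected laser stripes instead of a planar target.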
Related Papers
50 records in total
  • [41] Line structured light calibration method based on circular targets
    Fu, Yanjun
    Deng, Xin
    Jiang, Guangyu
    Peng, Yuhui
    Liu, Kunpeng
    OPTICAL ENGINEERING, 2024, 63 (05) : 54107
  • [42] A calibration method for line-structured light using mirror-based virtual binocular vision system
    Yang, Pei
    Yang, Ziyi
    Zhang, Jin
    Xia, Haojie
    MEASUREMENT SCIENCE AND TECHNOLOGY, 2023, 34 (10)
  • [44] Hand-Eye Calibration Method of Line Structured Light Vision Sensor Robot Based on Planar Target
    Wu, Qinghua
    Qiu, Jiefeng
    Li, Zhiang
    Liu, Jiacheng
    Wang, Biao
    LASER & OPTOELECTRONICS PROGRESS, 2023, 60 (10)
  • [45] Improvement of calibration method for multi-camera line structured light vision system
    Yang, Hongmai
    Fang, Changshuai
    Zhang, Xiaodong
    APPLIED PHYSICS B-LASERS AND OPTICS, 2022, 128 (06):
  • [46] On-Site Global Calibration of Mobile Vision Measurement System Based on Virtual Omnidirectional Camera Model
    Chai, Binhu
    Wei, Zhenzhong
    REMOTE SENSING, 2021, 13 (10)
  • [47] Omnidirectional Vision Based Mobile Robot Hierarchical SLAM
    Li, Maohai
    Sun, Lining
    Pan, Mingqiang
    FUNCTIONAL MANUFACTURING TECHNOLOGIES AND CEEUSRO II, 2011, 464 : 95 - 98
  • [48] Omnidirectional Vision Based Mobile Robot Topological localization
    Li, Maohai
    Lin, Rui
    Wang, Zhenhua
    Hong, Yunbo
    2013 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND BIOMIMETICS (ROBIO), 2013, : 1905 - 1910
  • [49] Vision-Based Navigation of Omnidirectional Mobile Robots
    Ferro, Marco
    Paolillo, Antonio
    Cherubini, Andrea
    Vendittelli, Marilena
    IEEE ROBOTICS AND AUTOMATION LETTERS, 2019, 4 (03) : 2691 - 2698
  • [50] Calibration of the omnidirectional vision sensor: SYCLOP
    Cauchois, Cyril
    Brassart, Eric
    Drocourt, Cyril
    Vasseur, Pascal
    Proceedings - IEEE International Conference on Robotics and Automation, 1999, 2 : 1287 - 1292