Real-time 3D pose estimation of small ring-shaped bin-picking objects using deep learning and ICP algorithm

Citations: 2
Authors
Lee J. [1 ]
Lee M. [1 ]
Kang S.-S. [2 ]
Park S.-Y. [1 ]
Affiliations
[1] School of Electronics Engineering, Kyungpook National University
[2] Intelligent Robotics Research Division, Electronics and Telecommunications Research Institute
Keywords
Bin picking; Deep learning; Object detection; Pose estimation
DOI
10.5302/J.ICROS.2019.19.0109
Abstract
Bin picking is an important task in smart manufacturing and intelligent robotics. For a robot to pick or grip an object with a human-like gripping action, it needs to know the accurate 3D pose of the object. In this paper, we propose a method for estimating the 3D pose of a small ring-shaped object using infrared and depth images generated by a depth camera. The proposed method consists of two algorithm modules: the first recognizes an object in a 2D infrared image, and the second estimates the 3D pose by applying the ICP (iterative closest point) algorithm to 3D depth data. In the first module, we propose a method to generate a three-channel integrated image with features from the depth and infrared images. Next, we introduce a method for training an object detector based on deep learning. Because the bin-picking test object in this paper is small and ring-shaped, it is difficult to detect and find the 3D poses of individual objects when many such objects are piled up. We solve this problem with a depth-based filtering method. Using the filtered image, each object region is separated by the deep learning approach. In the second module, the ICP algorithm is employed to estimate the 3D pose of the ring object. We match a 3D reference model of the object to the real object using the point-to-point ICP algorithm. Performance of the proposed method is evaluated using two different types of depth cameras in the experiments. © ICROS 2019.
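The abstract's second module aligns a 3D reference model of the ring with the measured depth points using point-to-point ICP. Below is a minimal NumPy sketch of that general technique, not the authors' implementation: it uses a brute-force nearest-neighbour search and an SVD-based (Kabsch) rigid fit, with no outlier rejection or initialization strategy, all of which a real bin-picking system would need.

```python
import numpy as np

def best_fit_transform(src, dst):
    """Least-squares rigid transform (R, t) mapping src -> dst, via SVD (Kabsch)."""
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)          # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:               # guard against a reflection
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = cd - R @ cs
    return R, t

def icp(src, dst, iters=30, tol=1e-8):
    """Point-to-point ICP: alternate nearest-neighbour matching and rigid fitting.

    src: (N,3) model points, dst: (M,3) scene points.
    Returns accumulated (R, t) with dst ~ src @ R.T + t.
    """
    R_tot, t_tot = np.eye(3), np.zeros(3)
    cur = src.copy()
    prev_err = np.inf
    for _ in range(iters):
        # Brute-force nearest neighbours (a k-d tree would be used in practice)
        d = np.linalg.norm(cur[:, None, :] - dst[None, :, :], axis=2)
        nn = d.argmin(axis=1)
        err = d[np.arange(len(cur)), nn].mean()
        R, t = best_fit_transform(cur, dst[nn])
        cur = cur @ R.T + t
        R_tot, t_tot = R @ R_tot, R @ t_tot + t
        if abs(prev_err - err) < tol:      # stop when the mean NN error plateaus
            break
        prev_err = err
    return R_tot, t_tot
```

As in most ICP variants, convergence here depends on a reasonably close initial alignment; in the paper this role is played by the detection stage, which isolates each ring region before ICP refines its pose.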
Pages: 760-769 (9 pages)
Related papers (50 total)
  • [31] Using 3D Models for Real-Time Facial Feature Tracking, Pose Estimation, and Expression Monitoring
    Caunce, Angela
    Cootes, Tim
    COMPUTER VISION - ECCV 2012, PT III, 2012, 7585 : 651 - 654
  • [32] 3D HUMAN POSE ESTIMATION USING STOCHASTIC OPTIMIZATION IN REAL TIME
    Handrich, Sebastian
    Waxweiler, Philipp
    Werner, Philipp
    Al-Hamadi, Ayoub
    2018 25TH IEEE INTERNATIONAL CONFERENCE ON IMAGE PROCESSING (ICIP), 2018, : 555 - 559
  • [33] Real-time head tracking and 3D pose estimation from range data
    Malassiotis, S
    Strintzis, MG
    2003 INTERNATIONAL CONFERENCE ON IMAGE PROCESSING, VOL 2, PROCEEDINGS, 2003, : 859 - 862
  • [34] Real-time upper body detection and 3D pose estimation in monoscopic images
    Micilotta, Antonio S.
    Ong, Eng-Jon
    Bowden, Richard
    COMPUTER VISION - ECCV 2006, PT 3, PROCEEDINGS, 2006, 3953 : 139 - 150
  • [35] Real-time pose estimation and motion tracking for motion performance using deep learning models
    Liu, Long
    Dai, Yuxin
    Liu, Zhihao
    JOURNAL OF INTELLIGENT SYSTEMS, 2024, 33 (01)
  • [36] FLK: A filter with learned kinematics for real-time 3D human pose estimation
    Martini, Enrico
    Boldo, Michele
    Bombieri, Nicola
    SIGNAL PROCESSING, 2024, 224
  • [37] Evaluation of mouse behavioral responses to nutritive versus nonnutritive sugar using a deep learning-based 3D real-time pose estimation system
    Kim, Jineun
    Kim, Dae-gun
    Jung, Wongyo
    Suh, Greg S. B.
    JOURNAL OF NEUROGENETICS, 2023, 37 (1-2) : 78 - 83
  • [38] Real-time 3D human pose estimation without skeletal a priori structures
    Bai, Guihu
    Luo, Yanmin
    Pan, Xueliang
    Wang, Jia
    Guo, Jing-Ming
    IMAGE AND VISION COMPUTING, 2023, 132
  • [39] Faster VoxelPose: Real-time 3D Human Pose Estimation by Orthographic Projection
    Ye, Hang
    Zhu, Wentao
    Wang, Chunyu
    Wu, Rujie
    Wang, Yizhou
    COMPUTER VISION - ECCV 2022, PT VI, 2022, 13666 : 142 - 159
  • [40] Achieving Hard Real-Time Capability for 3D Human Pose Estimation Systems
    Schlosser, Patrick
    Ledermann, Christoph
    2021 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION (ICRA 2021), 2021, : 3772 - 3778