The next-best-view for workpiece localization in robot workspace

Cited by: 2
Authors
Hu, Jie [1 ]
Pagilla, Prabhakar R. [1 ]
Darbha, Swaroop [1 ]
Affiliations
[1] Texas A&M Univ, Dept Mech Engn, College Stn, TX 77843 USA
Keywords
workpiece localization; robotics; manufacturing; next-best-view
DOI
10.1109/AIM46487.2021.9517657
CLC Classification
TP [Automation Technology, Computer Technology]
Discipline Code
0812
Abstract
Workpiece localization is the process of obtaining the location of a workpiece in a reference frame of a robotic workspace. The location (position and orientation) is represented by the transformation between a local frame associated with the workpiece and the specified reference frame in the workspace. In this work, we study the workpiece localization problem without two commonly adopted restrictive assumptions: that the data used to calculate the transformation is readily available and that the correspondence between the data sets used for calculation is known. The goal is to automate the localization process, from efficient data collection through determining the workpiece location in the workspace. We describe a strategy that includes the following aspects: predicting the correspondence between the measured data and the workpiece CAD model data; generating representative vectors that aid in determining the next-best-view for collecting new information about the workpiece location; evaluating a search region to find the next sensor location that satisfies both robot kinematic and sensor field-of-view constraints while maximizing view gain; and calculating the rigid body transformation from the local frame to the world frame to localize the workpiece. Numerical simulation and experimental results for the proposed strategy are presented and discussed.
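Two steps of the strategy, correspondence prediction and rigid-body transformation estimation, follow the classic point-set registration pattern. The paper's own formulation is not reproduced here; the following is a minimal illustrative sketch in Python, assuming nearest-neighbor correspondence prediction and a Kabsch/SVD least-squares fit. All names (localize_workpiece, best_rigid_transform, measured, model) are hypothetical.

```python
# Illustrative ICP-style registration sketch (not the authors' implementation):
# alternately predict measured-to-CAD correspondences and refit the rigid
# transform that maps the measured points onto the CAD model points.
import numpy as np
from scipy.spatial import cKDTree

def best_rigid_transform(P, Q):
    """Least-squares R, t with R @ P[i] + t ~= Q[i] (Kabsch/SVD method)."""
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cp).T @ (Q - cq)                  # 3x3 cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T                         # proper rotation (det = +1)
    return R, cq - R @ cp

def localize_workpiece(measured, model, iters=50, tol=1e-8):
    """Predict correspondences by nearest neighbor, then refit the transform."""
    tree = cKDTree(model)                      # CAD model points, shape (M, 3)
    R, t = np.eye(3), np.zeros(3)
    prev_err = np.inf
    for _ in range(iters):
        moved = measured @ R.T + t             # apply current transform estimate
        dist, idx = tree.query(moved)          # predicted correspondences
        R, t = best_rigid_transform(measured, model[idx])
        err = np.mean(dist ** 2)
        if abs(prev_err - err) < tol:          # converged: error stopped improving
            break
        prev_err = err
    return R, t
```

The next-best-view search itself (representative vectors, and view-gain maximization under kinematic and field-of-view constraints) is specific to the paper and is not sketched here.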
Pages: 1201-1206
Page count: 6
Related Papers
50 records in total
  • [1] Next-Best-View Selection for Robot Eye-in-Hand Calibration
    Yang, Jun
    Rebello, Jason
    Waslander, Steven L.
    2023 20TH CONFERENCE ON ROBOTS AND VISION, CRV, 2023, : 161 - 168
  • [2] A Next-Best-View Algorithm for Autonomous 3D Object Modeling by a Humanoid Robot
    Foissotte, T.
    Stasse, O.
    Escande, A.
    Kheddar, A.
    2008 8TH IEEE-RAS INTERNATIONAL CONFERENCE ON HUMANOID ROBOTS (HUMANOIDS 2008), 2008, : 515+
  • [3] A Double Branch Next-Best-View Network and Novel Robot System for Active Object Reconstruction
    Han, Yiheng
    Zhan, Irvin Haozhe
    Zhao, Wang
    Liu, Yong-Jin
    2022 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION, ICRA 2022, 2022, : 7306 - 7312
  • [4] Online Next-Best-View Planner for 3D-Exploration and Inspection With a Mobile Manipulator Robot
    Naazare, Menaka
    Rosas, Francisco Garcia
    Schulz, Dirk
    IEEE ROBOTICS AND AUTOMATION LETTERS, 2022, 7 (02) : 3779 - 3786
  • [5] Next-Best-View selection from observation viewpoint statistics
    Aravecchia, Stephanie
    Richard, Antoine
    Clausel, Marianne
    Pradalier, Cedric
    2023 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS (IROS), 2023, : 10505 - 10510
  • [6] A next-best-view method for automatic Modeling of three dimensional objects
    He, Bingwei
    Li, Y. F.
    DYNAMICS OF CONTINUOUS DISCRETE AND IMPULSIVE SYSTEMS-SERIES B-APPLICATIONS & ALGORITHMS, 2006, 13E : 104 - 109
  • [7] GPU-Accelerated Next-Best-View Coverage of Articulated Scenes
    Osswald, Stefan
    Bennewitz, Maren
    2018 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS (IROS), 2018, : 8315 - 8322
  • [8] Multi-View Picking: Next-best-view Reaching for Improved Grasping in Clutter
    Morrison, Douglas
    Corke, Peter
    Leitner, Jurgen
    2019 INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION (ICRA), 2019, : 8762 - 8768
  • [9] Hierarchical Ray Tracing For Fast Volumetric Next-Best-View Planning
    Irving Vasquez-Gomez, J.
    Enrique Sucar, L.
    Murrieta-Cid, Rafael
    2013 INTERNATIONAL CONFERENCE ON COMPUTER AND ROBOT VISION (CRV), 2013, : 181 - 187
  • [10] NEXT-BEST-VIEW METHOD BASED ON CONSECUTIVE EVALUATION OF TOPOLOGICAL RELATIONS
    Dierenbach, K. O.
    Weinmann, M.
    Jutzi, B.
    XXIII ISPRS CONGRESS, COMMISSION III, 2016, 41 (B3): : 11 - 19