The next-best-view for workpiece localization in robot workspace

Cited by: 2
Authors
Hu, Jie [1]
Pagilla, Prabhakar R. [1]
Darbha, Swaroop [1]
Affiliations
[1] Texas A&M Univ, Dept Mech Engn, College Stn, TX 77843 USA
Keywords
workpiece localization; robotics; manufacturing; next-best-view
DOI
10.1109/AIM46487.2021.9517657
CLC Classification
TP [Automation Technology, Computer Technology]
Discipline Code
0812
Abstract
Workpiece localization is the process of obtaining the location of a workpiece in a reference frame of a robotic workspace. The location (position and orientation) is represented by the transformation between a local frame associated with the workpiece and the specified reference frame in the workspace. In this work, we study the workpiece localization problem without the two commonly adopted restrictive assumptions: that the data used to calculate the transformation are readily available, and that the correspondence between the data sets used in the calculation is known. The goal is to automate the localization process, from efficient data collection through determining the workpiece location in the workspace. We describe a strategy that includes the following aspects: predicting the correspondence between the measured data and the workpiece CAD model data; generating representative vectors that aid in determining the next-best-view for collecting new information about the workpiece location; evaluating a search region to find the next sensor location that satisfies both the robot kinematic constraints and the sensor field-of-view constraints while giving the maximum view gain; and calculating the rigid body transformation from the local frame to the world frame to localize the workpiece. Numerical simulation and experimental results for the proposed strategy are presented and discussed.
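The abstract outlines the pipeline but does not reproduce the underlying computations; the paper's correspondence prediction and next-best-view gain are its contributions and are not sketched here. As a rough illustration of only the final step, the Python sketch below fits a rigid body transformation between already-corresponded point sets using the standard SVD-based least-squares solution (the Kabsch/Umeyama method), which is the textbook way to solve this subproblem. All names (fit_rigid_transform, cad, scan) are hypothetical and not from the paper.

    # Illustrative sketch only, not the authors' method: SVD-based rigid-body
    # fit between corresponded model points p_i and measured points q_i.
    import numpy as np

    def fit_rigid_transform(model_pts: np.ndarray, measured_pts: np.ndarray):
        """Return R (3x3) and t (3,) minimizing sum_i ||R p_i + t - q_i||^2
        over corresponding rows p_i of model_pts and q_i of measured_pts."""
        mu_p = model_pts.mean(axis=0)
        mu_q = measured_pts.mean(axis=0)
        H = (model_pts - mu_p).T @ (measured_pts - mu_q)  # 3x3 cross-covariance
        U, _, Vt = np.linalg.svd(H)
        # Guard against an improper rotation (reflection) when det < 0.
        D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
        R = Vt.T @ D @ U.T
        t = mu_q - R @ mu_p
        return R, t

    # Demo: recover a known pose from noisy, pre-corresponded measurements.
    rng = np.random.default_rng(0)
    cad = rng.uniform(-1.0, 1.0, size=(200, 3))        # CAD points, local frame
    R_true, _ = np.linalg.qr(rng.normal(size=(3, 3)))  # random orthonormal matrix
    R_true *= np.sign(np.linalg.det(R_true))           # force det(R_true) = +1
    t_true = np.array([0.5, -0.2, 0.8])
    scan = cad @ R_true.T + t_true + 1e-3 * rng.normal(size=cad.shape)

    R_est, t_est = fit_rigid_transform(cad, scan)
    print(np.allclose(R_est, R_true, atol=1e-2))       # True
    print(np.round(t_est - t_true, 4))                 # ~[0, 0, 0]

In an actual pipeline the correspondences would come from a prediction step such as the one the paper describes (or from an iterative nearest-neighbor scheme like ICP), rather than being known in advance as in this demo.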
Pages: 1201-1206
Page count: 6