Mitigating RGB-D camera errors for robust ultrasonic inspections using a force-torque sensor

Cited by: 1
Authors
Tabatabaeipour, Morteza [1 ]
Jackson, William [2 ]
Gilmour, Adam [2 ]
Zhang, Dayi [2 ]
Poole, Alastair [2 ]
Tzaferis, Konstantinos [2 ]
Dobie, Gordon [2 ]
Gachagan, Anthony [2 ]
Affiliations
[1] Ulster Univ, Sch Engn, Belfast, Northern Ireland
[2] Univ Strathclyde, Ctr Ultrason Engn CUE, Glasgow, Scotland
Funding
Engineering and Physical Sciences Research Council (EPSRC), UK;
Keywords
Robotics in hazardous fields; RGB-D perception; force and tactile sensing; force control; phased array ultrasonic inspection; misalignment correction;
DOI
10.1080/10589759.2024.2386618
Chinese Library Classification (CLC) number
TB3 [Engineering Materials Science];
Discipline classification code
0805 ; 080502 ;
Abstract
Robot-based phased array ultrasonic testing is widely used for precise defect detection, particularly on complex geometries and a wide range of materials. Compact robots with miniature arms can inspect constrained areas, but their payload limitations restrict the choice of sensors. RGB-D cameras, owing to their small size and light weight, capture RGB colour and depth data and produce colourised 3D point clouds for scene representation. These point clouds are used to estimate surface normals so that the ultrasound transducer can be aligned on complex surfaces. However, relying solely on RGB-D cameras can introduce inaccuracies that affect the ultrasonic beam direction and the test results. This paper investigates the impact of transducer pose and RGB-D camera limitations on ultrasonic inspections and proposes a novel method that uses a force-torque sensor to mitigate errors arising from inaccurately estimated surface normals. The force-torque sensor, integrated into the robot end effector, provides tactile feedback to the controller, enabling joint-angle adjustments that correct errors in the estimated normal. Experimental results show that the ultrasound transducer was successfully applied to the surface using this method, even under significant misalignment. Adjustments took approximately 4 seconds to correct a deviation of 12.55 degrees, with an additional 4 seconds to ensure the probe was parallel to the surface, enhancing ultrasonic inspection accuracy in complex, constrained environments.
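As an illustration of the approach summarised in the abstract, the minimal Python/NumPy sketch below shows the two ingredients in simplified form: a surface normal estimated from an RGB-D point-cloud patch by plane fitting, and a proportional correction step that tilts the probe to null the lateral torques reported by the force-torque sensor once the probe is in contact. All names, gains, and thresholds (estimate_normal, torque_nulling_step, gain, deadband) are illustrative assumptions and do not reflect the authors' actual implementation.

    import numpy as np

    def estimate_normal(points):
        """Fit a plane to an (N x 3) RGB-D point-cloud patch and return its unit normal."""
        centred = points - points.mean(axis=0)
        # The right singular vector with the smallest singular value spans the plane normal.
        _, _, vt = np.linalg.svd(centred, full_matrices=False)
        normal = vt[-1]
        return normal / np.linalg.norm(normal)

    def misalignment_deg(tool_axis, surface_normal):
        """Angle in degrees between the probe axis and the estimated surface normal."""
        cos_angle = np.clip(abs(np.dot(tool_axis, surface_normal)), 0.0, 1.0)
        return np.degrees(np.arccos(cos_angle))

    def torque_nulling_step(torque_xy, gain=0.01, deadband=0.02):
        """One proportional correction step driven by the force-torque sensor.

        torque_xy : measured torques about the tool x and y axes (N*m) while in contact.
        Returns the commanded tilt increment (rad) about x and y, or zeros once the
        torque magnitude falls inside the deadband (probe taken as flush with the surface).
        """
        torque_xy = np.asarray(torque_xy, dtype=float)
        if np.linalg.norm(torque_xy) < deadband:
            return np.zeros(2)
        return -gain * torque_xy

    # Example: coarse camera-based alignment estimate, then force-torque refinement.
    patch = np.random.rand(200, 3) * np.array([0.05, 0.05, 0.002])  # synthetic near-planar patch
    n = estimate_normal(patch)
    print(misalignment_deg(np.array([0.0, 0.0, 1.0]), n))
    print(torque_nulling_step([0.15, -0.08]))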
Pages: 22
Related papers
50 records in total
  • [1] Autonomous industrial assembly using force, torque, and RGB-D sensing
    Watson, James
    Miller, Austin
    Correll, Nikolaus
    ADVANCED ROBOTICS, 2020, 34 (7-8) : 546 - 559
  • [2] Robust Upper Limb Kinematic Reconstruction Using a RGB-D Camera
    Gioi, Salvatore Maria Li
    Loianno, Giuseppe
    Cordella, Francesca
    IEEE ROBOTICS AND AUTOMATION LETTERS, 2024, 9 (04) : 3831 - 3837
  • [3] Robust RGB-D face recognition using Kinect sensor
    Li, Billy Y. L.
    Xue, Mingliang
    Mian, Ajmal
    Liu, Wanquan
    Krishna, Aneesh
    NEUROCOMPUTING, 2016, 214 : 93 - 108
  • [4] Navigation System for Visually Impaired People Based on RGB-D Camera and Ultrasonic Sensor
    Hakim, Heba
    Fadhil, Ali
    INTERNATIONAL CONFERENCE OF INFORMATION AND COMMUNICATION TECHNOLOGY (ICICT 2019), 2019, : 172 - 177
  • [5] Robust Object Tracking based on RGB-D Camera
    Qi, Wenjing
    Yang, Yinfei
    Yi, Meng
    Li, Yunfeng
    Pizlo, Zygmunt
    Latecki, Longin Jan
    2014 11TH WORLD CONGRESS ON INTELLIGENT CONTROL AND AUTOMATION (WCICA), 2014, : 2873 - 2878
  • [6] Robust Tracking and Mapping with a Handheld RGB-D Camera
    Lee, Kyoung-Rok
    Truong Nguyen
    2014 IEEE WINTER CONFERENCE ON APPLICATIONS OF COMPUTER VISION (WACV), 2014, : 1120 - 1127
  • [7] Robust 3D Reconstruction With an RGB-D Camera
    Wang, Kangkan
    Zhang, Guofeng
    Bao, Hujun
    IEEE TRANSACTIONS ON IMAGE PROCESSING, 2014, 23 (11) : 4893 - 4906
  • [8] A Cooking Support System with Force Visualization Using Force Sensors and an RGB-D Camera
    Totsu, Nobuhiro
    Sakaino, Sho
    Tsuji, Toshiaki
    HAPTIC INTERACTION: SCIENCE, ENGINEERING AND DESIGN, 2018, 432 : 297 - 299
  • [9] DIAGNOSTIC SYSTEM FOR ROBOT USING A FORCE-TORQUE SENSOR
    MITSUISHI, M
    ROBOTERSYSTEME, 1989, 5 (01) : 40 - 46
  • [10] Multiple Sensor Synchronization with the RealSense RGB-D Camera
    Yoon, Hyunse
    Jang, Mingyu
    Huh, Jungwoo
    Kang, Jiwoo
    Lee, Sanghoon
    SENSORS, 2021, 21 (18)