Tracking Object's Pose via Dynamic Tactile Interaction

Cited by: 0
Authors
Lin, Qiguang [1 ]
Yan, Chaojie [2 ]
Li, Qiang [3 ]
Ling, Yonggen [4 ]
Lee, Wangwei [4 ]
Zheng, Yu [4 ]
Wan, Zhaoliang [5 ]
Huang, Bidan [4 ]
Liu, Xiaofeng [1 ]
Affiliations
[1] Hohai Univ, Coll IoT Engn, Jiangsu Key Lab Special Robot Technol, Changzhou 213022, Jiangsu, Peoples R China
[2] Zhejiang Univ, Inst Cyber Syst & Control, State Key Lab Ind Control & Technol, Hangzhou, Peoples R China
[3] Shenzhen Technol Univ, Coll Big Data & Internet, Shenzhen 518118, Peoples R China
[4] Tencent Robot X, Shenzhen, Peoples R China
[5] Sun Yat Sen Univ, Sch Comp Sci & Engn, Guangzhou, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Tactile perception; robotic grasping; extended Kalman filter;
DOI
10.1142/S0219843623500214
CLC classification
TP24 [Robotics];
Subject classification codes
080202; 1405;
Abstract
Localizing and tracking an in-hand object is a challenging task in the robotics domain. Researchers have mainly used vision as the primary modality for extracting an object's pose, but vision-based approaches are fragile when the object is occluded by the robotic arm and hand. To this end, we propose a tactile-based DTI-Tracker (tracking an object's pose via Dynamic Tactile Interaction) and formulate object tracking as a filtering problem. An Extended Kalman Filter (EKF) estimates the in-hand object pose by exploiting high-spatial-resolution tactile feedback. Given an initial estimation error, the proposed approach rapidly converges to the real pose, and the statistical evaluation shows its robustness. We evaluate the method in physics simulation and on a real multi-fingered grasping setup, with both static and movable objects. The proposed method is a potential tool to foster future research on dexterous manipulation with multi-fingered robotic hands.
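
To make the EKF formulation concrete, below is a minimal sketch of the kind of filter the abstract describes, under simplifying assumptions that are not taken from the paper: a planar pose state [px, py, theta], a quasi-static (constant-pose) motion model, and a measurement consisting of one tactile contact point expressed in the world frame. All names (predict, update, h, H_jac, p_obj) are illustrative, not the authors' implementation.

# Minimal planar EKF sketch for tactile pose tracking (illustrative assumptions).
import numpy as np

def predict(x, P, Q):
    """Quasi-static motion model: the grasped object is assumed nearly still,
    so prediction keeps the pose and only inflates covariance by process noise Q."""
    return x, P + Q

def h(x, p_obj):
    """Measurement model: map a known contact point p_obj (object frame)
    into the world frame using the current pose estimate x = [px, py, theta]."""
    px, py, th = x
    R = np.array([[np.cos(th), -np.sin(th)],
                  [np.sin(th),  np.cos(th)]])
    return np.array([px, py]) + R @ p_obj

def H_jac(x, p_obj):
    """Jacobian of the measurement model with respect to the pose state."""
    _, _, th = x
    ox, oy = p_obj
    dR = np.array([[-np.sin(th) * ox - np.cos(th) * oy],
                   [ np.cos(th) * ox - np.sin(th) * oy]])
    return np.hstack([np.eye(2), dR])        # 2x3 Jacobian

def update(x, P, z, p_obj, Rn):
    """Standard EKF correction with one tactile contact measurement z (world frame)."""
    H = H_jac(x, p_obj)
    y = z - h(x, p_obj)                      # innovation
    S = H @ P @ H.T + Rn                     # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)           # Kalman gain
    x_new = x + K @ y
    P_new = (np.eye(3) - K @ H) @ P
    return x_new, P_new

In such a scheme, each sensed contact would be folded in by one call to update with p_obj taken from the known object model; higher-resolution tactile arrays simply contribute more contact measurements per step.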
Pages: 22