A framework for the fusion of visual and tactile modalities for improving robot perception

Cited by: 10
Authors
Zhang, Wenchang [1 ,2 ]
Sun, Fuchun [1 ]
Wu, Hang [2 ]
Yang, Haolin [1 ]
Affiliations
[1] Tsinghua Univ, State Key Lab Intelligent Technol & Syst, Beijing 100084, Peoples R China
[2] Inst Med Equipment, Tianjin 300161, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
multi-modal fusion; robot perception; vision; tactile; classification; SPARSE REPRESENTATION; CLASSIFICATION;
DOI
10.1007/s11432-016-0158-2
Chinese Library Classification
TP [Automation technology; computer technology];
Discipline code
0812 ;
Abstract
Robots should ideally perceive objects through human-like multi-modal sensing such as vision, touch, smell, and hearing. However, feature representations differ across modal sensors, and so do the feature extraction methods for each modality. Some modal features, such as vision, capture a spatial property and are static, while others, such as tactile feedback, capture a temporal pattern and are dynamic. Fusing such data at the feature level for robot perception is therefore difficult. In this study, we propose a framework for the fusion of visual and tactile modal features that comprises feature extraction; feature vector normalization and generation based on a bag-of-systems (BoS) model; and coding by robust multi-modal joint sparse representation (RM-JSR) followed by classification, thereby enabling robot perception to solve the problem of fusing diverse modal data at the feature level. Finally, comparative experiments demonstrate the performance of this framework.
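The pipeline the abstract outlines (per-modality feature extraction, normalization, BoS encoding, joint sparse coding, classification) can be sketched in broad strokes. The snippet below is an illustrative stand-in, not the authors' method: it L2-normalizes and concatenates two pre-extracted per-modality feature vectors, then classifies with a plain sparse-representation classifier solved by ISTA in place of RM-JSR. All function names and parameters here are hypothetical.

```python
import numpy as np

def l2_normalize(x):
    # Scale a feature vector to unit L2 norm (no-op for the zero vector)
    n = np.linalg.norm(x)
    return x / n if n > 0 else x

def fuse(visual_feat, tactile_feat):
    # Feature-level fusion: normalize each modality's (BoS-style) vector,
    # then concatenate into one joint descriptor
    return np.concatenate([l2_normalize(visual_feat), l2_normalize(tactile_feat)])

def ista_sparse_code(D, y, lam=0.1, n_iter=200):
    # Solve min_a 0.5*||y - D a||^2 + lam*||a||_1 by ISTA
    # (a simple proximal-gradient stand-in for the paper's RM-JSR coding)
    L = np.linalg.norm(D, 2) ** 2          # Lipschitz constant of the gradient
    a = np.zeros(D.shape[1])
    for _ in range(n_iter):
        g = a + D.T @ (y - D @ a) / L      # gradient step
        a = np.sign(g) * np.maximum(np.abs(g) - lam / L, 0.0)  # soft threshold
    return a

def src_classify(D, labels, y):
    # Sparse-representation classification: code y over dictionary D of
    # training descriptors, then pick the class whose atoms alone
    # reconstruct y with the lowest residual
    a = ista_sparse_code(D, y)
    best, best_res = None, np.inf
    for c in set(labels):
        mask = np.array([l == c for l in labels])
        a_c = np.where(mask, a, 0.0)
        res = np.linalg.norm(y - D @ a_c)
        if res < best_res:
            best, best_res = c, res
    return best
```

As a usage sketch, the columns of `D` would be fused training descriptors produced by `fuse`, with `labels` giving each column's object class; a query descriptor is classified by `src_classify(D, labels, fuse(v, t))`.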
Pages: 12
Related Papers
50 records
  • [11] Sensorimotor synchronization with visual, auditory, and tactile modalities
    Whitton, Simon Andrew
    Jiang, Fang
    PSYCHOLOGICAL RESEARCH-PSYCHOLOGISCHE FORSCHUNG, 2023, 87 (07): : 2204 - 2217
  • [13] Grasp State Assessment of Deformable Objects Using Visual-Tactile Fusion Perception
    Cui, Shaowei
    Wang, Rui
    Wei, Junhang
    Li, Fanrong
    Wang, Shuo
    2020 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION (ICRA), 2020, : 538 - 544
  • [14] Visual tracking modalities for a companion robot
    Menezes, Paulo
    Lerasle, Frederic
    Dias, Jorge
    2006 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS, VOLS 1-12, 2006, : 5363 - +
  • [15] Visual effects on tactile texture perception
    Roberts, Roberta D.
    Li, Min
    Allen, Harriet A.
    SCIENTIFIC REPORTS, 2024, 14 (01)
  • [17] Tactile and visual contributions to the perception of naturalness
    Overvliet, K. E.
    Soto-Faraco, S.
    PERCEPTION, 2008, 37 : 133 - 133
  • [18] Effects of Fusion between Tactile and Proprioceptive Inputs on Tactile Perception
    Warren, Jay P.
    Santello, Marco
    Tillery, Stephen I. Helms
    PLOS ONE, 2011, 6 (03):
  • [19] Improving Visual Perception of a Social Robot for Controlled and In-the-wild Human-robot Interaction
    Zhong, Wangjie
    Tian, Leimin
    Le, Duy Tho
Rezatofighi, Hamid
    COMPANION OF THE 2024 ACM/IEEE INTERNATIONAL CONFERENCE ON HUMAN-ROBOT INTERACTION, HRI 2024 COMPANION, 2024, : 1199 - 1203
  • [20] Elastic Tactile Simulation Towards Tactile-Visual Perception
    Wang, Yikai
    Huang, Wenbing
    Fang, Bin
    Sun, Fuchun
    Li, Chang
    PROCEEDINGS OF THE 29TH ACM INTERNATIONAL CONFERENCE ON MULTIMEDIA, MM 2021, 2021, : 2690 - 2698