A framework for the fusion of visual and tactile modalities for improving robot perception

Cited: 10
Authors
Zhang, Wenchang [1 ,2 ]
Sun, Fuchun [1 ]
Wu, Hang [2 ]
Yang, Haolin [1 ]
Affiliations
[1] Tsinghua Univ, State Key Lab Intelligent Technol & Syst, Beijing 100084, Peoples R China
[2] Inst Med Equipment, Tianjin 300161, Peoples R China
Funding
National Natural Science Foundation of China;
关键词
multi-modal fusion; robot perception; vision; tactile; classification; SPARSE REPRESENTATION; CLASSIFICATION;
DOI
10.1007/s11432-016-0158-2
CLC Classification
TP [automation technology, computer technology];
Discipline Code
0812;
Abstract
Robots should ideally perceive objects using human-like multi-modal sensing such as vision, touch, smell, and hearing. However, the feature representations differ across sensing modalities, as do the feature-extraction methods. Some modal features, such as vision, present a spatial property and are static, while others, such as tactile feedback, present a temporal pattern and are dynamic. This makes it difficult to fuse such data at the feature level for robot perception. In this study, we propose a framework for the fusion of visual and tactile modal features, comprising feature extraction, feature-vector normalization and generation based on a bag-of-systems (BoS) representation, and coding by robust multi-modal joint sparse representation (RM-JSR) followed by classification, thereby enabling robot perception to solve the problem of fusing diverse modal data at the feature level. Finally, comparative experiments demonstrate the performance of this framework.
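The joint sparse coding idea in the abstract can be illustrated with a minimal sketch. The snippet below is not the paper's RM-JSR algorithm; it is a simplified greedy variant (simultaneous orthogonal matching pursuit) in which visual and tactile feature vectors are coded over their own dictionaries but forced to share one atom support, which is the core coupling that joint sparse representation exploits. The dictionaries `D_v`, `D_t` and the sparsity level `k` are illustrative assumptions.

```python
import numpy as np

def joint_omp(D_v, D_t, x_v, x_t, k=3):
    """Greedy joint sparse coding: the visual signal x_v and tactile
    signal x_t are coded over dictionaries D_v and D_t while sharing a
    single atom support (a simplified stand-in for RM-JSR)."""
    support = []
    r_v, r_t = x_v.copy(), x_t.copy()
    for _ in range(k):
        # Pick the atom with the largest combined correlation
        # against both residuals -- this enforces the shared support.
        score = np.abs(D_v.T @ r_v) + np.abs(D_t.T @ r_t)
        score[support] = -np.inf  # never reselect an atom
        support.append(int(np.argmax(score)))
        # Least-squares refit on the current support, per modality.
        a_v, *_ = np.linalg.lstsq(D_v[:, support], x_v, rcond=None)
        a_t, *_ = np.linalg.lstsq(D_t[:, support], x_t, rcond=None)
        r_v = x_v - D_v[:, support] @ a_v
        r_t = x_t - D_t[:, support] @ a_t
    return support, a_v, a_t

# Toy example with orthonormal dictionaries: both modalities are
# mixtures of atoms 2 and 5, so those atoms should be recovered.
D = np.eye(8)
x_v = D[:, 2] + 0.5 * D[:, 5]
x_t = 0.8 * D[:, 2] + 0.3 * D[:, 5]
support, a_v, a_t = joint_omp(D, D, x_v, x_t, k=2)
print(sorted(support))  # [2, 5]
```

For classification, each class would hold its own pair of sub-dictionaries, and a test sample would be assigned to the class with the smallest combined reconstruction residual across both modalities.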
Pages: 12
Related Papers
50 records
  • [21] Visual Tactile Fusion Object Clustering
    Zhang, Tao
    Cong, Yang
    Sun, Gan
    Wang, Qianqian
    Ding, Zhenming
    THIRTY-FOURTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, THE THIRTY-SECOND INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE CONFERENCE AND THE TENTH AAAI SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2020, 34 : 10426 - 10433
  • [22] An Active Strategy for Safe Human-Robot Interaction Based on Visual-Tactile Perception
    Xu, Caiyue
    Zhou, Yanmin
    He, Bin
    Wang, Zhipeng
    Zhang, Changming
    Sang, Hongrui
    Liu, Hao
    IEEE SYSTEMS JOURNAL, 2023, 17 (04): : 5555 - 5566
  • [23] Robot Tactile Sensing: Vision Based Tactile Sensor for Force Perception
    Zhang, Tao
    Cong, Yang
    Li, Xiaomao
    Peng, Yan
    2018 IEEE 8TH ANNUAL INTERNATIONAL CONFERENCE ON CYBER TECHNOLOGY IN AUTOMATION, CONTROL, AND INTELLIGENT SYSTEMS (IEEE-CYBER), 2018, : 1360 - 1365
  • [24] Fusion of Non-Visual Modalities Into the Probabilistic Occupancy Map Framework for Person Localization
    Mandeljc, Rok
    Pers, Janez
    Kristan, Matej
    Kovacic, Stanislav
    2011 FIFTH ACM/IEEE INTERNATIONAL CONFERENCE ON DISTRIBUTED SMART CAMERAS (ICDSC), 2011,
  • [25] Tactile-GAT: tactile graph attention networks for robot tactile perception classification
    Chen, Lun
    Zhu, Yingzhao
    Li, Man
    SCIENTIFIC REPORTS, 2024, 14 (01):
  • [26] A Biomimetic Ionic Hydrogel Synapse for Self-Powered Tactile-Visual Fusion Perception
    Wu, Jie
    Zhang, Lei
    Chang, Wenbo
    Zhang, Hongjie
    Zhang, Wei
    Mei, Tingting
    Zhu, Xinyi
    Wang, Li
    Zhang, Mingming
    Xiao, Kai
    ADVANCED FUNCTIONAL MATERIALS, 2025,
  • [27] Visual-Tactile Perception of Biobased Composites
    Thundathil, Manu
    Nazmi, Ali Reza
    Shahri, Bahareh
    Emerson, Nick
    Muessig, Joerg
    Huber, Tim
    MATERIALS, 2023, 16 (05)
  • [28] Tactile stimulation can suppress visual perception
    Ide, Masakazu
    Hidaka, Souta
    SCIENTIFIC REPORTS, 3
  • [29] Effective Tactile Noise Facilitates Visual Perception
    Lugo, J. E.
    Doti, R.
    Faubert, J.
    SEEING AND PERCEIVING, 2012, 25 (01): : 29 - 44
  • [30] Visual and tactile perception techniques for braille recognition
    Park, Byeong-Sun
    Im, Seong-Min
    Lee, Hojun
    Lee, Young Tack
    Nam, Changjoo
    Hong, Sungeun
    Kim, Min-gu
    MICRO AND NANO SYSTEMS LETTERS, 2023, 11 (01)