Using human gaze in few-shot imitation learning for robot manipulation

Cited: 1
Authors
Hamano, Shogo [1 ]
Kim, Heecheol [1 ]
Ohmura, Yoshiyuki [1 ]
Kuniyoshi, Yasuo [1 ]
Affiliations
[1] Univ Tokyo, Grad Sch Informat Sci & Technol, Lab Intelligent Syst & Informat, Bunkyo Ku, 7-3-1 Hongo, Tokyo, Japan
Source
2022 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS (IROS) | 2022
Keywords
Imitation Learning; Deep Learning in Grasping and Manipulation; Few-shot Learning; Meta-learning; Telerobotics and Teleoperation
DOI
10.1109/IROS47612.2022.9981706
Chinese Library Classification (CLC)
TP [Automation Technology; Computer Technology]
Subject Classification Code
0812
Abstract
Imitation learning has attracted attention as a method for realizing complex robot control without hand-programmed behavior. Meta-imitation learning has been proposed to address the high cost of data collection and the low generalizability to new tasks from which imitation learning suffers. By learning multiple tasks during training, meta-imitation learning can acquire new tasks involving unknown objects from a small amount of data. However, meta-imitation learning, especially from images, remains vulnerable to changes in the background, which occupies a large portion of the input image. This study introduces human gaze into meta-imitation-learning-based robot control. Measuring gaze with an eye tracker built into a head-mounted display, we created a model with model-agnostic meta-learning that predicts the gaze position from the image. Using the image region around the predicted gaze position as input makes the model robust to changes in visual information. We experimentally verified the performance of the proposed method on picking tasks with a simulated robot. The results indicate that the proposed method has a greater ability than the conventional method to learn a new task from only 9 demonstrations, even when the object's color or the background pattern changes between training and testing.
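The abstract describes two steps: predicting a gaze position from the camera image with a model trained via model-agnostic meta-learning, and then feeding the image region around that predicted position to the manipulation policy. The sketch below illustrates only the second, gaze-centered cropping step; the function name crop_around_gaze, the 64-pixel window size, and the boundary clamping are illustrative assumptions, not the authors' implementation.

    # Minimal sketch (not the authors' code): extract a gaze-centered patch so the
    # policy sees the region around the predicted gaze instead of the full image.
    import numpy as np

    def crop_around_gaze(image: np.ndarray, gaze_xy: tuple, size: int = 64) -> np.ndarray:
        """Return a size x size patch of `image` centered on the predicted gaze.

        image   : H x W x C array (assumes H, W >= size)
        gaze_xy : (x, y) pixel coordinates of the predicted gaze
        size    : side length of the square crop (hypothetical default)
        """
        h, w = image.shape[:2]
        half = size // 2
        # Clamp the crop center so the window stays fully inside the image.
        cx = int(np.clip(gaze_xy[0], half, w - half))
        cy = int(np.clip(gaze_xy[1], half, h - half))
        return image[cy - half:cy + half, cx - half:cx + half]

    # Usage: the gaze-centered patch, rather than the full frame, is what the
    # policy would consume, which is what makes background changes less harmful.
    frame = np.zeros((240, 320, 3), dtype=np.uint8)   # placeholder camera frame
    patch = crop_around_gaze(frame, gaze_xy=(200, 120))
    assert patch.shape == (64, 64, 3)

Because the crop discards most of the background, a policy trained on such patches depends mainly on the appearance of the attended object, which is consistent with the robustness to background changes reported in the abstract.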
Pages: 8622-8629
Number of pages: 8