Robot Learning of Everyday Object Manipulations via Human Demonstration

Cited by: 18
Authors:
Dang, Hao [1]
Allen, Peter K. [1]
Affiliation:
[1] Columbia Univ, Dept Comp Sci, New York, NY 10027 USA
DOI: 10.1109/IROS.2010.5651244
CLC classification: TP18 [Artificial Intelligence Theory]
Subject classification codes: 081104; 0812; 0835; 1405
Abstract
We deal with the problem of teaching a robot to manipulate everyday objects through human demonstration. We first design a task descriptor which encapsulates important elements of a task. The design originates from observations that manipulations involved in many everyday object tasks can be considered as a series of sequential rotations and translations, which we call manipulation primitives. We then propose a method that enables a robot to decompose a demonstrated task into sequential manipulation primitives and construct a task descriptor. We also show how to transfer a task descriptor learned from one object to similar objects. In the end, we argue that this framework is highly generic. Particularly, it can be used to construct a robot task database that serves as a manipulation knowledge base for a robot to succeed in manipulating everyday objects.
Pages: 1284 - 1289
Page count: 6
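
The abstract describes a task descriptor built from an ordered sequence of manipulation primitives (rotations and translations). The following is a minimal illustrative sketch, not taken from the paper, of how such a descriptor might be represented as a data structure; the class and field names (ManipulationPrimitive, TaskDescriptor, and the door example) are hypothetical assumptions for illustration only.

```python
# Illustrative sketch only (not the authors' code): models a task descriptor as an
# ordered sequence of manipulation primitives, i.e. rotations and translations,
# as described in the abstract. All names are hypothetical.
from dataclasses import dataclass, field
from typing import List, Tuple


@dataclass
class ManipulationPrimitive:
    """One step of a demonstrated manipulation: a rotation or a translation."""
    kind: str                           # "rotation" or "translation"
    axis: Tuple[float, float, float]    # rotation axis or translation direction (unit vector)
    magnitude: float                    # angle in radians, or distance in meters


@dataclass
class TaskDescriptor:
    """Sequence of primitives that encapsulates a demonstrated task on an object class."""
    object_class: str
    primitives: List[ManipulationPrimitive] = field(default_factory=list)

    def append(self, primitive: ManipulationPrimitive) -> None:
        self.primitives.append(primitive)


# Example: a door-opening style task, assuming it decomposes into a rotation about
# the hinge axis followed by a short pull (translation) along the handle normal.
door_task = TaskDescriptor(object_class="door")
door_task.append(ManipulationPrimitive("rotation", (0.0, 0.0, 1.0), 0.6))
door_task.append(ManipulationPrimitive("translation", (1.0, 0.0, 0.0), 0.05))
```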
Related papers (50 records in total)
  • [1] An Object Attribute Guided Framework for Robot Learning Manipulations from Human Demonstration Videos
    Zhang, Qixiang
    Chen, Junhong
    Liang, Dayong
    Liu, Huaping
    Zhou, Xiaojing
    Ye, Zihan
    Liu, Wenyin
    2019 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS (IROS), 2019, : 6113 - 6119
  • [2] Human-robot skill transmission for mobile robot via learning by demonstration
    Li, Jiehao
    Wang, Junzheng
    Wang, Shoukun
    Yang, Chenguang
    NEURAL COMPUTING & APPLICATIONS, 2023, 35 (32): 23441 - 23451
  • [3] Online Object and Task Learning via Human Robot Interaction
    Dehghan, Masood
    Zhang, Zichen
    Siam, Mennatullah
    Jin, Jun
    Petrich, Laura
    Jagersand, Martin
    2019 INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION (ICRA), 2019, : 2132 - 2138
  • [4] An Intuitive Robot Learning from Human Demonstration
    Ogenyi, Uchenna Emeoha
    Zhang, Gongyue
    Yang, Chenguang
    Ju, Zhaojie
    Liu, Honghai
    INTELLIGENT ROBOTICS AND APPLICATIONS (ICIRA 2018), PT I, 2018, 10984 : 176 - 185
  • [5] Embodied AR Language Learning Through Everyday Object Interactions: A Demonstration of EARLL
    Lee, Jaewook
    Kim, Sieun
    Park, Minji
    Rasgaitis, Catherine L.
    Froehlich, Jon E.
    PROCEEDINGS OF THE 37TH ANNUAL ACM SYMPOSIUM ON USER INTERFACE SOFTWARE AND TECHNOLOGY, UIST ADJUNCT 2024, 2024,
  • [6] Learning Grasping for Robot with Parallel Gripper from Human Demonstration via Contact Analysis
    Zhang, Zhengshen
    Liu, Chenchen
    Zhou, Lei
    Sun, Jiawei
    Liu, Zhiyang
    Ang, Marcelo H., Jr.
    Lu, Wen Feng
    Tay, Francis E. H.
    2024 9TH INTERNATIONAL CONFERENCE ON CONTROL AND ROBOTICS ENGINEERING, ICCRE 2024, 2024, : 86 - 91
  • [7] Learning Under-specified Object Manipulations From Human Demonstrations
    Qian, Kun
    Xu, Jun
    Gao, Ge
    Fang, Fang
    Ma, Xudong
    2018 15TH INTERNATIONAL CONFERENCE ON CONTROL, AUTOMATION, ROBOTICS AND VISION (ICARCV), 2018, : 1936 - 1941
  • [8] Everyday Objects Rearrangement in a Human-Like Manner via Robotic Imagination and Learning From Demonstration
    Mendez, Alberto
    Prados, Adrian
    Menendez, Elisabeth
    Barber, Ramon
    IEEE ACCESS, 2024, 12 : 92098 - 92119