Refining Grasp Affordance Models by Experience

Cited by: 22
Authors
Detry, Renaud [1 ]
Kraft, Dirk [2 ]
Buch, Anders Glent [2 ]
Krueger, Norbert [2 ]
Piater, Justus [1 ]
Affiliations
[1] Univ Liege, B-4000 Liege, Belgium
[2] Univ Southern Denmark, Odense, Denmark
DOI
10.1109/ROBOT.2010.5509126
Chinese Library Classification (CLC)
TP [Automation technology, computer technology]
Discipline classification code
0812
Abstract
We present a method for learning object grasp affordance models in 3D from experience, and demonstrate its applicability through extensive testing and evaluation on a realistic and largely autonomous platform. Grasp affordance refers here to relative object-gripper configurations that yield stable grasps. These affordances are represented probabilistically with grasp densities, which correspond to continuous density functions defined on the space of 6D gripper poses. A grasp density characterizes an object's grasp affordance; densities are linked to visual stimuli through registration with a visual model of the object they characterize. We explore a batch-oriented, experience-based learning paradigm where grasps sampled randomly from a density are performed, and an importance-sampling algorithm learns a refined density from the outcomes of these experiences. The first such learning cycle is bootstrapped with a grasp density formed from visual cues. We show that the robot effectively applies its experience by downweighting poor grasp solutions, which results in increased success rates at subsequent learning cycles. We also present success rates in a practical scenario where a robot needs to repeatedly grasp an object lying in an arbitrary pose; each pose imposes a specific reaching constraint and thus forces the robot to make use of the entire grasp density to select the most promising achievable grasp.
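The abstract describes an experience-driven refinement loop: sample grasps from the current density, execute them, and use importance sampling over the outcomes to fit a refined density. The sketch below illustrates that loop under simplifying assumptions that the paper does not make: gripper poses are plain 6D vectors (x, y, z, roll, pitch, yaw), densities are Euclidean Gaussian kernel density estimates rather than the paper's kernel densities on SE(3), and a toy simulated trial stands in for a real robot grasp. All function names (bootstrap_density, refine_density, simulated_trial) are hypothetical.

```python
import numpy as np
from scipy.stats import gaussian_kde

# Minimal sketch of experience-based grasp-density refinement.
# Assumptions: 6D Euclidean pose vectors and Gaussian KDEs; the paper
# uses kernel densities over SE(3) gripper poses, not reproduced here.

rng = np.random.default_rng(0)

def bootstrap_density(visual_grasp_hypotheses):
    """Fit the initial (bootstrap) grasp density from visually generated
    grasp hypotheses, given as an array of shape (6, n_samples)."""
    return gaussian_kde(visual_grasp_hypotheses)

def refine_density(density, execute_grasp, n_trials=100):
    """Sample grasps from the current density, execute them, and fit a
    refined density from the successful outcomes via importance sampling."""
    samples = density.resample(n_trials, seed=rng)          # shape (6, n_trials)
    outcomes = np.array([execute_grasp(samples[:, i]) for i in range(n_trials)])
    successes = samples[:, outcomes]                        # keep successful grasps only
    if successes.shape[1] < 10:                             # too few points to refit a 6D KDE
        return density
    # Successful grasps were drawn from the proposal (the current density),
    # so weight each by 1 / proposal probability to approximate the
    # underlying distribution of successful gripper poses.
    weights = 1.0 / density(successes)
    return gaussian_kde(successes, weights=weights / weights.sum())

# Toy usage: a hypothetical simulated trial that succeeds near a target pose.
target_pose = np.zeros(6)
def simulated_trial(pose):
    return bool(np.linalg.norm(pose - target_pose) < 2.0)

visual_hypotheses = rng.normal(0.0, 1.0, size=(6, 200))    # stand-in for visual grasp cues
density = bootstrap_density(visual_hypotheses)
for _ in range(3):                                          # repeated learning cycles
    density = refine_density(density, simulated_trial, n_trials=200)
```

In this toy loop, poor grasp regions receive no successful samples and are thus downweighted in the refitted density, mirroring the effect reported in the abstract; the paper's actual method additionally handles orientation kernels and reaching constraints, which this sketch omits.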
Pages: 2287-2293
Number of pages: 7
Related papers
50 records in total
  • [1] Learning to Grasp Familiar Objects Based on Experience and Objects' Shape Affordance
    Liu, Chunfang
    Fang, Bin
    Sun, Fuchun
    Li, Xiaoli
    Huang, Wenbing
    IEEE TRANSACTIONS ON SYSTEMS MAN CYBERNETICS-SYSTEMS, 2019, 49 (12): 2710-2723
  • [2] Learning Grasp Affordance Densities
    Detry, R.
    Kraft, D.
    Kroemer, O.
    Bodenhagen, L.
    Peters, J.
    Krüger, N.
    Piater, J.
    Paladyn, 2011, 2 (01): 1-17
  • [3] Learning to Detect Visual Grasp Affordance
    Song, Hyun Oh
    Fritz, Mario
    Goehring, Daniel
    Darrell, Trevor
    IEEE TRANSACTIONS ON AUTOMATION SCIENCE AND ENGINEERING, 2016, 13 (02): 798-809
  • [4] Developing Intelligent Robots that Grasp Affordance
    Loeb, Gerald E.
    FRONTIERS IN ROBOTICS AND AI, 2022, 9
  • [5] Self-Assessment of Grasp Affordance Transfer
    Ardon, Paola
    Pairet, Eric
    Petillot, Yvan
    Petrick, Ronald P. A.
    Ramamoorthy, Subramanian
    Lohan, Katrin S.
    2020 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS (IROS), 2020: 9385-9392
  • [6] METAMORPHIC HAND BASED GRASP CONSTRAINT AND AFFORDANCE
    Wei, Guowu
    Sun, Jie
    Zhang, Xinsheng
    Pensky, Dirk
    Piater, Justus
    Dai, Jian S.
    INTERNATIONAL DESIGN ENGINEERING TECHNICAL CONFERENCES AND COMPUTERS AND INFORMATION IN ENGINEERING CONFERENCE, 2015, VOL 5B, 2016
  • [7] The role of social affordance on object grasp action
    Hai, Min
    Wang, Yonghui
    Li, Jinyu
    Zou, Meng
    INTERNATIONAL JOURNAL OF PSYCHOLOGY, 2023, 58: 233
  • [8] Learning Grasp Affordance Reasoning Through Semantic Relations
    Ardon, Paola
    Pairet, Eric
    Petrick, Ronald P. A.
    Ramamoorthy, Subramanian
    Lohan, Katrin S.
    IEEE ROBOTICS AND AUTOMATION LETTERS, 2019, 4 (04): 4571-4578
  • [9] Learning Object-specific Grasp Affordance Densities
    Detry, R.
    Baseski, E.
    Popovic, M.
    Touati, Y.
    Krueger, N.
    Kroemer, O.
    Peters, J.
    Piater, J.
    2009 IEEE 8TH INTERNATIONAL CONFERENCE ON DEVELOPMENT AND LEARNING, 2009: 92+
  • [10] Learning Generalizable Dexterous Manipulation from Human Grasp Affordance
    Wu, Yueh-Hua
    Wang, Jiashun
    Wang, Xiaolong
    CONFERENCE ON ROBOT LEARNING, 2022, 205: 618-629