Domain Randomization and Generative Models for Robotic Grasping

Citations: 0
Authors
Tobin, Josh [1,2]
Biewald, Lukas [3]
Duan, Rocky [4]
Andrychowicz, Marcin [1]
Handa, Ankur [5]
Kumar, Vikash [1,2]
McGrew, Bob [1]
Ray, Alex [1]
Schneider, Jonas [1]
Welinder, Peter [1]
Zaremba, Wojciech [1]
Abbeel, Pieter [2,4]
Affiliations
[1] OpenAI, San Francisco, CA 94110 USA
[2] Univ Calif Berkeley, Berkeley, CA 94720 USA
[3] Weights & Biases Inc, San Francisco, CA USA
[4] Embodied Intelligence, Emeryville, CA USA
[5] NVIDIA, Santa Clara, CA USA
Keywords: (none listed)
DOI: not available
CLC classification: TP18 [Artificial Intelligence Theory]
Subject classification codes: 081104; 0812; 0835; 1405
Abstract
Deep learning-based robotic grasping has made significant progress thanks to algorithmic improvements and increased data availability. However, state-of-the-art models are often trained on as few as hundreds or thousands of unique object instances, and, as a result, generalization can be a challenge. In this work, we explore a novel data generation pipeline for training a deep neural network to perform grasp planning that applies the idea of domain randomization to object synthesis. We generate millions of unique, unrealistic procedurally generated objects, and train a deep neural network to perform grasp planning on these objects. Since the distribution of successful grasps for a given object can be highly multimodal, we propose an autoregressive grasp planning model that maps sensor inputs of a scene to a probability distribution over possible grasps. This model allows us to sample grasps efficiently at test time (or avoid sampling entirely). We evaluate our model architecture and data generation pipeline in simulation and the real world. We find we can achieve a >90% success rate on previously unseen realistic objects at test time in simulation despite having only been trained on random objects. We also demonstrate an 80% success rate on real-world grasp attempts despite having only been trained on random simulated objects.
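The data generation idea in the abstract, applying domain randomization to object synthesis, can be illustrated with a short sketch. The code below composes each training object from randomly scaled, placed, and oriented box primitives; the helper names, primitive set, and parameter ranges are illustrative assumptions, not the authors' exact pipeline.

```python
import numpy as np

def random_primitive(rng):
    """One randomly scaled, placed, and oriented box primitive.

    All ranges below are assumed for illustration only.
    """
    return {
        "type": "box",
        "half_extents": rng.uniform(0.01, 0.05, size=3),  # meters
        "offset": rng.uniform(-0.04, 0.04, size=3),       # meters
        "euler": rng.uniform(-np.pi, np.pi, size=3),      # radians
    }

def random_object(rng, max_primitives=8):
    """Compose one unrealistic object from 1..max_primitives primitives."""
    n_parts = rng.integers(1, max_primitives + 1)
    return [random_primitive(rng) for _ in range(n_parts)]

if __name__ == "__main__":
    rng = np.random.default_rng(seed=0)
    # In the paper's setting, millions of such objects are generated;
    # here we build a small batch just to show the idea.
    objects = [random_object(rng) for _ in range(1000)]
    print(len(objects), "objects;", len(objects[0]), "primitives in the first")
```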
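The autoregressive grasp model can be sketched the same way: because the distribution of successful grasps is multimodal, the grasp is factored into coordinates and sampled one discretized coordinate at a time, each conditioned on the coordinates already chosen (and, implicitly, on the sensor input, omitted here). The `conditional_logits` callable below is a hypothetical stand-in for the learned network, and the coordinate count and bin count are assumptions for illustration.

```python
import numpy as np

def softmax(logits):
    """Numerically stable softmax over a 1-D array of logits."""
    z = logits - logits.max()
    exp_z = np.exp(z)
    return exp_z / exp_z.sum()

def sample_grasp(conditional_logits, rng, n_coords=4, n_bins=20):
    """Sample a grasp one discretized coordinate at a time.

    conditional_logits(prefix) stands in for the learned network: given
    the coordinate bins chosen so far (and, implicitly, the image), it
    returns logits over the n_bins discretized values of the next
    coordinate. The autoregressive factorization is what lets the model
    represent highly multimodal grasp distributions.
    """
    grasp = []
    for _ in range(n_coords):  # e.g. (x, y, z, wrist angle)
        probs = softmax(conditional_logits(grasp))
        grasp.append(int(rng.choice(n_bins, p=probs)))
    return grasp

if __name__ == "__main__":
    rng = np.random.default_rng(seed=0)
    # Toy stand-in network: uniform logits, just to make the sketch runnable.
    print(sample_grasp(lambda prefix: np.zeros(20), rng))
```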
Pages: 3482-3489
Page count: 8