Grasping Gesture Optimization of Multi-fingered Dexterous Hands Based on Deep Neural Networks

Cited by: 0
Authors
He H. [1 ]
Shang W. [1 ]
Zhang F. [1 ]
Cong S. [1 ]
Affiliations
[1] Department of Automation, University of Science and Technology of China, Hefei
Source
Jiqiren/Robot | 2023, Vol. 45, No. 1
Keywords
convolutional neural network; grasp planning; multi-fingered dexterous hand; unknown object
DOI
10.13973/j.cnki.robot.210376
Abstract
A grasping gesture optimization method based on a deep neural network model is proposed for multi-fingered dexterous hands. Firstly, a grasp dataset is constructed in a simulation environment, and a convolutional neural network is then trained on it to predict a grasp quality function from the monocular visual information of the target object and the grasp configuration of the multi-fingered dexterous hand. The grasp planning problem of the multi-fingered dexterous hand is thus transformed into an optimization problem of maximizing the grasp quality. The backpropagation and gradient ascent algorithms of deep learning are then used to iteratively optimize the grasping gestures of the multi-fingered dexterous hand. In simulation, the grasp quality evaluations computed by the proposed network and by the simulation platform for the same grasp configuration are compared. The proposed method is then applied to optimize randomly searched initial gestures, and the force closure metrics of the gestures before and after optimization are compared. Finally, the optimization performance of the proposed method is validated on a real robot platform. The results show that the grasping success rate of the proposed method on unknown objects exceeds 80%, and that for initially failed grasps the success rate after optimization reaches 90%. © 2023 Chinese Academy of Sciences. All rights reserved.
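The core idea described in the abstract (a trained CNN predicts grasp quality from an image and a grasp configuration, and gradient ascent on the configuration with frozen network weights refines the grasp) can be illustrated with a minimal PyTorch sketch. This is not the authors' code: the network architecture, input sizes, grasp-vector dimension, and hyperparameters below are illustrative assumptions.

```python
# Minimal sketch of gradient-ascent grasp refinement under a learned quality
# function. GraspQualityCNN and all sizes are hypothetical stand-ins.
import torch
import torch.nn as nn

class GraspQualityCNN(nn.Module):
    """Maps (image, grasp configuration) to a scalar grasp quality in [0, 1]."""
    def __init__(self, grasp_dim=9):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 5, stride=2), nn.ReLU(),
            nn.Conv2d(16, 32, 5, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(4), nn.Flatten(),
        )
        self.head = nn.Sequential(
            nn.Linear(32 * 4 * 4 + grasp_dim, 128), nn.ReLU(),
            nn.Linear(128, 1), nn.Sigmoid(),
        )

    def forward(self, image, grasp):
        return self.head(torch.cat([self.features(image), grasp], dim=1))

def optimize_grasp(model, image, grasp_init, steps=50, lr=1e-2):
    """Refine a grasp configuration by gradient ascent on predicted quality,
    keeping the network weights frozen."""
    model.eval()
    for p in model.parameters():
        p.requires_grad_(False)
    grasp = grasp_init.detach().clone().requires_grad_(True)
    opt = torch.optim.Adam([grasp], lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        quality = model(image, grasp).mean()
        (-quality).backward()   # maximize quality = minimize its negative
        opt.step()
    return grasp.detach()

# Usage with random tensors standing in for a monocular image and an initial
# grasp pose (e.g. wrist pose plus finger joint angles):
model = GraspQualityCNN(grasp_dim=9)
image = torch.rand(1, 1, 96, 96)
g0 = torch.rand(1, 9)
g_opt = optimize_grasp(model, image, g0)
```

In this sketch only the grasp vector is passed to the optimizer, so backpropagation supplies gradients of the predicted quality with respect to the grasp configuration while the image and network stay fixed, mirroring the optimization formulation described above.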
Pages: 38-47
Number of pages: 9