Abstract: A grasp posture optimization method based on a deep neural network model is proposed for multi-fingered dexterous hands. First, a grasp dataset is constructed in a simulation environment, and a convolutional neural network is then trained on it to predict the grasp quality from monocular visual information of the target object together with the grasp configuration of the multi-fingered dexterous hand. The grasp planning problem of the multi-fingered dexterous hand is thereby transformed into an optimization problem of maximizing the grasp quality, and the backpropagation and gradient-ascent machinery of deep learning is used to iteratively optimize the grasp postures. In simulation, the grasp quality scores computed by the proposed network and by the simulation platform for the same grasp configuration are compared. The proposed method is then applied to optimize randomly sampled initial postures, and the force-closure metrics of the postures before and after optimization are compared. Finally, the optimization performance of the proposed method is validated on a real robot platform. The results show that the grasping success rate of the proposed method on unknown objects exceeds 80%, and that re-optimizing the failed grasps raises their success rate to 90%.
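The optimization step described in the abstract, treating the grasp configuration as the only trainable variable of a frozen quality network, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the predictor quality_net(image, grasp), the tensor shapes, and all identifiers are assumptions introduced here, and Adam is used as one concrete realization of the gradient-ascent update on the negated quality score.

import torch

def optimize_grasp(quality_net, image, grasp_init, steps=100, lr=1e-2):
    # quality_net: trained CNN, quality_net(image, grasp) -> predicted quality score (hypothetical interface)
    # image:       monocular view of the target object, shape (1, C, H, W)
    # grasp_init:  initial grasp configuration (palm pose + finger joints), shape (1, D)
    quality_net.eval()                        # inference mode for the predictor
    for p in quality_net.parameters():
        p.requires_grad_(False)               # freeze weights; only the grasp is optimized
    grasp = grasp_init.clone().requires_grad_(True)
    optimizer = torch.optim.Adam([grasp], lr=lr)
    for _ in range(steps):
        optimizer.zero_grad()
        quality = quality_net(image, grasp)   # forward pass: predicted grasp quality
        (-quality).sum().backward()           # gradient ascent = descent on the negated score
        optimizer.step()                      # update the grasp configuration only
    return grasp.detach()

In practice such an iterate would presumably also be projected back into the hand's joint limits and the reachable workspace after each step; that constraint handling is omitted from this sketch.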