Abstract: To address the problem that a robot must estimate grasp quality in real time in order to dynamically adjust its grasp configuration during grasping, a stable robotic grasping method based on prior tactile knowledge learning is proposed. Firstly, a tactile-information-based method is put forward for evaluating grasp quality according to the capacity to resist external perturbations during grasping. On this basis, a visual-tactile joint dataset is built, and the prior tactile knowledge is learned from it. Secondly, an architecture is proposed to generate stable grasp configurations by fusing the visual image with the prior tactile knowledge. Finally, experimental verification with 10 kinds of objects is carried out on the proposed robotic grasping system. The results show that grasp stability is improved by 55% with the proposed method over traditional vision-based methods, and the success rates of stable grasping are 86% and 79% on known and unknown objects respectively, demonstrating the good generalization performance of the proposed method.
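The abstract gives no implementation details, so the following is only a minimal PyTorch sketch of the kind of fusion architecture it describes: a visual branch encoding an image of a candidate grasp and a second branch embedding the learned prior tactile knowledge, fused to score grasp stability. All module names, feature dimensions, the representation of the tactile prior as a fixed-length vector, and the fusion-by-concatenation design are illustrative assumptions, not the authors' actual network.

```python
# Illustrative sketch (assumptions throughout) of a visual-tactile fusion
# network for scoring grasp stability; the paper's architecture may differ.
import torch
import torch.nn as nn

class VisualEncoder(nn.Module):
    """Encodes an RGB crop around a candidate grasp into a feature vector."""
    def __init__(self, feat_dim: int = 128):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=5, stride=2, padding=2), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),          # global pooling -> (B, 64, 1, 1)
        )
        self.fc = nn.Linear(64, feat_dim)

    def forward(self, img):                   # img: (B, 3, H, W)
        return self.fc(self.conv(img).flatten(1))

class TactilePriorEncoder(nn.Module):
    """Embeds the prior tactile knowledge, assumed here to be a fixed-length
    vector learned offline from the visual-tactile joint dataset."""
    def __init__(self, tactile_dim: int = 32, feat_dim: int = 128):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(tactile_dim, 64), nn.ReLU(),
            nn.Linear(64, feat_dim),
        )

    def forward(self, tactile_prior):          # (B, tactile_dim)
        return self.mlp(tactile_prior)

class GraspStabilityNet(nn.Module):
    """Fuses visual and tactile-prior features into a stability score in [0, 1];
    candidate grasp configurations can then be ranked by this score."""
    def __init__(self):
        super().__init__()
        self.visual = VisualEncoder()
        self.tactile = TactilePriorEncoder()
        self.head = nn.Sequential(
            nn.Linear(256, 64), nn.ReLU(),
            nn.Linear(64, 1), nn.Sigmoid(),
        )

    def forward(self, img, tactile_prior):
        fused = torch.cat([self.visual(img), self.tactile(tactile_prior)], dim=1)
        return self.head(fused)                # (B, 1) stability score

if __name__ == "__main__":
    net = GraspStabilityNet()
    score = net(torch.randn(2, 3, 96, 96), torch.randn(2, 32))
    print(score.shape)  # torch.Size([2, 1]); a higher score = a more stable grasp
```

Under these assumptions, training would use the visual-tactile dataset's perturbation-resistance labels as supervision for the stability score, and at run time the highest-scoring candidate grasp configuration would be executed.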