Human Grasp Feature Learning and Object Recognition Based on Multi-sensor Information Fusion
ZHANG Yangyang1, HUANG Ying1,2, LIU Yue1, LIU Caixia1, LIU Ping1, ZHANG Yugang1
1. School of Electronic Science & Applied Physics, Hefei University of Technology, Hefei 230009, China; 2. The State Key Laboratory of Bioelectronics, Southeast University, Nanjing 210096, China
Abstract: Human grasping feature learning and object recognition based on flexible wearable sensors and multi-modal information fusion are studied, and the application of perceptual information to the human grasping process is explored. A data glove is built from 10 strain sensors, 14 temperature sensors and 78 pressure sensors; worn on the human hand, it measures the bending angles of the finger joints as well as the temperature and pressure distribution of the grasped object during grasping. A cross-modal information representation is established over temporal and spatial sequences, and the multi-modal information is fused by a deep convolutional neural network to construct a learning model of human grasping features and realize accurate recognition of the grasped object. Experiments and validity analyses are carried out for the joint-angle, temperature and pressure features respectively. The results show that fusing the multi-modal information from the multiple sensors enables accurate recognition of 18 kinds of objects.
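To make the fusion scheme concrete, the following is a minimal sketch (not the authors' implementation) of how the glove's three modalities could be fused by a convolutional network for 18-class object recognition: each modality (10 strain channels, 14 temperature channels, 78 pressure channels, sampled over a time window) is encoded by its own 1-D convolutional branch, the per-modality descriptors are concatenated, and a small classifier head produces the object label. The framework (PyTorch), layer sizes, and the window length are assumptions for illustration only.

    # Minimal multi-modal fusion sketch; architecture details are assumed, not taken from the paper.
    import torch
    import torch.nn as nn

    class GraspFusionNet(nn.Module):
        def __init__(self, num_classes=18):
            super().__init__()
            # One 1-D convolutional branch per modality: sensor channels are the
            # input channels, the time window is the sequence axis.
            def branch(in_ch):
                return nn.Sequential(
                    nn.Conv1d(in_ch, 32, kernel_size=5, padding=2),
                    nn.ReLU(),
                    nn.MaxPool1d(2),
                    nn.Conv1d(32, 64, kernel_size=3, padding=1),
                    nn.ReLU(),
                    nn.AdaptiveAvgPool1d(1),      # -> (batch, 64, 1)
                )
            self.strain = branch(10)       # joint-angle (strain) sensors
            self.temp = branch(14)         # temperature sensors
            self.pressure = branch(78)     # pressure-distribution sensors
            # Feature-level fusion: concatenate the per-modality descriptors.
            self.classifier = nn.Sequential(
                nn.Linear(64 * 3, 128),
                nn.ReLU(),
                nn.Dropout(0.5),
                nn.Linear(128, num_classes),
            )

        def forward(self, strain, temp, pressure):
            # Each input has shape (batch, channels, time).
            f = torch.cat([
                self.strain(strain).squeeze(-1),
                self.temp(temp).squeeze(-1),
                self.pressure(pressure).squeeze(-1),
            ], dim=1)
            return self.classifier(f)

    if __name__ == "__main__":
        net = GraspFusionNet()
        # One synthetic grasp sample per modality, 64 time steps each (window length is hypothetical).
        logits = net(torch.randn(1, 10, 64), torch.randn(1, 14, 64), torch.randn(1, 78, 64))
        print(logits.shape)  # torch.Size([1, 18])

A per-modality branch followed by feature-level concatenation is only one plausible reading of "multi-modal information fusion by deep convolutional neural network"; the actual network in the paper may differ in depth, fusion point, and treatment of the spatial layout of the pressure array.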