YANG Tianhao, LI Yunkai, WANG Yaxin, MENG Qinghao. Touch Gesture Recognition for Service Robots[J]. ROBOT, 2022, 44(3): 310-320. DOI: 10.13973/j.cnki.robot.210045

Touch Gesture Recognition for Service Robots

More Information
  • Received Date: January 24, 2021
  • Revised Date: May 27, 2021
  • Accepted Date: May 25, 2021
  • Available Online: October 24, 2022
  • To give robots tactile perception during human-robot interaction, a touch gesture recognition method for service robots is proposed. First, electronic skin is installed on a service robot, and an affective gesture dataset is built by collecting signals of 10 kinds of touch gestures from 15 subjects. A factorized spatio-temporal convolutional neural network, the (2+1)D CNN, is then used to classify the gestures applied to the robot. The within-subject and cross-subject recognition accuracies reach 90.25% and 83.44%, respectively. By tuning the adjustment factor of the model's spatio-temporal channels, the number of parameters can be reduced substantially with only a slight drop in accuracy. These electronic-skin touch gesture recognition experiments preliminarily indicate that the (2+1)D CNN can recognize human touch gestures with high accuracy at low computational cost, enabling emotional interaction between service robots and humans through electronic skin.
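As a rough illustration of the factorization the abstract describes, the sketch below (PyTorch assumed) splits a full t x k x k 3D convolution into a 1 x k x k spatial convolution followed by a t x 1 x 1 temporal one, the core idea of the (2+1)D CNN. The mid-channel formula follows the parameter-matching rule of Tran et al.'s (2+1)D design; the `alpha` factor is a hypothetical stand-in for the paper's spatio-temporal channel adjustment factor, and all tensor sizes (32 frames of a 16x16 taxel grid) are illustrative, not taken from the paper.

```python
# Minimal sketch of a (2+1)D convolution block, assuming PyTorch.
import torch
import torch.nn as nn

class Conv2Plus1D(nn.Module):
    def __init__(self, in_ch, out_ch, k=3, t=3, alpha=1.0):
        super().__init__()
        # Mid-channel count that matches the parameter budget of a full
        # t x k x k 3D convolution (Tran et al., CVPR 2018), scaled by
        # "alpha", a hypothetical channel adjustment factor.
        mid = int(alpha * (t * k * k * in_ch * out_ch)
                  / (k * k * in_ch + t * out_ch))
        mid = max(mid, 1)
        # 2D spatial convolution applied frame by frame.
        self.spatial = nn.Conv3d(in_ch, mid, kernel_size=(1, k, k),
                                 padding=(0, k // 2, k // 2), bias=False)
        self.bn = nn.BatchNorm3d(mid)
        self.relu = nn.ReLU(inplace=True)
        # 1D temporal convolution across frames.
        self.temporal = nn.Conv3d(mid, out_ch, kernel_size=(t, 1, 1),
                                  padding=(t // 2, 0, 0), bias=False)

    def forward(self, x):  # x: (batch, channels, frames, height, width)
        return self.temporal(self.relu(self.bn(self.spatial(x))))

# Example: 32 pressure frames from a 16x16 electronic-skin grid
# (sizes are illustrative only).
x = torch.randn(1, 1, 32, 16, 16)
block = Conv2Plus1D(in_ch=1, out_ch=64, alpha=0.5)  # alpha < 1 shrinks the model
print(block(x).shape)  # torch.Size([1, 64, 32, 16, 16])
```

Setting `alpha` below 1 shrinks the intermediate channels and hence the parameter count, mirroring the abstract's observation that the model can be made much smaller at a small cost in accuracy.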
  • Cited by

    Cited by periodicals (5)

    1. Zou L G, Zhang M H. Human-computer interaction image gesture recognition based on mathematical statistical features[J]. Journal of Heilongjiang University of Technology (Comprehensive Edition), 2024(01): 97-104.
    2. Zheng Y J, Li C Y, Zheng Z F. Interactive gesture recognition for service robots based on RNN recurrent neural network[J]. Machinery Design & Manufacture, 2024(04): 282-285.
    3. Wang J, Wang J M, Yan S Z. Research on intelligent service robot design based on the analytic hierarchy process[J]. Industrial Design, 2023(07): 152-155.
    4. Geng Y X. Service robot applications based on artificial intelligence and 5G technology[J]. Changjiang Information & Communications, 2022(01): 235-237.
    5. Li G X, Ma K K, Wang W B. Research on gesture recognition methods based on HOG feature extraction and SVM[J]. Sensor World, 2022(12): 30-36.

    Other citation types (9)
