Interference-resistant Multimodal Flexible Electronic Skin for Robotic Interactive Recognition

  • Abstract: Traditional array-type robotic tactile sensing systems are hard to apply widely because of insufficient interface flexibility and the contradiction between array density and refresh frequency. To improve the tactile sensing efficiency of robots, a multimodal flexible electronic skin inspired by the structure of human skin is proposed, and its application to robotic object recognition and resistance to magnetic field interference is studied. For the first time, silicone rubber mixed with magnetic powder and a distributed flexible tactile sensing array are integrated into a multimodal force-tactile sensing layer, which senses tactile information and force-tactile array information synchronously in the time and space domains and thus solves the weak spatio-temporal pairing problem of traditional tactile sensing systems. A multimodal fusion algorithm based on CNN-SVM-MLP (convolutional neural network, support vector machine, multilayer perceptron) is constructed to fuse the multimodal tactile information efficiently, and a robotic object recognition framework is built on this algorithm to recognize interactive objects accurately. Interaction experiments and analyses on non-magnetic and magnetic objects show that the proposed multimodal flexible electronic skin effectively improves the resistance of the magnetic tactile sensor to magnetic field interference and accurately recognizes 24 types of objects with an accuracy of 99.42%.
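
The abstract names a CNN-SVM-MLP fusion algorithm but does not describe how the three models are connected. The sketch below (Python, using PyTorch and scikit-learn) is a minimal reading under stated assumptions: a CNN branch over frames of the flexible tactile array, an SVM over features extracted from the magnetic tactile signal, and an MLP that fuses the CNN features with the SVM decision scores into a 24-class prediction. The array resolution, feature dimension, layer sizes and fusion order are all illustrative assumptions, not details taken from the paper.

```python
# Illustrative sketch only: the abstract names a CNN-SVM-MLP fusion but does not give
# layer sizes, input shapes or the exact fusion order, so everything below is assumed.
import numpy as np
import torch
import torch.nn as nn
from sklearn.svm import SVC

N_CLASSES = 24            # the paper reports 24 recognized object types
ARRAY_SHAPE = (1, 8, 8)   # assumed resolution of the flexible tactile sensing array
SEQ_FEATURES = 32         # assumed feature vector from the magnetic tactile signal

class ArrayCNN(nn.Module):
    """Assumed CNN branch: spatial features from one tactile-array frame."""
    def __init__(self, out_dim=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(32, out_dim), nn.ReLU(),
        )

    def forward(self, x):
        return self.net(x)

class FusionMLP(nn.Module):
    """Assumed MLP head: fuses CNN features with SVM decision scores."""
    def __init__(self, cnn_dim=64, svm_dim=N_CLASSES, n_classes=N_CLASSES):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(cnn_dim + svm_dim, 128), nn.ReLU(),
            nn.Linear(128, n_classes),
        )

    def forward(self, cnn_feat, svm_scores):
        return self.net(torch.cat([cnn_feat, svm_scores], dim=1))

# SVM branch: classifies features of the magnetic (time-domain) tactile signal; its
# per-class decision scores are reused as one input modality of the fusion MLP.
svm = SVC(kernel="rbf", decision_function_shape="ovr")

def fuse(cnn, mlp, array_frames, seq_features):
    """array_frames: (B, 1, 8, 8) tensor; seq_features: (B, 32) numpy array."""
    cnn_feat = cnn(array_frames)                       # spatial (array) branch
    svm_scores = torch.tensor(svm.decision_function(seq_features),
                              dtype=torch.float32)     # temporal (magnetic) branch
    return mlp(cnn_feat, svm_scores)                   # fused 24-class logits

if __name__ == "__main__":
    # Random placeholder data, only to demonstrate the shapes end to end.
    rng = np.random.default_rng(0)
    X_seq = rng.normal(size=(48, SEQ_FEATURES))
    y = np.arange(48) % N_CLASSES                      # two samples per class
    svm.fit(X_seq, y)
    logits = fuse(ArrayCNN(), FusionMLP(), torch.randn(48, *ARRAY_SHAPE), X_seq)
    print(logits.shape)                                # torch.Size([48, 24])
```

In this sketch the SVM is fitted first and then frozen while the CNN and MLP would be trained jointly with a cross-entropy loss; the paper may order or couple the three models differently.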

     
