Controlling an Underwater Manipulator via Event-Related Potentials of Brainwaves
YU Jiancheng1, ZHANG Jin1,2, LI Wei1
1. State Key Laboratory of Robotics, Shenyang Institute of Automation, Chinese Academy of Sciences, Shenyang 110016, China;
2. University of Chinese Academy of Sciences, Beijing 100049, China
Abstract: To free both hands of the underwater manipulator operator, brain-computer interface (BCI) technology is applied to underwater operational tasks in this paper, where the manipulator follows instructions decoded from the operator's brainwaves. Because existing brainwave-based manipulator control methods cannot meet the real-time and accuracy requirements of actual underwater tasks, a control strategy is proposed for controlling the underwater manipulator via event-related potentials (ERP) of brainwaves evoked by visual stimuli. By optimizing the visual-evoked ERP interface and combining the characteristics of brainwave control with those of underwater manipulator operation, the operator can complete a given task quickly. Eight subjects are invited to conduct experiments on a self-developed experimental platform. The average accuracy of identifying an operator's intention, the system information transfer rate, and the average control time for completing the task are 91.5%, 27.7 bits/min, and 90.1 s, respectively. Compared with similar systems, the proposed control strategy achieves better system performance, and its operational efficiency satisfies practical operational requirements.
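The information transfer rate quoted above is conventionally computed with the Wolpaw ITR model, which converts selection accuracy and the number of selectable targets into bits per selection. A minimal sketch follows; the number of targets (6) and the per-selection time (~4.3 s) are assumptions for illustration, not values stated in the abstract.

```python
import math

def wolpaw_itr(n_targets: int, accuracy: float, selection_time_s: float) -> float:
    """Information transfer rate in bits/min under the Wolpaw model.

    Assumes accuracy is above chance (1 / n_targets); the two correction
    terms vanish as accuracy approaches 1.
    """
    n, p = n_targets, accuracy
    bits = math.log2(n)
    if p < 1.0:
        bits += p * math.log2(p) + (1 - p) * math.log2((1 - p) / (n - 1))
    # Scale bits-per-selection to bits-per-minute.
    return bits * 60.0 / selection_time_s

# With the paper's reported 91.5% accuracy and hypothetical values of
# 6 targets and ~4.3 s per selection, the model lands near the reported
# 27.7 bits/min.
rate = wolpaw_itr(n_targets=6, accuracy=0.915, selection_time_s=4.3)
```

This illustrates why the metric rewards both fast stimulus presentation and high classification accuracy: halving the selection time doubles the ITR only if accuracy is preserved.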