RGB-D Sensor Based Human Comfortable Following Behavior for Service Robots in Indoor Environments
SUN Yue 1,2, LIU Jingtai 1,2
1. Institute of Robotics and Automatic Information System, Nankai University, Tianjin 300350, China;
2. Tianjin Key Laboratory of Intelligent Robotics, Tianjin 300350, China
SUN Yue, LIU Jingtai. RGB-D Sensor Based Human Comfortable Following Behavior for Service Robots in Indoor Environments. ROBOT, 2019, 41(6): 823-833. DOI: 10.13973/j.cnki.robot.180717.
Abstract: Drawing on scientific research into human behavior and social interaction, a human-following method that takes human comfort into account is proposed, enabling service robots to achieve a friendlier, more effective, and more readily accepted following behavior. Using a highly reliable, low-cost RGB-D (RGB-depth) sensor, a people detection algorithm based on HOG (histogram of oriented gradients) and a people tracking algorithm based on the UKF (unscented Kalman filter) are studied, and real-time, accurate detection and tracking of people are realized in dynamic, unstructured environments. After discussing the factors that affect following comfort, a comfortable following model and a behavior utility function are established, and comfort-aware motion planning and control are realized. The proposed comfort-aware following method is systematically analyzed and evaluated, verifying the validity and usability of the following behavior and further opening the way for friendly human-robot interaction.
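The abstract outlines a detect-then-track perception pipeline (HOG-based people detection feeding a UKF-based tracker) without giving implementation detail. The following Python sketch is purely illustrative and is not the authors' code: it assumes OpenCV's stock HOG pedestrian detector in place of the paper's detector, a constant-velocity motion model for the UKF (the filterpy implementation), an assumed 30 Hz frame rate, illustrative noise values, and a hypothetical helper pixel_to_ground() that back-projects a pixel plus its depth reading to metric ground-plane coordinates.

    # Minimal sketch of a HOG-detection + UKF-tracking pipeline of the kind
    # described in the abstract. All concrete choices below are assumptions,
    # not the paper's method.
    import cv2
    import numpy as np
    from filterpy.kalman import UnscentedKalmanFilter, MerweScaledSigmaPoints

    DT = 1.0 / 30.0  # assumed RGB-D frame rate

    def fx(x, dt):
        """Constant-velocity state transition over x = [px, py, vx, vy]."""
        F = np.array([[1, 0, dt, 0],
                      [0, 1, 0, dt],
                      [0, 0, 1, 0],
                      [0, 0, 0, 1]], dtype=float)
        return F @ x

    def hx(x):
        """Measurement model: only the person's planar position is observed."""
        return x[:2]

    # UKF over a 4-D state with 2-D position measurements.
    points = MerweScaledSigmaPoints(n=4, alpha=0.1, beta=2.0, kappa=0.0)
    ukf = UnscentedKalmanFilter(dim_x=4, dim_z=2, dt=DT, fx=fx, hx=hx, points=points)
    ukf.x = np.zeros(4)
    ukf.R = np.diag([0.05, 0.05])   # measurement noise (m^2), assumed
    ukf.Q = np.eye(4) * 0.01        # process noise, assumed

    # Stock OpenCV HOG pedestrian detector, standing in for the paper's
    # HOG-based detector on the RGB channel.
    hog = cv2.HOGDescriptor()
    hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

    def step(rgb_frame, depth_frame):
        """One detect-predict-update cycle; returns the filtered position."""
        rects, _ = hog.detectMultiScale(rgb_frame, winStride=(8, 8))
        ukf.predict()
        if len(rects) > 0:
            x, y, w, h = rects[0]          # naive: take the first detection
            u, v = x + w // 2, y + h // 2  # center pixel of the bounding box
            z = pixel_to_ground(u, v, depth_frame[v, u])  # hypothetical helper
            ukf.update(z)
        return ukf.x[:2]

The abstract also refers to a comfortable following model and a behavior utility function but does not state their form here. Purely as a generic illustration of how such a utility might be scored (an assumption, not the paper's formulation), the sketch below combines a preferred following distance with a proxemics-style hard penalty for entering the person's personal space; all parameter values are illustrative.

    import numpy as np

    def following_utility(robot_xy, person_xy, d_pref=1.2, sigma=0.4, d_min=0.6):
        """Toy comfort utility: highest when the robot holds the preferred
        following distance d_pref (m); positions inside the personal-space
        radius d_min are strongly penalized. Illustrative values only."""
        d = np.linalg.norm(np.asarray(robot_xy) - np.asarray(person_xy))
        if d < d_min:                # inside personal space: strongly avoided
            return -np.inf
        return float(np.exp(-(d - d_pref) ** 2 / (2.0 * sigma ** 2)))

A planner could evaluate such a utility over candidate robot poses at each control step and choose the motion command that maximizes it; the paper's actual model and utility should be taken from the full text.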