Detecting Static Conversation Group with Common Concern Area for Robot Navigation and Behavior Evaluation
ZHOU Lei1,2, ZHAO Kunxu1,2, SONG Yinuo1,2, LIU Jingtai1,2
1. Institute of Robotics and Automatic Information System, College of Artificial Intelligence, Nankai University, Tianjin 300350, China; 2. Tianjin Key Laboratory of Intelligent Robotics, Tianjin 300350, China
ZHOU Lei, ZHAO Kunxu, SONG Yinuo, LIU Jingtai. Detecting Static Conversation Group with Common Concern Area for Robot Navigation and Behavior Evaluation. ROBOT, 2022, 44(4): 494-503. DOI: 10.13973/j.cnki.robot.210276.
Abstract: An F-formation detection algorithm based on common concern areas is proposed for static conversation groups. It takes the positions and orientations of pedestrians as input to construct the common concern area of each group, and a sliding-window maximum filter is then applied to detect group centers for clustering. After the static conversation groups are detected, a group comfort space is constructed for a time-dependent A* path planning algorithm based on a multi-layer costmap mechanism, so that mobile robot navigation considering group comfort is realized. In addition, it is challenging to quantitatively evaluate whether a robot's navigation behavior is socially acceptable, so a graph convolutional network (GCN) based model for evaluating robot behavior is constructed. Experimental results show that the evaluation network has a capability comparable to humans in evaluating robot behavior, and visualization results demonstrate the rationality of its outputs. According to the evaluation network, robots that consider static conversation groups produce more comfortable trajectories.
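The abstract only outlines the detection pipeline, so the following Python fragment is a minimal sketch of one plausible reading of it: a grid-based "common concern" heat map is accumulated from pedestrian positions and orientations, and a sliding-window maximum filter extracts group-center candidates. The function names (concern_map, group_centers) and all parameters (grid size, resolution res, attention reach, Gaussian spread sigma, window size win, threshold thresh) are illustrative assumptions, not the authors' implementation.

    import numpy as np
    from scipy.ndimage import maximum_filter

    def concern_map(positions, headings, grid=(100, 100), res=0.1, reach=1.2, sigma=0.5):
        # Accumulate a Gaussian "concern" blob at the point each pedestrian is
        # assumed to attend to, i.e. `reach` metres ahead along the heading.
        heat = np.zeros(grid)
        rows, cols = np.mgrid[0:grid[0], 0:grid[1]]
        for (px, py), th in zip(positions, headings):
            cx = px + reach * np.cos(th)
            cy = py + reach * np.sin(th)
            d2 = (cols * res - cx) ** 2 + (rows * res - cy) ** 2
            heat += np.exp(-d2 / (2.0 * sigma ** 2))
        return heat

    def group_centers(heat, win=15, thresh=1.5):
        # Sliding-window maximum filter: a cell is a group-center candidate if it
        # is the maximum within its window and its accumulated concern exceeds a
        # threshold (i.e. several pedestrians' concern areas overlap there).
        peaks = (heat == maximum_filter(heat, size=win)) & (heat > thresh)
        return np.argwhere(peaks)  # (row, col) cell indices; multiply by res for metres

    # Example: two pedestrians 3 m apart facing each other should yield a single
    # shared peak roughly midway between them.
    heat = concern_map([(2.0, 5.0), (5.0, 5.0)], [0.0, np.pi])
    print(group_centers(heat))

Under this reading, pedestrians would then be clustered by assigning each one to the nearest detected center, which matches the abstract's "detect the group center for clustering".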