
Autonomous Social Robot Navigation using a Behavioral Finite State Social Machine

Published online by Cambridge University Press:  05 May 2020

Vaibhav Malviya*, Arun Kumar Reddy and Rahul Kala

Affiliation: Centre of Intelligent Robotics, Indian Institute of Information Technology, Allahabad, Prayagraj, India. E-mails: [email protected], [email protected]

*Corresponding author. E-mail: [email protected]

Summary

We present a robot navigation system based on a Behavioral Finite State Social Machine. The robot operates as a social tour guide that adapts its navigation to the behavior of the visitors. The problem of a robot leading a human group with a limited field-of-view camera is relatively untouched in the literature. Uncertainty arises when the visitors are not visible, and the robot adapts its behavior as a social response. An artificial potential field is used for local planning, and a velocity manager sets the speed inversely proportional to the time for which the visitors have been out of view.
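The ideas in the summary can be sketched in code. The following is an illustrative sketch only, not the paper's implementation: the state names, thresholds, and potential-field gains are assumptions chosen to show how a behavioral state machine, an artificial-potential-field step, and a missing-visitor velocity manager could fit together.

```python
import math

MAX_SPEED = 1.0       # m/s, assumed cruising speed while leading the group
SLOWDOWN_RATE = 0.2   # per-second speed decay while visitors are missing (assumed)

def velocity_manager(missing_time):
    """Speed decreases the longer the visitors have been out of view,
    so the robot slows down to let the group catch up."""
    return max(0.0, MAX_SPEED - SLOWDOWN_RATE * missing_time)

def apf_force(robot, goal, obstacles, k_att=1.0, k_rep=0.5, d0=1.5):
    """One artificial-potential-field step: attraction toward the goal
    plus repulsion from obstacles closer than the influence distance d0."""
    fx = k_att * (goal[0] - robot[0])
    fy = k_att * (goal[1] - robot[1])
    for ox, oy in obstacles:
        dx, dy = robot[0] - ox, robot[1] - oy
        d = math.hypot(dx, dy)
        if 0 < d < d0:
            mag = k_rep * (1.0 / d - 1.0 / d0) / d**2
            fx += mag * dx / d
            fy += mag * dy / d
    return fx, fy

def next_state(visitors_visible, missing_time, wait_limit=5.0):
    """Minimal behavioral transitions (assumed): lead while the visitors
    are visible, slow down when they vanish, stop and wait past a limit."""
    if visitors_visible:
        return "LEAD"
    return "SLOW" if missing_time < wait_limit else "WAIT"
```

In this sketch the velocity manager and the state machine share the same clock: the longer the visitors are missing, the slower the robot moves, until the `WAIT` state halts it entirely as a social response.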

Copyright
Copyright © The Author(s) 2020. Published by Cambridge University Press



Supplementary Material

Malviya et al. supplementary video (38.7 MB)