
Sound source tracking considering obstacle avoidance for a mobile robot

Published online by Cambridge University Press:  18 January 2010

Naoki Uchiyama*
Affiliation:
Department of Mechanical Engineering, Toyohashi University of Technology, 1-1 Hibarigaoka, Tempaku, Toyohashi, Aichi 441-8580, Japan.
Shigenori Sano
Affiliation:
Department of Mechanical Engineering, Toyohashi University of Technology, 1-1 Hibarigaoka, Tempaku, Toyohashi, Aichi 441-8580, Japan.
Akihiro Yamamoto
Affiliation:
Department of Mechanical Engineering, Toyohashi University of Technology, 1-1 Hibarigaoka, Tempaku, Toyohashi, Aichi 441-8580, Japan.
*Corresponding author. E-mail: [email protected]

Summary

Sound source tracking is an important function for autonomous robots, because sound is omni-directional and can be recognized in dark environments. This paper presents a new approach to sound source tracking for mobile robots using auditory sensors. We consider a general type of two-wheeled mobile robot that has wide industrial applications. Because obstacle avoidance is also an indispensable function for autonomous mobile robots, the robot is equipped with distance sensors to detect obstacles in real time. To deal with the robot's nonholonomic constraint and to combine information from the auditory and distance sensors, we propose a model reference control approach in which the robot follows a desired trajectory generated by a reference model. The effectiveness of the proposed method is confirmed by experiments in which the robot is expected to approach a sound source while avoiding obstacles.
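The summary describes steering a two-wheeled, nonholonomic robot toward a sound-source direction while reacting to obstacles detected by distance sensors. As a rough illustration of that general idea only, the sketch below implements a generic unicycle-model controller: it is not the model reference controller developed in the paper, and the gains (k_heading, k_avoid), the safety distance, and the simplified sensor inputs are hypothetical placeholders.

```python
# Illustrative sketch only (not the paper's controller): a unicycle-type robot
# turns toward an estimated sound-source bearing, while a simple repulsive
# term derived from the nearest distance-sensor reading bends the commanded
# turn rate away from close obstacles. All gains are hypothetical.
import math


def wrap_angle(a):
    """Wrap an angle to (-pi, pi]."""
    return math.atan2(math.sin(a), math.cos(a))


def track_step(pose, source_bearing, obstacle_dist, obstacle_bearing,
               k_heading=1.5, k_avoid=0.8, safe_dist=0.5,
               v_nominal=0.2, dt=0.05):
    """
    One control update for a two-wheeled (unicycle) robot.
    pose             : (x, y, theta) in the world frame
    source_bearing   : bearing to the sound source in the robot frame [rad]
    obstacle_dist    : distance to the nearest detected obstacle [m]
    obstacle_bearing : bearing of that obstacle in the robot frame [rad]
    Returns the pose after applying (v, omega) for dt seconds.
    """
    x, y, theta = pose

    # Attractive term: turn toward the sound-source direction.
    heading_error = wrap_angle(source_bearing)

    # Repulsive term: when the obstacle is closer than safe_dist, steer away
    # from its side, more strongly the closer it is.
    avoid = 0.0
    if obstacle_dist < safe_dist:
        side = 1.0 if obstacle_bearing >= 0.0 else -1.0  # obstacle on left -> turn right
        avoid = -k_avoid * (safe_dist - obstacle_dist) * side

    omega = k_heading * heading_error + avoid
    v = v_nominal * max(0.0, math.cos(heading_error))  # slow down when facing away

    # Unicycle kinematics (nonholonomic constraint: no sideways motion).
    x += v * math.cos(theta) * dt
    y += v * math.sin(theta) * dt
    theta = wrap_angle(theta + omega * dt)
    return (x, y, theta)


if __name__ == "__main__":
    pose = (0.0, 0.0, 0.0)
    for _ in range(200):
        # Pretend the source is 30 deg to the left and an obstacle sits 0.4 m ahead.
        pose = track_step(pose, math.radians(30), 0.4, 0.0)
    print(pose)
```

In this toy version, the obstacle term simply perturbs the turn rate; the paper instead generates the desired trajectory through a reference model so that the tracking and avoidance behaviors remain consistent with the nonholonomic constraint.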

Copyright © Cambridge University Press 2010
