
A novel line of sight control system for a robot vision tracking system, using vision feedback and motion-disturbance feedforward compensation

Published online by Cambridge University Press: 12 April 2012

Jaehong Park
Affiliation: LG Display Co., LTD, Gumi, 730-726, Korea

Wonsang Hwang
Affiliation: LG Display Co., LTD, Gumi, 730-726, Korea

Hyunil Kwon
Affiliation: Multimedia R & D Lab., LG Electronics Inc., Seocho-gu, Seoul, 137-130, Korea

Kwangsoo Kim
Affiliation: Department of Control and Instrumentation Engineering, Hanbat National University, Yuseong-gu, Daejeon, 305-719, Korea

Dong-il “Dan” Cho*
Affiliation: ASRI/ISRC/Department of Electrical Engineering and Computer Science, Seoul National University, Kwanak-ku, Seoul, 151-742, Korea

*Corresponding author. E-mail: [email protected]

Summary

This paper presents a novel line of sight control system for a robot vision tracking system, which uses a position feedforward controller to pre-position a camera and a vision feedback controller to compensate for the residual positioning error. Continuous target tracking is an important function for service robots, surveillance robots, and cooperating robot systems. However, it is difficult to track a specific target using only vision information while a robot is in motion, especially when the robot is moving or rotating quickly. The proposed system controls the camera line of sight using a feedforward controller based on estimated robot position and motion information. Specifically, the camera is rotated in the direction opposite to the motion of the robot. To implement the system, a disturbance compensator is developed to determine the current position of the robot, even when the robot wheels slip. The disturbance compensator comprises two extended Kalman filters (EKFs) and a slip detector. Its inputs are data from an accelerometer, a gyroscope, and two wheel encoders. The vision feedback information, which is the targeting error, is used as the measurement update for the two EKFs. Using the output of the disturbance compensator, an actuation module pans the camera to locate the target at the center of the image plane. This line of sight control methodology improves the recognition performance of the vision tracking system by keeping the target image at the center of the image frame. The proposed system is implemented on a two-wheeled robot, and experiments are performed for various robot motion scenarios in dynamic situations to evaluate the tracking and recognition performance. Experimental results show that the proposed system achieves high tracking and recognition performance with a small targeting error.
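To make the control structure concrete, the sketch below illustrates, under stated assumptions, how a feedforward term (derived from the disturbance compensator's robot-motion estimate) and a vision feedback term (derived from the image-plane targeting error) might be combined into a single camera pan command. This is not the authors' implementation; the gains, the pixel-to-angle conversion, and all function and variable names are illustrative assumptions.

```python
# Minimal sketch (not the published implementation) of a combined
# feedforward / vision-feedback line-of-sight controller.
# PIXEL_TO_RAD and K_FB are assumed, illustrative constants.

PIXEL_TO_RAD = 0.001   # assumed conversion from image-plane error (px) to angle (rad)
K_FB = 0.5             # assumed proportional gain for the vision feedback loop


def pan_command(robot_heading_est, target_bearing, target_error_px):
    """Compute a camera pan angle (rad, robot frame).

    robot_heading_est : robot heading estimated by the disturbance
                        compensator (EKFs + slip detector), rad
    target_bearing    : bearing of the target in the world frame, rad
    target_error_px   : horizontal offset of the target from the image
                        center, px (the vision feedback measurement)
    """
    # Feedforward: rotate the camera opposite to the robot's own rotation
    # so the target stays near the image center before a new frame arrives.
    feedforward = target_bearing - robot_heading_est

    # Vision feedback: correct the residual targeting error measured on
    # the image plane.
    feedback = K_FB * PIXEL_TO_RAD * target_error_px

    return feedforward + feedback
```

In such an arrangement, the feedforward term keeps the target near the image center between vision updates, while the lower-rate vision feedback removes the drift that accumulates in the inertial and odometric motion estimate.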

Type: Articles
Copyright: © Cambridge University Press 2012

References

1.Lenderman, M., Experience the Message: How Experiential Marketing Is Changing the Brand World (Carroll & Graf Publishers, New York, 2006).Google Scholar
2.Kragić, D. and Vincze, M., “Vision for robotics,” Found. Trends Robot. 1 (1), 178 (2010).Google Scholar
3.Nadimi, S. and Bhanu, B., “Multistrategy Fusion Using Mixture Model for Moving Object Detection,” Proceedings of the International Conference on Multisensor Fusion and Integration for Intelligent Systems 2001, Germany (Aug. 20–22, 2001) pp. 317322.Google Scholar
4.Jia, Z., Balasuriya, A. and Challa, S., “Vision based target tracking for autonomous land vehicle navigation: A brief survey,” Recent Patents Comput. Sci. 2, 3242 (2009).Google Scholar
5.Pinto, N., Cox, D. D. and DiCarlo, J. J., “Why is real-world visual object recognition hard?,” PLoS Comp. Bio. 4 (1), 151156 (2004).Google Scholar
6.Kragi, D.ć and Christensen, H. H., “Cue integration for visual servoing,” IEEE Trans. Robot. Autom. 17 (1), 1827 (2001).CrossRefGoogle Scholar
7.Cai, Y., Freitas, N. de and Little, J. J., “Robust Visual Tracking for Multiple Targets,” Proceedings of the European Conference Computer Vision, Austria (May 7–13, 2006) pp. 107118.Google Scholar
8.Nillius, P., Sullivan, J. and Carlsson, S., “Multi-Target Tracking - Linking Identities Using Baysian Network Inference,” Proceedings of the 2006 IEEE Computer Vision and Pattern Recognition (CVPR 2006), Wasington, DC (Jun. 17–22, 2006) pp. 21872194.Google Scholar
9.Jung, B. and Sukhatme, G. S., “Real-time motion tracking from a mobile robot,” Int. J. Soc. Robot., 2 (1), 6378 (2010).CrossRefGoogle Scholar
10.Dai, M., Raphan, T. and Cohen, B., “Adaptation of the angular vestibulo-ocular reflex to head movements in rotating frames of reference,” Exp. Brain Res. 195 (4), 553567 (2009).CrossRefGoogle ScholarPubMed
11.Hwangbo, M., Kim, J.-S. and Kanade, T., “Inertial-Aided KLT Feature Tracking for a Moving Camera,” Proceedings of the 2009 IEEE/RSJ International Conference Intelligent Robots and Systems, St. Louis, MO (Oct. 10–15, 2009) pp. 19091916.Google Scholar
12.Jia, Z., Balasuriya, A. and Challa, S., “Sensor fusion-based visual target tracking for autonomous vehicles with the out-of-sequence measurements solution,” Robot. Auton. Syst. 56 (2), 57176 (2006).Google Scholar
13.Ribo, M., Bradner, M. and Pinz, A., “A flexible software architecture for hybrid tracking,” J. Robot. Syst. 21 (2), 5363 (2004).CrossRefGoogle Scholar
14.Xu, D. and Li, Y. F., “A New Pose Estimation Method based on Inertial and Visual Sensors for Autonomous Robots,” Proceedings of the IEEE International Conference on Robotics and Biomimetics, Sanya, China (Dec. 15–18, 2007) pp. 405410.Google Scholar
15.Lobo, L. and Dias, J., “Inertial sensed ego-motion for 3D vision,” J. Robot. Syst., 21 (1), 312 (2004).CrossRefGoogle Scholar
16.Shibata, T. and Schaal, S., “Biomimetic gaze stabilization based on feedback-error-learning with nonparametric regression networks,” Neural Netw. 14, 201216 (2007).CrossRefGoogle Scholar
17.Xie, S., Luo, J., Gong, Z., Ding, W., Zou, H. and Fu, X., “Biomimetic Control of Pan-tilt-zoom Camera for Visual Tracking Based-on an Autonomous Helicopter,” Proceedings of the 2007 IEEE/RSJ International Conference Intelligent Robots and Systems, San Diego, CA (Oct. 29 – Nov. 2, 2007) pp. 21382143.Google Scholar
18.Lenz, A., Balakrishnan, T., Pipe, A. G. and Melhuish, C., “An adaptive gaze stabilization controller inspired by the vestibulo-ocular reflex,” Bioinsp. Biomim. 3, 111 (2008).CrossRefGoogle ScholarPubMed
19.Ouh, H. K., Bahn, W., Park, J., Hwang, W. S., Kwon, H. I., Kim, K. S. and Cho, D. I., “VOR Based Target Tracking System with an Accelerometer for Mobile Robots,” Proceedings of the 6th International Conferenceon Ubiquitous Robots and Ambient Intelligence, Gwangju, Korea (Oct. 28–31, 2009) pp. 133137.Google Scholar
20.Dieringer, N., “Vestibular system,” In: Encyclopedia of Life Sciences (John Wiley & Sons, Hoboken, NJ, 2006).Google Scholar
21.Lim, J. S., Lee, T. H., Ahn, T. D., Yoo, K. H., Lee, A. R. and Cho, D. I., “Sensor Fusion for Mobile Robot Navigation Using Accelerometers and Odometer,” Proceedings of the 13th International Conferenceon Advanced Robotics, Jeju, Korea (Aug. 21–24, 2007) pp. 343348.Google Scholar
22.Mazl, R. and Preucil, L., “Vehicle localization using inertial sensors and GPS,” Springer Tracts Adv. Robot. 24, 135144 (2006).Google Scholar
23.Oppenheim, A, V. and Schafer, R. W., Discrete-Time Signal Processing, 3rd Ed. (Pearson Prentice Hall, Upper Saddle River, NJ, 2010).Google Scholar
24.Tsai, R. Y., “A versatile camera calibration technique for high accuracy 3D machine vision metrology using off-the-shelf TV cameras lenses,” IEEE Trans. Robot. Autom. 3 (2), 323344 (1987).CrossRefGoogle Scholar
25.Bay, H., Tuytelaars, T. and Gool, L. V., “Speeded-up robust features (SURF),” Comput. Vis. Image Underst. 110 (3), 346359 (2008).CrossRefGoogle Scholar
26.Alessandri, A., Bartolini, G., Pavanati, P., Punta, E. and Vinci, A., “An Application of the Extended Kalman Filter for Integrated Navigation in Mobile Robotics,” Proceedings of the American Control Conference 1997, Albuquerque, NM, (Jun. 4–6, 1997) pp. 527531.Google Scholar
27.Park, J., Hwang, W., Kwon, H., Kim, J., Lee, C., Anjum, M. L., Kim, K. and Cho, D., “High Performance Vision Tracking System for Mobile Robot Using Sensor Data Fusion with Kalman Filter,” Proceedings of the 2010 IEEE/RSJ International Conference on Intelligent Robots and Systems, Taipei, Taiwan (Oct. 18–22, 2010), pp. 37783783.Google Scholar
28.Ojeda, L., Reina, G., Cruz, D. and Borenstein, J., “The FLEXnav precision dead-reckoning system,” Int. J. Veh. Auton. Syst. 4 (2–4), 173195 (2008).CrossRefGoogle Scholar
29.Simon, D., Optimal State Estimation: Kalman, H Infinity, and Nonlinear Approaches (John Wiley & Sons, Hoboken, NJ, 2006).CrossRefGoogle Scholar
30.Armesto, L., Tornero, J. and Vincze, M., “On multi-rate fusion for non-linear sampled-data systems: Application to a 6D tracking system,” Robot. Auton. Syst. 56 (8), 706715 (2008).CrossRefGoogle Scholar
31.Ojeda, L., Reina, G. and Borenstein, J., “Experimental Results from FLEXnav: An Expert Rule-Based Dead-Reckoning System for Mars Rovers,” Proceedings of the 2004 IEEE Aerospace Conference, Big Sky, MT, USA (Mar. 6–13, 2004) pp. 816825.Google Scholar
32.Mourikis, A. I., Roumeliotis, S. I. and Burdick, J. W., “SC-KF mobile robot localization: a stochastic cloning kalman filter for processing relative-state measurements,” IEEE Trans. Robot. 52 (10), 18131825 (2007).Google Scholar
33.Seyr, M. and Jakubek, S., “Proprioceptive Navigation, Slip Estimation and Slip Control for Autonomous Wheeled Mobile Robots,” Proceedings of the 2006 IEEE International Conference on Robotics, Automation and Mechatronics, Bangkok, Thailand (Jun. 7–9, 2006) pp. 16.Google Scholar