
Active camera stabilization to enhance the vision of agile legged robots

Published online by Cambridge University Press: 17 November 2015

Stéphane Bazeille*
Affiliation:
Department of Advanced Robotics, Istituto Italiano di Tecnologia, via Morego, 30, 16163 Genova, Italy. E-mails: [email protected], [email protected], [email protected], [email protected]
Jesus Ortiz
Affiliation:
Department of Advanced Robotics, Istituto Italiano di Tecnologia, via Morego, 30, 16163 Genova, Italy. E-mails: [email protected], [email protected], [email protected], [email protected]
Francesco Rovida
Affiliation:
Robotics, Vision and Machine Intelligence Lab, Aalborg University Copenhagen, A.C. Meyers Vænge 15, DK-2450 Copenhagen, Denmark. E-mail: [email protected]
Marco Camurri
Affiliation:
Department of Advanced Robotics, Istituto Italiano di Tecnologia, via Morego, 30, 16163 Genova, Italy. E-mails: [email protected], [email protected], [email protected], [email protected]
Anis Meguenani
Affiliation:
Institut des Systèmes Intelligents et de Robotique (ISIR), Université Pierre et Marie Curie, Pyramide - Tour 55, 4 Place Jussieu, 75005 Paris, France. E-mail: [email protected]
Darwin G. Caldwell
Affiliation:
Department of Advanced Robotics, Istituto Italiano di Tecnologia, via Morego, 30, 16163 Genova, Italy. E-mails: [email protected], [email protected], [email protected], [email protected]
Claudio Semini
Affiliation:
Department of Advanced Robotics, Istituto Italiano di Tecnologia, via Morego, 30, 16163 Genova, Italy. E-mails: [email protected], [email protected], [email protected], [email protected]
*Corresponding author. E-mail: [email protected]

Summary

Legged robots have the potential to navigate more challenging terrain than wheeled robots. Their control, however, is more demanding: besides the common tasks of mapping and path planning, they must handle issues specific to legged locomotion, such as balancing and foothold planning. In this paper, we present the development and integration of a stabilized vision system on the fully torque-controlled, hydraulically actuated quadruped robot (HyQ). The active head added onto the robot is composed of a fast pan and tilt unit (PTU) and a high-resolution, wide-angle stereo camera. The PTU enables shifting the camera gaze towards a specific area of the environment (both to extend and to refine the map) or tracking an object while navigating. Moreover, since quadrupedal locomotion induces strong periodic vibrations, impacts, and slippage on rough terrain, we took advantage of the PTU to mechanically compensate for the robot's motions. We demonstrate the influence of legged locomotion on the quality of the visual data stream through a detailed study of HyQ's motions, compared against those of a rough-terrain wheeled robot of the same size. Our proposed Inertial Measurement Unit (IMU)-based controller decouples the camera from the robot's motions. We show through experiments that stabilizing the image feedback improves the onboard vision-based processes of tracking and mapping. In particular, during outdoor tests on the quadruped robot, our camera stabilization system improved the accuracy of the 3D maps by 25% and reduced mapping failures by 50%.
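The controller design itself is detailed in the full text; as a rough illustration of the idea summarized above, the following minimal Python sketch shows what an IMU-driven pan/tilt compensation loop could look like. The imu and ptu objects, their methods (read_rpy, command), and the limits and loop rate are hypothetical placeholders, not the authors' actual interface or parameters.

    import time

    # Gaze-stabilization sketch. The `imu` and `ptu` drivers below are
    # hypothetical placeholders; the joint limits and loop rate are
    # assumptions, not values taken from the paper.
    PAN_LIMIT = 2.7    # rad, assumed symmetric pan range of the PTU
    TILT_LIMIT = 0.5   # rad, assumed symmetric tilt range of the PTU
    RATE_HZ = 100.0    # assumed control-loop frequency

    def clamp(value, limit):
        # Saturate a command to the symmetric joint limit.
        return max(-limit, min(limit, value))

    def stabilize_gaze(imu, ptu, pan_ref=0.0, tilt_ref=0.0):
        # Counter-rotate the camera against the trunk motion measured by
        # the IMU so the gaze stays locked on (pan_ref, tilt_ref) in the
        # world frame; roll cannot be absorbed by a 2-DOF pan/tilt head.
        while True:
            roll, pitch, yaw = imu.read_rpy()               # body orientation (rad)
            pan_cmd = clamp(pan_ref - yaw, PAN_LIMIT)       # cancel body yaw
            tilt_cmd = clamp(tilt_ref - pitch, TILT_LIMIT)  # cancel body pitch
            ptu.command(pan_cmd, tilt_cmd)                  # drive the head
            time.sleep(1.0 / RATE_HZ)

Tracking a moving target instead of a fixed heading would amount to updating pan_ref and tilt_ref from the vision pipeline inside the same loop.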

Type
Articles
Copyright
Copyright © Cambridge University Press 2015 

