
An autonomous localisation method of wall-climbing robots on large steel components with IMU and fixed RGB-D camera

Published online by Cambridge University Press: 10 February 2025

Wen Zhang, Tianzhi Huang and Zhenguo Sun*
Affiliation: Department of Mechanical Engineering, Tsinghua University, Beijing, China
*Corresponding author: Zhenguo Sun; Email: [email protected]

Abstract

Wall-climbing robots adhere magnetically to large steel components, which limits the use of wireless sensors and magnetometers. This study proposes a novel autonomous localisation method (RGBD-IMU-AL) that combines an inertial measurement unit with a fixed RGB-D camera to improve the localisation performance of wall-climbing robots. The method comprises five modules: calibration, tracking, three-dimensional (3D) reconstruction, location and attitude estimation. The calibration module obtains the initial attitude angle. The tracking and 3D reconstruction modules are used jointly to obtain the rough position and the normal vector of the robot chassis. In the location module, a normal vector projection method is established to select the top point on the robot shell. An extended Kalman filter (EKF) estimates the heading angle in the attitude estimation module. Experimental results show that the positioning error is within 0⋅02 m and the positioning performance is better than that of the MS3D method, while the heading angle error remains within 3⋅1°. These results demonstrate the method's applicability to autonomous localisation in low-texture and magnetically disturbed environments.
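As a minimal illustrative sketch (not the authors' implementation), the heading-angle estimation described above can be reduced, for a single yaw state, to the Python fragment below. A gyroscope yaw rate drives the prediction step and a camera-derived heading fix drives the update step; the noise parameters q and r, the sample period dt and the class name HeadingEKF are assumptions for illustration only. With a scalar linear model the Jacobians are 1, so the EKF reduces to a scalar Kalman filter.

import numpy as np

def wrap(angle):
    # Wrap an angle to [-pi, pi) so innovations never jump by 2*pi.
    return (angle + np.pi) % (2.0 * np.pi) - np.pi

class HeadingEKF:
    def __init__(self, psi0=0.0, p0=1.0, q=1e-4, r=1e-2):
        self.psi = psi0  # heading estimate (rad)
        self.p = p0      # estimate variance
        self.q = q       # process noise from gyro integration (assumed)
        self.r = r       # measurement noise of the heading fix (assumed)

    def predict(self, gyro_z, dt):
        # Propagate heading with the IMU yaw rate: psi <- psi + omega_z * dt.
        self.psi = wrap(self.psi + gyro_z * dt)
        self.p += self.q

    def update(self, psi_meas):
        # Scalar Kalman update with an angle-wrapped innovation.
        y = wrap(psi_meas - self.psi)
        k = self.p / (self.p + self.r)
        self.psi = wrap(self.psi + k * y)
        self.p *= 1.0 - k

ekf = HeadingEKF()
ekf.predict(gyro_z=0.05, dt=0.02)  # one 50 Hz IMU sample
ekf.update(psi_meas=0.002)         # heading fix from the vision pipeline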

Type
Research Article
Copyright
Copyright © The Author(s), 2025. Published by Cambridge University Press on behalf of The Royal Institute of Navigation

References

Bassiri, A., Asghari Oskoei, M. and Basiri, A. (2018). Particle filter and finite impulse response filter fusion and Hector SLAM to improve the performance of robot positioning. Journal of Robotics, 2018, 7806854. doi:10.1155/2018/7806854
Brossard, M. and Bonnabel, S. (2019). Learning Wheel Odometry and IMU Errors for Localization. Proceedings of 2019 International Conference on Robotics and Automation, IEEE, 291–297. Montreal, QC, Canada.
Engel, J., Koltun, V. and Cremers, D. (2017). Direct sparse odometry. IEEE Transactions on Pattern Analysis and Machine Intelligence, 40(3), 611–625.
Faessler, M., Mueggler, E., Schwabe, K. and Scaramuzza, D. (2014). A Monocular Pose Estimation System Based on Infrared LEDs. Proceedings of 2014 IEEE International Conference on Robotics and Automation, 907–913. Hong Kong, China.
Forster, C., Pizzoli, M. and Scaramuzza, D. (2014). SVO: Fast Semi-Direct Monocular Visual Odometry. Proceedings of 2014 IEEE International Conference on Robotics and Automation, 15–22. Hong Kong, China.
Fu, Q., Yu, H., Lai, L., Wang, J., Peng, X., Sun, W. and Sun, M. (2019). A robust RGB-D SLAM system with points and lines for low texture indoor environments. IEEE Sensors Journal, 19(21), 9908–9920.
Furtado, J. S., Liu, H. H., Lai, G., Lacheray, H. and Desouza-Coelho, J. (2019). Comparative Analysis of OptiTrack Motion Capture Systems. Proceedings of Advances in Motion Sensing and Control for Robotic Applications, Springer, 15–31. Toronto, Canada.
Gomez-Ojeda, R., Moreno, F. A., Zuniga-Noël, D., Scaramuzza, D. and Gonzalez-Jimenez, J. (2019). PL-SLAM: A stereo SLAM system through the combination of points and line segments. IEEE Transactions on Robotics, 35(3), 734–746.
Gunatilake, A., Thiyagarajan, K. and Kodagoda, S. (2021). Evaluation of Battery-Free UHF-RFID Sensor Wireless Signals for in-Pipe Robotic Applications. Proceedings of 2021 IEEE Sensors, IEEE, 1–4. Sydney, Australia.
Gunatilake, A., Kodagoda, S. and Thiyagarajan, K. (2022). A novel UHF-RFID dual antenna signals combined with Gaussian process and particle filter for in-pipe robot localization. IEEE Robotics and Automation Letters, 7(3), 6005–6011.
He, Y., Sun, W., Huang, H., Liu, J., Fan, H. and Sun, J. (2020). PVN3D: A Deep Point-Wise 3D Keypoints Voting Network for 6DoF Pose Estimation. Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, 11632–11641. Seattle, WA, US.
Henriques, J. F., Caseiro, R., Martins, P. and Batista, J. (2014). High-speed tracking with kernelized correlation filters. IEEE Transactions on Pattern Analysis and Machine Intelligence, 37(3), 583–596.
Kim, Y., An, J. and Lee, J. (2017). Robust navigational system for a transporter using GPS/INS fusion. IEEE Transactions on Industrial Electronics, 65(4), 3346–3354.
Merriaux, P., Dupuis, Y., Boutteau, R., Vasseur, P. and Savatier, X. (2017). A study of Vicon system positioning performance. Sensors, 17(7), 1591–1608.
Mur-Artal, R. and Tardós, J. D. (2017). ORB-SLAM2: An open-source SLAM system for monocular, stereo, and RGB-D cameras. IEEE Transactions on Robotics, 33(5), 1255–1262.
Newcombe, R. A., Izadi, S., Hilliges, O. and Molyneaux, D. (2011). KinectFusion: Real-Time Dense Surface Mapping and Tracking. Proceedings of 2011 10th IEEE International Symposium on Mixed and Augmented Reality, 127–136. Basel, Switzerland.
Qi, C. R., Liu, W., Wu, C., Su, H. and Guibas, L. J. (2018). Frustum PointNets for 3D Object Detection from RGB-D Data. Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, 918–927. Salt Lake City, UT, US.
Qin, T., Li, P. and Shen, S. (2018). VINS-Mono: A robust and versatile monocular visual-inertial state estimator. IEEE Transactions on Robotics, 34(4), 1004–1020.
Redmon, J. and Farhadi, A. (2018). YOLOv3: An incremental improvement. arXiv:1804.02767. https://arxiv.org/abs/1804.02767. Accessed 8 April 2018.
Rijalusalam, D. U. and Iswanto, I. (2021). Implementation kinematics modeling and odometry of four omni wheel mobile robot on the trajectory planning and motion control based microcontroller. Journal of Robotics and Control, 2(5), 448–455.
Romero-Ramirez, F. J., Muñoz-Salinas, R. and Medina-Carnicer, R. (2018). Speeded up detection of squared fiducial markers. Image and Vision Computing, 76, 38–47.
Shukla, A. and Karki, H. (2016). Application of robotics in offshore oil and gas industry—a review part II. Robotics and Autonomous Systems, 75, 508–524.
Su, Y., Wang, T., Shao, S., Yao, C. and Wang, Z. (2021). GR-LOAM: LiDAR-based sensor fusion SLAM for ground robots on complex terrain. Robotics and Autonomous Systems, 140, 103759.
Tharwat, A. (2016). Principal component analysis: An overview. International Journal of Applied Pattern Recognition, 3(3), 197–240.
Valenti, R. G., Dryanovski, I. and Xiao, J. (2015). Keeping a good attitude: A quaternion-based orientation filter for IMUs and MARGs. Sensors, 15(8), 19302–19330.
Venkatnarayan, R. H. and Shahzad, M. (2019). Enhancing indoor inertial odometry with WiFi. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, 3(2), 1–27.
Wang, Y. and Rajamani, R. (2018). Direction cosine matrix estimation with an inertial measurement unit. Mechanical Systems and Signal Processing, 109, 268–284.
Wang, C., Xu, D., Zhu, Y., Martín-Martín, R., Lu, C., Fei-Fei, L. and Savarese, S. (2019). DenseFusion: 6D Object Pose Estimation by Iterative Dense Fusion. Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, 3343–3352. Long Beach, CA, US.
Whelan, T., Salas-Moreno, R. F., Glocker, B., Davison, A. J. and Leutenegger, S. (2016). ElasticFusion: Real-time dense SLAM and light source estimation. The International Journal of Robotics Research, 35(14), 1697–1716.
Wiedemeyer, T. (2015). IAI Kinect2. Institute for Artificial Intelligence, University Bremen. https://github.com/code-iai/iai_kinect2. Accessed 12 June 2015.
Yang, L., Li, B., Yang, G., Chang, Y., Liu, Z., Jiang, B. and Xiao, J. (2019). Deep Neural Network Based Visual Inspection with 3D Metric Measurement of Concrete Defects Using Wall-Climbing Robot. Proceedings of 2019 IEEE/RSJ International Conference on Intelligent Robots and Systems, 2849–2854. Macau, China.
Zhang, G., Shi, Y., Gu, Y. and Fan, D. (2017). Welding torch attitude-based study of human welder interactive behavior with weld pool in GTAW. Robotics and Computer-Integrated Manufacturing, 48, 145–156.
Zhao, Y. and Menegatti, E. (2018). MS3D: Mean-Shift Object Tracking Boosted by Joint Back Projection of Color and Depth. Proceedings of International Conference on Intelligent Autonomous Systems, Springer, Cham, 222–236. Baden-Baden, Germany.
Zhou, H., Yao, Z. and Lu, M. (2020). UWB/LiDAR coordinate matching method with anti-degeneration capability. IEEE Sensors Journal, 21(3), 3344–3352.