
DV-LIO: LiDAR-inertial Odometry based on dynamic merging and smoothing voxel

Published online by Cambridge University Press:  02 April 2025

Chenyu Shen
Affiliation:
Institute of Robotics and Automatic Information System, College of Artificial Intelligence, Nankai University, Tianjin, China
Wanbiao Lin
Affiliation:
Shenzhen Research Institute of Nankai University, Shenzhen, China
Siyang Sun
Affiliation:
Institute of Robotics and Automatic Information System, College of Artificial Intelligence, Nankai University, Tianjin, China
Wenlan Ouyang
Affiliation:
Institute of Robotics and Automatic Information System, College of Artificial Intelligence, Nankai University, Tianjin, China
Bohan Shi
Affiliation:
Institute of Robotics and Automatic Information System, College of Artificial Intelligence, Nankai University, Tianjin, China
Lei Sun*
Affiliation:
Institute of Robotics and Automatic Information System, College of Artificial Intelligence, Nankai University, Tianjin, China
*
Corresponding author: Lei Sun; Email: [email protected]

Abstract

This paper proposes DV-LIO, a LiDAR-inertial odometry (LIO) system based on dynamic voxel merging and smoothing. In this approach, a local map management mechanism based on feature distribution is introduced, which unifies the features of similar adjacent voxels through dynamic merging and segmentation, thereby improving the perceptual consistency of environmental features. Moreover, a novel noise detector is designed that performs noise detection and incremental filtering by evaluating the consistency of voxel features, further reducing local map noise and improving mapping accuracy while preserving real-time performance. Meanwhile, to maintain the computational efficiency of the LIO system, a point cache is set for each voxel, allowing each voxel to be updated incrementally and intermittently. The proposed method is extensively evaluated on datasets gathered in various environments, including campuses, parks, and unstructured gardens.
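The per-voxel point cache and the feature-consistency merge test described in the abstract can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation: the class `Voxel`, the function `should_merge`, the cache size, the running mean/covariance update, and the plane-normal angle criterion are all assumptions introduced here for clarity.

```python
import numpy as np


class Voxel:
    """Hypothetical voxel with a point cache: incoming points are
    buffered and the voxel statistics are refreshed only when the
    cache fills, so updates are incremental and intermittent."""

    def __init__(self, cache_size=20):
        self.cache = []            # buffered points awaiting integration
        self.cache_size = cache_size
        self.n = 0                 # number of points integrated so far
        self.mean = np.zeros(3)
        self.cov = np.zeros((3, 3))

    def insert(self, p):
        # New points only touch the cache; the (more expensive)
        # statistics update is amortized over cache_size insertions.
        self.cache.append(np.asarray(p, dtype=float))
        if len(self.cache) >= self.cache_size:
            self._flush()

    def _flush(self):
        pts = np.stack(self.cache)
        m = len(pts)
        batch_mean = pts.mean(axis=0)
        total = self.n + m
        delta = batch_mean - self.mean
        new_mean = self.mean + delta * (m / total)
        # Fold the batch scatter into the running covariance
        # (standard parallel/streaming covariance combination).
        batch_cov = np.cov(pts.T, bias=True) if m > 1 else np.zeros((3, 3))
        self.cov = (self.n * self.cov + m * batch_cov
                    + np.outer(delta, delta) * self.n * m / total) / total
        self.mean, self.n = new_mean, total
        self.cache.clear()

    def normal(self):
        # The eigenvector of the smallest eigenvalue approximates the
        # plane normal of the points stored in this voxel.
        w, v = np.linalg.eigh(self.cov)
        return v[:, 0], w


def should_merge(a, b, angle_thresh_deg=10.0):
    """Toy consistency check: two adjacent voxels whose plane normals
    agree within a threshold are merge candidates. The angle test is an
    assumed stand-in for the paper's feature-distribution criterion."""
    na, _ = a.normal()
    nb, _ = b.normal()
    cos = abs(float(na @ nb))
    return cos >= np.cos(np.radians(angle_thresh_deg))
```

In this sketch, two voxels holding coplanar patches of the same wall would pass `should_merge` and could share one set of plane statistics, while a voxel on a perpendicular surface would not.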

Type
Research Article
Copyright
© The Author(s), 2025. Published by Cambridge University Press

