SLI-SLAM: Autonomous Navigation and Accurate Mapping for Quadruped Robots in Complex Environments Using LiDAR, Stereo Camera, and IMU Fusion
The proposed algorithm, SLI-SLAM (Stereo Camera-LiDAR-Inertial Measurement Unit Fusion SLAM), is a multi-sensor fusion framework for autonomous navigation of quadruped robots. It combines stereo cameras, LiDAR, and an IMU within a factor-graph framework for simultaneous localization and mapping (SLAM), incorporating three types of odometry (LiDAR, visual, and IMU) and fusing their measurements to improve system performance. The algorithm enables autonomous perception, navigation, obstacle avoidance, and path planning. During autonomous navigation, the robot continuously perceives its environment, updates its position estimate, and dynamically adjusts the planned path to navigate safely around obstacles. In combination with the NDT and GMapping algorithms, SLI-SLAM supports real-time map updates, allowing adaptation to environmental changes. The algorithm has been extensively validated on the popular KITTI dataset and in real-world environments; comparative analysis against other SLAM approaches in complex scenarios demonstrates its accuracy and robustness. Despite implementation challenges such as sensor calibration, synchronization, noise, and computational complexity, SLI-SLAM has been successfully applied to quadruped robots and extensively tested, confirming its effectiveness and practicality. The algorithm enhances robustness, real-time performance, and the integration of environmental information, yielding accurate localization and mapping in complex scenarios.
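The factor-graph fusion at the heart of this approach can be illustrated with a minimal sketch. The example below is a hypothetical construction using GTSAM's Python bindings, not the paper's implementation: each odometry source (LiDAR, visual, IMU) contributes its own relative-pose constraint between consecutive poses, and the optimizer reconciles the three according to per-sensor noise models. The noise values are illustrative assumptions, and the IMU term is simplified to a between factor rather than a full preintegrated IMU factor.

```python
# Hypothetical factor-graph fusion sketch (GTSAM Python bindings).
# Assumptions: illustrative noise levels; IMU modeled as a simple
# between factor instead of a preintegrated IMU factor.
import numpy as np
import gtsam
from gtsam.symbol_shorthand import X  # pose keys x0, x1, ...

graph = gtsam.NonlinearFactorGraph()

# Anchor the first pose with a tight prior.
prior_noise = gtsam.noiseModel.Diagonal.Sigmas(np.full(6, 1e-3))
graph.add(gtsam.PriorFactorPose3(X(0), gtsam.Pose3(), prior_noise))

# Per-sensor noise models (illustrative, not from the paper):
# LiDAR odometry trusted most, then IMU, then visual odometry.
lidar_noise = gtsam.noiseModel.Diagonal.Sigmas(np.full(6, 0.05))
imu_noise = gtsam.noiseModel.Diagonal.Sigmas(np.full(6, 0.10))
visual_noise = gtsam.noiseModel.Diagonal.Sigmas(np.full(6, 0.20))

def add_fused_odometry(i, lidar_dT, visual_dT, imu_dT):
    """Each sensor supplies its own estimate of the motion from pose i
    to pose i+1; optimization reconciles the three constraints."""
    graph.add(gtsam.BetweenFactorPose3(X(i), X(i + 1), lidar_dT, lidar_noise))
    graph.add(gtsam.BetweenFactorPose3(X(i), X(i + 1), visual_dT, visual_noise))
    graph.add(gtsam.BetweenFactorPose3(X(i), X(i + 1), imu_dT, imu_noise))

# Toy data: the three sensors report slightly different 1 m forward steps.
def step(dx):
    return gtsam.Pose3(gtsam.Rot3(), gtsam.Point3(dx, 0.0, 0.0))

for i in range(3):
    add_fused_odometry(i, step(1.00), step(0.97), step(1.02))

# Initial guess for all four poses, then batch optimization.
initial = gtsam.Values()
for i in range(4):
    initial.insert(X(i), gtsam.Pose3(gtsam.Rot3(), gtsam.Point3(float(i), 0.0, 0.0)))

result = gtsam.LevenbergMarquardtOptimizer(graph, initial).optimize()
print(result.atPose3(X(3)).translation())  # fused final pose estimate
```

In this formulation, weighting happens implicitly: the noise model attached to each factor determines how strongly that sensor's relative-pose estimate pulls on the solution, so a degraded sensor can be down-weighted without changing the graph structure.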
Jan 1, 2024