VIRAL-Fusion: A Visual-Inertial-Ranging-Lidar Sensor Fusion Approach
We investigate a comprehensive sensor fusion scheme that integrates an extensive set of sensor data, namely stereo camera images, 6-DoF IMU measurements, body-offset range measurements, and multiple lidar odometry outputs. An optimization framework is developed to estimate the robot's position, orientation, and velocity states from IMU preintegration, attitude-coupled UWB ranges, and Onboard Self-Localization observations. Thanks to the body-offset ranging technique, the VIRAL method can provide globally referenced localization in both position and orientation. We demonstrate the effectiveness of the method via a wide range of experiments on public datasets, a high-fidelity graphical-physical simulator, and real-world data.
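To see why body-offset ranging makes orientation globally observable, consider a simplified sketch of a single range factor; the frame notation below is illustrative and not necessarily the paper's:

$$
\tilde{d}_k = \left\lVert \, {}^{W}\mathbf{p}_{B} + {}^{W}\mathbf{R}_{B}\,{}^{B}\mathbf{t}_{u} - {}^{W}\mathbf{p}_{a} \, \right\rVert + \eta_k,
$$

where ${}^{W}\mathbf{p}_{B}$ and ${}^{W}\mathbf{R}_{B}$ are the robot's position and orientation in the world frame, ${}^{B}\mathbf{t}_{u}$ is the known body-frame offset of the UWB antenna, ${}^{W}\mathbf{p}_{a}$ is the anchor position, and $\eta_k$ is measurement noise. Because the rotation ${}^{W}\mathbf{R}_{B}$ enters the measurement through the non-zero antenna offset, the range residuals constrain attitude as well as position, which is what allows the estimator to output globally referenced rotational states rather than only positional ones.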