Evaluating Multisensor-aided Inertial Navigation System (MINS)
1. Introduction
MINS is a multisensor fusion SLAM system capable of fusing IMU, camera, LiDAR, GPS, and wheel-encoder measurements. My goal was to evaluate it on the
KAIST Urban dataset as well as my lab's (at the University of Washington) suburban rover dataset. I used
kaist2bag to convert the separate per-sensor bag files into a single bag.
2. Sensor Calibration
I had to calibrate and configure the system to work with the sensors in the KAIST setup and the rover setup respectively. The main calibration parameters I had to configure were:
- Transformation matrices between sensors: T_imu_camera, T_imu_wheel, T_imu_lidar
- Camera intrinsics, resolution
- IMU accelerometer noise and bias, gyroscope noise and bias
- Wheel intrinsics (radius, wheel base)
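As a rough illustration, the calibration parameters above end up in a YAML configuration along the following lines. The field names and all numeric values here are placeholders of my own choosing, not MINS's actual config schema:

```yaml
# Illustrative calibration sketch; field names and values are placeholders
cam0:
  resolution: [1280, 560]
  intrinsics: [800.0, 800.0, 640.0, 280.0]   # fx, fy, cx, cy (placeholder)
  T_imu_cam:                                  # 4x4 extrinsic, IMU -> camera
    [1.0, 0.0, 0.0, 0.05,
     0.0, 1.0, 0.0, 0.00,
     0.0, 0.0, 1.0, 0.10,
     0.0, 0.0, 0.0, 1.00]
imu:
  accel_noise: 1.0e-2   # accelerometer noise density (placeholder)
  accel_bias:  1.0e-4   # accelerometer bias random walk (placeholder)
  gyro_noise:  1.0e-3   # gyroscope noise density (placeholder)
  gyro_bias:   1.0e-5   # gyroscope bias random walk (placeholder)
wheel:
  radius: 0.25          # wheel radius in meters (placeholder)
  base:   0.60          # wheel base in meters (placeholder)
```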
Additionally, I had to install the segway_msgs and gnss_comm ROS message packages so our system could correctly parse messages from our rover's wheel encoder and GPS.
3. Evaluation
MINS already had evaluation code that worked with minimal tuning and setup. Absolute Trajectory Error (ATE), defined below, was used to evaluate the SLAM trajectories. I achieved a 9.12 m ATE on an 11 km KAIST Urban trajectory and a 1.1 m ATE on our 1 km lab rover trajectory.

$$ ATE_{RMSE} = \sqrt{ \frac{1}{N} \sum_{i=1}^{N} \left\| P_{i}^{est} - P_{i}^{gt} \right\|^2 } $$
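The formula above is straightforward to compute once the estimated and ground-truth trajectories are time-aligned. The sketch below is my own minimal illustration, not MINS's evaluation code; real ATE pipelines typically also perform a rigid (e.g. SE(3)/Umeyama) alignment between the two trajectories before computing the error:

```python
import numpy as np

def ate_rmse(est, gt):
    """Root-mean-square Absolute Trajectory Error between time-aligned
    estimated and ground-truth positions (both N x 3 arrays)."""
    est = np.asarray(est, dtype=float)
    gt = np.asarray(gt, dtype=float)
    assert est.shape == gt.shape, "trajectories must be time-aligned"
    # Per-pose Euclidean position error ||P_i^est - P_i^gt||
    errors = np.linalg.norm(est - gt, axis=1)
    # RMSE over all N poses
    return float(np.sqrt(np.mean(errors ** 2)))

# Toy example: estimate offset from ground truth by a constant (3, 4, 0) m,
# so every per-pose error is 5 m and the RMSE is 5.0
gt = np.zeros((100, 3))
est = gt + np.array([3.0, 4.0, 0.0])
print(ate_rmse(est, gt))  # → 5.0
```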

MINS Result on KAIST Urban dataset

MINS Result on Lab Rover Dataset

Complete MINS Trajectory of KAIST Urban28

Complete Ground Truth Trajectory of KAIST Urban28
You can compare MINS's SLAM trajectory with its corresponding ground-truth trajectory above.