Integration of RTK, IMU, and Visual SLAM for Outdoor Robot Navigation
Accurate position and pose estimation is the foundation of autonomous robotics and usually relies on the fusion of multiple sensors. In this project, we introduce a sensor fusion method based on an extended Kalman filter, built on the Robot Operating System (ROS), for outdoor robot localization; it supports many sensors of different types and can be used both in real time and offline. Stereo visual SLAM, an IMU, and a dual-frequency RTK GNSS receiver on the robot are fused in the experiments to achieve robust and precise ego-motion tracking. Loop-closure localization accuracy using different combinations of sensors is evaluated and analysed. An average error of less than a centimetre is achieved by fusing these three sensors.
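To illustrate the kind of fusion the abstract describes, the sketch below runs a minimal linear Kalman filter that fuses two position measurements (standing in for RTK GNSS and SLAM) with different noise covariances. It is a simplified stand-in for the full EKF pipeline, not the paper's implementation: the 2D constant-velocity state, the time step, and the sensor noise values (5 cm for GNSS, 2 cm for SLAM) are illustrative assumptions.

```python
import numpy as np

def predict(x, P, F, Q):
    # Propagate the state and covariance through the motion model.
    return F @ x, F @ P @ F.T + Q

def update(x, P, z, H, R):
    # Standard Kalman measurement update (the EKF uses the same form
    # with linearized F and H).
    S = H @ P @ H.T + R                      # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)           # Kalman gain
    x = x + K @ (z - H @ x)
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P

# State: [x, y, vx, vy] with a constant-velocity model, dt = 0.1 s (assumed).
dt = 0.1
F = np.array([[1, 0, dt, 0],
              [0, 1, 0, dt],
              [0, 0, 1, 0],
              [0, 0, 0, 1]], dtype=float)
Q = 1e-3 * np.eye(4)                          # assumed process noise
H = np.array([[1, 0, 0, 0],
              [0, 1, 0, 0]], dtype=float)     # both sensors observe position

x = np.zeros(4)
P = np.eye(4)
R_gnss = 0.05**2 * np.eye(2)   # assumed RTK GNSS noise (5 cm std)
R_slam = 0.02**2 * np.eye(2)   # assumed SLAM noise (2 cm std)

# Fuse one GNSS fix and one SLAM pose, both near the true position (1, 2).
x, P = predict(x, P, F, Q)
x, P = update(x, P, np.array([1.03, 1.98]), H, R_gnss)
x, P = update(x, P, np.array([1.01, 2.02]), H, R_slam)
print(x[:2])  # fused position estimate, pulled toward the less-noisy sensor
```

In practice the same structure scales to the three sensors in the paper: each sensor contributes its own measurement model `H` and covariance `R`, and the filter weights them automatically by their uncertainty.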