This repo fuses wheel encoder and GPS data on the basis of VINS-Mono. The project is tested on the KAIST dataset and is suited to autonomous driving scenarios.
The wheel encoder data is tightly coupled, following the approach of [1]. GPS fusion is loosely coupled, consistent with the method used in VINS-Fusion.
Detailed derivations can be found at https://blog.csdn.net/ewtewtewrt/article/details/117249295. The method has been tested on the KAIST dataset (urban28-pankyo); see the demo video.
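As a rough illustration of what tight encoder coupling means here, the sketch below shows a common, simplified form of wheel-odometry pre-integration between two body frames. The symbols (wheel radius r, ticks per revolution N, odometer frame o) are assumptions for illustration and are not necessarily the exact quantities used in this repo or in [1].

```latex
% Simplified sketch of wheel-encoder pre-integration between body frames b_k and b_{k+1}.
% r (wheel radius), N (ticks per revolution), o_i (odometer frame at sample i) are illustrative assumptions.
\begin{aligned}
  v_i &= \frac{2\pi r}{N} \cdot \frac{n_{l,i} + n_{r,i}}{2\,\delta t}
        && \text{forward speed from left/right tick counts over } \delta t, \\
  \alpha^{b_k}_{b_{k+1}} &= \sum_{t_i \in [t_k,\, t_{k+1})} R^{b_k}_{o_i}
        \begin{bmatrix} v_i \\ 0 \\ 0 \end{bmatrix} \delta t
        && \text{pre-integrated translation expressed in } b_k.
\end{aligned}
```

In a tightly coupled setup, such a pre-integrated term enters the sliding-window optimization as an additional residual alongside the IMU and visual factors, which is the spirit of [1].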
Ubuntu 16.04 and ROS Kinetic (see ROS Installation). Additional ROS packages:
```
sudo apt-get install ros-YOUR_DISTRO-cv-bridge ros-YOUR_DISTRO-tf ros-YOUR_DISTRO-message-filters ros-YOUR_DISTRO-image-transport
```
Follow Ceres Installation, and remember to run `make install`. (Our testing environment: Ubuntu 16.04, ROS Kinetic, OpenCV 3.1.0, Eigen 3.3.7.)
Clone the repository and catkin_make:
```
cd ~/catkin_ws/src
git clone https://github.com/Wallong/VINS-GPS-Wheel.git
cd ..
catkin_make
source ~/catkin_ws/devel/setup.bash
```
The method is tested on the KAIST dataset: https://irap.kaist.ac.kr/dataset/
Open four terminals and, respectively, launch the vins_estimator, run the GPS fusion node (optional), publish the dataset, and launch rviz. Take urban28-pankyo as an example:
```
roslaunch vins_estimator kaist.launch
rosrun multisensor_fusion multisensor_fusion_node   # optional, for GPS
rosrun vins_estimator kaist_pub YOUR_PATH_TO_DATASET/KAIST/urban28/urban28-pankyo
roslaunch vins_estimator vins_rviz.launch
```
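For orientation, the sketch below shows one way a loosely coupled fusion node in the spirit of `multisensor_fusion_node` can be wired up in ROS: it pairs VIO odometry with GPS fixes, converts each fix to a local ENU frame anchored at the first fix, and feeds both to a global optimizer. The topic names, message pairing, and the `GlobalOptimizer` interface are assumptions for illustration only; the node in this repo may be organized differently.

```cpp
// Minimal sketch of a loosely coupled GPS + VIO fusion node (illustrative only).
// Topic names and the GlobalOptimizer interface are assumptions, not this repo's exact API.
#include <ros/ros.h>
#include <nav_msgs/Odometry.h>
#include <sensor_msgs/NavSatFix.h>
#include <message_filters/subscriber.h>
#include <message_filters/synchronizer.h>
#include <message_filters/sync_policies/approximate_time.h>
#include <boost/bind.hpp>
#include <cmath>

// Hypothetical optimizer that keeps a pose graph of VIO poses and adds GPS position priors.
struct GlobalOptimizer {
  void addVioPose(double t, const nav_msgs::Odometry& odom) { /* insert local pose */ }
  void addGpsPosition(double t, double x, double y, double z, double cov) { /* add prior, optimize */ }
};

static GlobalOptimizer g_optimizer;
static bool   g_has_origin = false;
static double g_lat0 = 0.0, g_lon0 = 0.0, g_alt0 = 0.0;

// Crude equirectangular lat/lon/alt -> local ENU conversion around the first fix.
static void llaToEnu(double lat, double lon, double alt, double& x, double& y, double& z) {
  const double kEarthRadius = 6378137.0;  // WGS-84 equatorial radius [m]
  const double d2r = M_PI / 180.0;
  x = (lon - g_lon0) * d2r * kEarthRadius * std::cos(g_lat0 * d2r);  // East
  y = (lat - g_lat0) * d2r * kEarthRadius;                           // North
  z = alt - g_alt0;                                                  // Up
}

void fusionCallback(const nav_msgs::OdometryConstPtr& odom,
                    const sensor_msgs::NavSatFixConstPtr& fix) {
  if (!g_has_origin) {  // anchor the ENU frame at the first GPS fix
    g_lat0 = fix->latitude; g_lon0 = fix->longitude; g_alt0 = fix->altitude;
    g_has_origin = true;
  }
  double x, y, z;
  llaToEnu(fix->latitude, fix->longitude, fix->altitude, x, y, z);
  const double t = odom->header.stamp.toSec();
  g_optimizer.addVioPose(t, *odom);                       // local VIO pose
  g_optimizer.addGpsPosition(t, x, y, z,                  // global position prior
                             fix->position_covariance[0]);
}

int main(int argc, char** argv) {
  ros::init(argc, argv, "multisensor_fusion_node");
  ros::NodeHandle nh;

  // Approximate-time pairing of VIO odometry and GPS fixes (topic names are assumptions).
  message_filters::Subscriber<nav_msgs::Odometry>     odom_sub(nh, "/vins_estimator/odometry", 100);
  message_filters::Subscriber<sensor_msgs::NavSatFix> gps_sub(nh, "/gps/fix", 100);
  typedef message_filters::sync_policies::ApproximateTime<nav_msgs::Odometry,
                                                          sensor_msgs::NavSatFix> SyncPolicy;
  message_filters::Synchronizer<SyncPolicy> sync(SyncPolicy(100), odom_sub, gps_sub);
  sync.registerCallback(boost::bind(&fusionCallback, _1, _2));

  ros::spin();
  return 0;
}
```

This mirrors the loosely coupled idea of VINS-Fusion's global fusion: the local VIO estimate is left untouched, and GPS is only used to anchor it in a global frame.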
Module | Status |
---|---|
Encoder pre-integration | Done |
Initialization with encoder | Done |
Optimization with encoder | Done |
Online extrinsic calibration of encoder | In progress |
Loosely coupled with GNSS | Done |
Initialization with GNSS | Planned |
Tightly coupled with GNSS | Planned |
[1] J. Liu, W. Gao and Z. Hu, "Visual-Inertial Odometry Tightly Coupled with Wheel Encoder Adopting Robust Initialization and Online Extrinsic Calibration," 2019 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2019, pp. 5391-5397, doi: 10.1109/IROS40897.2019.8967607.
For any issues, please feel free to contact Longlong Wang: [email protected]