Particle filter meets hybrid octrees: an octree-based vehicle localization approach without learning (Submitted by ESIGELEC)
-
2020/07/17: A new dataset is available here, on the same road as the previous ones.
-
2020/06/18: A new dataset is available here, on the same road as the previous ones.
-
2020/05/15: A new dataset is available here, on the same road as the previous ones.
-
April 2020: Due to COVID-19, no dataset was recorded in April.
-
2020/03/17: A second dataset is available here, on the same road as the previous one.
-
2020/02/18: A first dataset is available here (the dataset used for the paper).
This paper presents a 6-DoF (Degrees of Freedom) real-time vehicle localization method based on hybrid octrees and particle filtering of LiDAR data. Our approach relies on two lightweight frameworks: it reduces the memory footprint of the map and significantly lowers the computation load for online localization. Our approach has been shown to perform well on both CPUs and GPUs, and the algorithm design makes it possible to run the localization simultaneously on both architectures. Our localization method is LiDAR agnostic: our experiments have been carried out with two distinct LiDAR technologies, scanning LiDAR and flash LiDAR. Our extensive experimental validation shows that our method is both accurate and reliable on several datasets, platforms and environments. In particular, we show that the same localization algorithm and parameters perform well in both urban and off-road environments. We also evaluate the robustness of our method when masking angular sectors of the LiDAR field of view. The performance achieved with the flash LiDAR is close to that of the scanning LiDAR despite different resolutions and sensing modalities. The positioning performance is notable, with 10 cm position and 0.12$\degree$ angular RMSE for both technologies. We evaluated our approach on the KITTI dataset and achieved fair results with respect to the state of the art. This paper also introduces the baseline performance on a multi-seasonal dataset that we are publicly releasing to the community. Finally, we validated our approach in an off-road environment with a front-view field of view and only 768 LiDAR points.
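The sketch below is not the authors' implementation; it only illustrates the general principle of a particle-filter measurement update against a voxelised (octree-style) map built from LiDAR data. The map class, likelihood model, hit/miss probabilities and planar (x, y, yaw) state are simplifications introduced here for illustration; the paper's method is 6 DoF and uses hybrid octrees.

```python
# Minimal sketch of a particle-filter measurement update against a voxelised
# map. NOT the paper's implementation: map structure, likelihood model and all
# parameters are hypothetical, and the state is planar (x, y, yaw) for brevity.
import numpy as np

class SimpleVoxelMap:
    """Toy stand-in for the hybrid octree map: a flat voxel-hash occupancy
    lookup, not an actual octree."""
    def __init__(self, occupied_points, voxel_size=0.2):
        self.voxel_size = voxel_size
        keys = np.floor(occupied_points / voxel_size).astype(np.int64)
        self.occupied = set(map(tuple, keys))

    def is_occupied(self, points):
        keys = np.floor(points / self.voxel_size).astype(np.int64)
        return np.array([tuple(k) in self.occupied for k in keys])

def measurement_update(particles, weights, scan_xyz, voxel_map,
                       hit_prob=0.8, miss_prob=0.2):
    """Re-weight particles (x, y, yaw) by how well the scan matches the map."""
    log_w = np.log(weights)
    for i, (x, y, yaw) in enumerate(particles):
        c, s = np.cos(yaw), np.sin(yaw)
        rot = np.array([[c, -s], [s, c]])
        # Transform the scan into the map frame for this particle's pose.
        pts_xy = scan_xyz[:, :2] @ rot.T + np.array([x, y])
        pts = np.column_stack([pts_xy, scan_xyz[:, 2]])
        hits = voxel_map.is_occupied(pts)
        # Log-likelihood: scan endpoints landing on occupied voxels are rewarded.
        log_w[i] += np.sum(np.log(np.where(hits, hit_prob, miss_prob)))
    log_w -= log_w.max()               # avoid underflow before exponentiating
    new_w = np.exp(log_w)
    return new_w / new_w.sum()

# Hypothetical usage: map_cloud and scan are (N, 3) arrays in metres.
# voxel_map = SimpleVoxelMap(map_cloud)
# weights = measurement_update(particles, weights, scan, voxel_map)
```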
Vincent VAUCHEY¹, Pierre MERRIAUX², Xavier SAVATIER³, Yohan DUPUIS⁴.
¹ESIGELEC, IRSEEM, Normandie Univ, UNIROUEN, Rouen, France
²LeddarTech, Quebec City, Canada
³ESIGELEC, NAVYA, Paris, France
⁴CESI, La Défense, Paris, France
[email protected] [email protected] [email protected] [email protected] [email protected]
Special thanks to the members of the SIRD team: Marc DEHAIS, Anthony DESHAIS, Christophe ALEGRE, Pascal FALLA, Jérémy FOURRE.
LiDAR/IMU/RGBD camera dataset recorded by ESIGELEC.
New datasets on the same roads will be made available each month.
The main reference dataset for LiDAR localization is the KITTI dataset, which is used in many papers to compare localization/SLAM algorithms. The main difficulty with the KITTI dataset is its ground truth accuracy, which is often worse than 20 cm.
We aim to provide the community with datasets recorded with a reference position obtained from the best positioning systems available in all conditions. The ground truth is produced with a Landyn IMU (IXblue) based on fiber-optic gyroscopes (FOG), a technology that ensures very low drift and noise between two GPS positions or when GPS accuracy degrades. In addition to this highly accurate IMU, we use a post-processing application (APPS) coupled with a Septentrio RTK GPS receiver. The goal of this application is to improve the accuracy of the ground truth position, especially near trees or in tunnels, by running a forward/backward Kalman filter (a generic sketch of this kind of smoothing is given after the table below). The Landyn is an "old" IMU without new firmware releases, but with better FOGs than the IMUs currently sold by IXblue for civil applications. Using the APPS software makes it possible to apply the processing algorithms used on the latest IXblue IMUs to our Landyn.
| IMU | LANDYN (IXblue) | ATLANS (IXblue) | RT 3000 (Oxts) |
|---|---|---|---|
| Position when RTK lost (m) | 0.20 (60 s GNSS outage) | 0.350 (60 s GNSS outage) | 1.5 (SPS) |
| Pitch/Roll (deg) | 0.005 (RTK) / 0.005 (60 s GNSS outage) | 0.008 (RTK) / 0.01 (60 s GNSS outage) | 0.03 |
| Heading (deg) | 0.01 | 0.020 | 0.1 |
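The forward/backward processing mentioned above corresponds, in principle, to Kalman smoothing: a forward filter pass followed by a backward pass that refines each state using future measurements. The sketch below is a generic Rauch-Tung-Striebel (RTS) smoother on a 1-D constant-velocity model with simulated GNSS outages (NaN measurements). It is unrelated to IXblue's APPS software; the model, noise values and outage handling are illustrative assumptions only.

```python
# Generic forward/backward (RTS) smoothing sketch on a 1-D constant-velocity
# model, to illustrate how post-processing can tighten the trajectory during
# GNSS outages. Unrelated to IXblue's APPS; model and values are illustrative.
import numpy as np

def rts_smoother(zs, dt=0.01, q=0.1, r=0.05):
    F = np.array([[1.0, dt], [0.0, 1.0]])          # constant-velocity transition
    H = np.array([[1.0, 0.0]])                     # only position is observed
    Q = q * np.array([[dt**3 / 3, dt**2 / 2],
                      [dt**2 / 2, dt]])            # process noise
    R = np.array([[r**2]])                         # measurement noise

    n = len(zs)
    xs, Ps, xps, Pps = [], [], [], []
    x, P = np.zeros(2), np.eye(2)

    # Forward pass: standard Kalman filter (prediction only during outages).
    for z in zs:
        xp, Pp = F @ x, F @ P @ F.T + Q
        xps.append(xp); Pps.append(Pp)
        if np.isnan(z):                            # GNSS outage: skip the update
            x, P = xp, Pp
        else:
            S = H @ Pp @ H.T + R
            K = Pp @ H.T @ np.linalg.inv(S)
            x = xp + K @ (np.array([z]) - H @ xp)
            P = (np.eye(2) - K @ H) @ Pp
        xs.append(x); Ps.append(P)

    # Backward pass: RTS smoothing refines each state using future data.
    xs_s, Ps_s = [None] * n, [None] * n
    xs_s[-1], Ps_s[-1] = xs[-1], Ps[-1]
    for k in range(n - 2, -1, -1):
        C = Ps[k] @ F.T @ np.linalg.inv(Pps[k + 1])
        xs_s[k] = xs[k] + C @ (xs_s[k + 1] - xps[k + 1])
        Ps_s[k] = Ps[k] + C @ (Ps_s[k + 1] - Pps[k + 1]) @ C.T
    return np.array(xs_s)
```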
-
Loop1: Winter, sun and some clouds (~1.5 km)
-
Loop2: Winter, sun and some clouds, with one short tunnel (~2.6 km)
Images will be available as soon as the COVID-19 lockdown ends.
-
Loop1: Winter, sun (~1.5 km)
-
Loop2: Winter, sun (~2.6 km)
-
Loop1: Spring (~1.5 km)
- 30 km/h dataset A (Download)
- 30 km/h dataset B (Download)
- 40 km/h dataset (Download)
- 50 km/h dataset (Download)
- Directory Tree and calibrations
Maps preview. From now on, no camera data will be recorded, due to the large amount of data generated and the limited interest in this kind of data.
-
Loop2: Spring (~1.5 km)
- 30 km/h dataset (Download)
- 40 km/h dataset (Download)
- 50 km/h dataset (Download)
- Directory Tree and calibrations
Maps preview. From now on, no camera data will be recorded, due to the large amount of data generated and the limited interest in this kind of data.
-
Loop1: Spring (~1.5 km)
-
Loop2: Spring (~1.5 km)
-
Loop1: Spring (~1.5 km)
-
Loop2: Spring (~1.5 km)
List of sensors and software used:
- VLP-16 LiDAR synchronized to GPS (Velodyne)
- AsteRx-U GPS receiver (Septentrio)
- LANDYN IMU + APPS post-processing software (IXblue)
- D435 RGBD camera triggered by the IMU (Intel RealSense)
- Peiseler odometer mounted on the right rear wheel
- RTMAPS real-time acquisition software (Intempora), which can also be used to replay the datasets
- RTK correction network (Teria)
- PCAN-USB CAN interface (PEAK-System)
Directory tree:
- lidarCorrectedSynchronisedImuPostPro.zip: motion-corrected LiDAR + IMU + post-processed ground truth, synchronised
- lidarUnCompensatedImuPostProUnsynchronised: uncorrected LiDAR + IMU + post-processed ground truth, unsynchronised (Download)
- vehicleOdometry: vehicle longitudinal speed (m/s) and CAN yaw rate (rad/s), timestamped but not calibrated (see the dead-reckoning sketch below)
Due to European Union GDPR (RGPD) limitations, images were removed.
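Since the vehicleOdometry stream only contains timestamped longitudinal speed and yaw rate, a typical first step is planar dead reckoning. The sketch below assumes a simple three-column layout (timestamp in seconds, speed in m/s, yaw rate in rad/s); the actual file format, wheel-scale factor and yaw-rate bias are not specified here and should be checked against the data, which is stated to be uncalibrated.

```python
# Planar dead-reckoning sketch from the vehicleOdometry stream
# (timestamp, longitudinal speed in m/s, yaw rate in rad/s).
# The column layout is an assumption; since the data is uncalibrated, a
# wheel-scale factor and yaw-rate bias would normally be estimated first.
import numpy as np

def dead_reckon(timestamps, speeds, yaw_rates, x0=0.0, y0=0.0, yaw0=0.0):
    """Integrate speed and yaw rate into a 2-D trajectory (unicycle model)."""
    xs, ys, yaws = [x0], [y0], [yaw0]
    for k in range(1, len(timestamps)):
        dt = timestamps[k] - timestamps[k - 1]
        yaw = yaws[-1] + yaw_rates[k] * dt           # integrate heading
        xs.append(xs[-1] + speeds[k] * np.cos(yaw) * dt)
        ys.append(ys[-1] + speeds[k] * np.sin(yaw) * dt)
        yaws.append(yaw)
    return np.array(xs), np.array(ys), np.array(yaws)

# Hypothetical usage, assuming a CSV with columns: timestamp, speed, yaw_rate
# data = np.loadtxt("vehicleOdometry/odometry.csv", delimiter=",")
# x, y, yaw = dead_reckon(data[:, 0], data[:, 1], data[:, 2])
```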
Calibrations (X forward, Y left, Z up), with a sketch of how such transforms can be applied given after this list:
- Transformation from IMU to LiDAR (Tx, Ty, Tz, Rx, Ry, Rz): [0.989, -0.024, 2.388, 0.0, 0.0, -0.385]
- Car odometry and IMU share the same measurement point (rear axle)
- Transformation from LiDAR to RGBD camera (Tx, Ty, Tz, Rx, Ry, Rz): [0.74, -0.43, 0.0, 0.0, 0.0, 0.0]; Python code to read the XYZ PNG files is available here
- Camera intrinsic calibration is available here
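The sketch below shows one common way to turn a (Tx, Ty, Tz, Rx, Ry, Rz) vector into a 4x4 homogeneous transform and apply it to a point cloud. The Euler order (yaw-pitch-roll, i.e. Rz Ry Rx) and the assumption that angles are expressed in radians are guesses that should be verified against the provided calibration files.

```python
# Build a 4x4 homogeneous transform from (Tx, Ty, Tz, Rx, Ry, Rz) and apply it
# to a LiDAR point cloud (X forward, Y left, Z up). The Euler order (Rz @ Ry @ Rx)
# and the radians assumption are guesses; check them against the calibration files.
import numpy as np

def pose_to_matrix(tx, ty, tz, rx, ry, rz):
    cx, sx = np.cos(rx), np.sin(rx)
    cy, sy = np.cos(ry), np.sin(ry)
    cz, sz = np.cos(rz), np.sin(rz)
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    T = np.eye(4)
    T[:3, :3] = Rz @ Ry @ Rx            # yaw, then pitch, then roll
    T[:3, 3] = [tx, ty, tz]
    return T

def transform_points(T, points_xyz):
    """Apply a 4x4 transform to an (N, 3) array of points."""
    homog = np.hstack([points_xyz, np.ones((len(points_xyz), 1))])
    return (homog @ T.T)[:, :3]

# IMU-to-LiDAR extrinsics as published on this page (angles assumed in radians).
T_imu_lidar = pose_to_matrix(0.989, -0.024, 2.388, 0.0, 0.0, -0.385)
```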
The localization datasets were recorded on the same roads as the French project "Rouen Normandy Autonomous Lab", with partners Renault and Transdev: