slightech/MYNT-EYE-MapLab-Sample

MapLab with mynteye camera

To run maplab with a MYNT EYE camera, follow these steps:

  • Download the MYNT-EYE-S-SDK and follow its installation steps.
  • Calibrate the cameras with kalibr.
  • Record a bag with mynt_eye_ros_wrapper so that ROVIOLI can run on it.
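The calibration step can be sketched as below. This is a sketch assuming kalibr's standard command-line tools; the bag name (`calib.bag`), the Aprilgrid target file (`aprilgrid.yaml`), the IMU noise file (`imu.yaml`), and the camchain filename are placeholders you must substitute with your own files.

```shell
# Record a calibration bag while moving the camera in front of an Aprilgrid target
rosbag record -O calib.bag /mynteye/left/image_raw /mynteye/right/image_raw /mynteye/imu/data_raw

# Camera intrinsics and stereo extrinsics
# (pinhole + equidistant models, matching the *_pinhole_equ tutorial scripts below)
kalibr_calibrate_cameras \
    --bag calib.bag \
    --topics /mynteye/left/image_raw /mynteye/right/image_raw \
    --models pinhole-equi pinhole-equi \
    --target aprilgrid.yaml

# Camera-IMU extrinsics and time offset, using the camchain produced above
kalibr_calibrate_imu_camera \
    --bag calib.bag \
    --cam camchain-calib.yaml \
    --imu imu.yaml \
    --target aprilgrid.yaml
```

The resulting calibration files are what the ROVIOLI launch scripts consume; keep the camera models consistent with the tutorial script you run (pinhole-equi here).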

Record bag

cd MYNT-EYE-S-SDK
make ros
source wrappers/ros/devel/setup.bash
roslaunch mynt_eye_ros_wrapper display.launch
rosbag record -O mynt_maplab /mynteye/left/image_raw /mynteye/right/image_raw /mynteye/imu/data_raw
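After recording, it is worth checking that all three topics actually made it into the bag before feeding it to ROVIOLI (the bag name follows from the `-O mynt_maplab` flag above):

```shell
rosbag info mynt_maplab.bag
# The output should list /mynteye/left/image_raw, /mynteye/right/image_raw,
# and /mynteye/imu/data_raw, each with a non-zero message count.
```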

Run ROVIOLI

Two modes exist for running ROVIOLI in VIO mode:

  • Building from a rosbag
  • Building from a rostopic

Building a map from a rosbag

For this tutorial, we build a map from one of the MYNT datasets. Download maplab_ex.bag (password: 9pkp).

source ~/maplab_ws/devel/setup.bash
roscore &
rosrun rovioli tutorial_mynt_stereo_pinhole_equ mynt_stereo maplab_ex.bag

Then, use rviz to display.

cd ~/maplab_ws/src/maplab/applications/rovioli/mynteye
rosrun rviz rviz -d rviz-rovioli.rviz

Building a map from a rostopic

source ~/maplab_ws/devel/setup.bash
roscore &
rosrun rovioli tutorial_mynt_live_stereo_pinhole_equ mynt_stereo_live

Then, in a separate terminal, start your data source:

rosbag play maplab_ex.bag  # or start your sensor node
cd ~/maplab_ws/src/maplab/applications/rovioli/mynteye
rosrun rviz rviz -d rviz-rovioli.rviz
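If nothing shows up in rviz, you can confirm that data is actually flowing on the expected topics using standard ROS introspection commands:

```shell
rostopic list | grep mynteye          # the mynteye topics should be advertised
rostopic hz /mynteye/left/image_raw   # should report a steady image frame rate
rostopic hz /mynteye/imu/data_raw     # the IMU should tick at a much higher rate
```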

Supported platforms: Ubuntu 14.04 + ROS Indigo and Ubuntu 16.04 + ROS Kinetic.

News

  • May 2018: maplab was presented at ICRA in Brisbane.
  • March 2018: Check out our release candidate with improved localization and lots of new features! PR

Description

This repository contains maplab, an open, research-oriented visual-inertial mapping framework, written in C++, for creating, processing and manipulating multi-session maps. On the one hand, maplab can be considered as a ready-to-use visual-inertial mapping and localization system. On the other hand, maplab provides the research community with a collection of multi-session mapping tools that include map merging, visual-inertial batch optimization, and loop closure.

Furthermore, it includes an online frontend, ROVIOLI, that can create visual-inertial maps and also track a global drift-free pose within a localization map.

For documentation, tutorials and datasets, please visit the wiki.

Please also check out our video:

https://www.youtube.com/watch?v=CrfG4v25B8k

Features

Robust visual-inertial odometry with localization

Large-scale multisession mapping and optimization

Dense reconstruction

A research platform extensively tested on real robots

Installation and getting started

Getting-started articles for maplab and ROVIOLI, as well as more detailed information, can be found in the wiki pages.

Research Results

The maplab framework has been used as an experimental platform for numerous scientific publications. For a complete list of publications please refer to Research based on maplab.

Citing

Please cite the following paper when using maplab or ROVIOLI for your research:

@article{schneider2018maplab,
  title={maplab: An Open Framework for Research in Visual-inertial Mapping and Localization},
  author={T. Schneider and M. T. Dymczyk and M. Fehr and K. Egger and S. Lynen and I. Gilitschenski and R. Siegwart},
  journal={IEEE Robotics and Automation Letters},
  year={2018},
  doi={10.1109/LRA.2018.2800113}
}

Additional Citations

Certain components of maplab are directly using the code of the following publications:

  • Localization:
    @inproceedings{lynen2015get,
      title={Get Out of My Lab: Large-scale, Real-Time Visual-Inertial Localization.},
      author={Lynen, Simon and Sattler, Torsten and Bosse, Michael and Hesch, Joel A and Pollefeys, Marc and Siegwart, Roland},
      booktitle={Robotics: Science and Systems},
      year={2015}
    }
  • ROVIOLI which is composed of ROVIO + maplab for map building and localization:
    @article{bloesch2017iterated,
      title={Iterated extended Kalman filter based visual-inertial odometry using direct photometric feedback},
      author={Bloesch, Michael and Burri, Michael and Omari, Sammy and Hutter, Marco and Siegwart, Roland},
      journal={The International Journal of Robotics Research},
      volume={36},
      number={10},
      pages={1053--1072},
      year={2017},
      publisher={SAGE Publications Sage UK: London, England}
    }

Credits

  • Thomas Schneider
  • Marcin Dymczyk
  • Marius Fehr
  • Kevin Egger
  • Simon Lynen
  • Mathias Bürki
  • Titus Cieslewski
  • Timo Hinzmann
  • Mathias Gehrig

For a complete list of contributors, have a look at CONTRIBUTORS.md.
