Authors: Vlad Niculescu, Tommaso Polonelli, Michele Magno, Luca Benini
Corresponding author: Vlad Niculescu [email protected]
Perceiving and mapping the surroundings are essential for autonomous navigation in any robotic platform. The algorithm class that enables accurate mapping while correcting the odometry errors present in most robotics systems is Simultaneous Localization and Mapping (SLAM). Today, fully onboard mapping is only achievable on robotic platforms that can host high-wattage processors, mainly due to the significant computational load and memory demands required for executing SLAM algorithms. For this reason, pocket-size hardware-constrained robots offload the execution of SLAM to external infrastructures. To address the challenge of enabling SLAM algorithms on resource-constrained processors, this paper proposes NanoSLAM, a lightweight and optimized end-to-end SLAM approach specifically designed to operate on centimeter-size robots at a power budget of only 87.9 mW. We demonstrate the mapping capabilities in real-world scenarios and deploy NanoSLAM on a nano-drone weighing 44 g and equipped with a novel commercial RISC-V low-power parallel processor called GAP9. The algorithm, designed to leverage the parallel capabilities of the RISC-V processing cores, enables mapping of a general environment with an accuracy of 4.5 cm and an end-to-end execution time of less than 250 ms.
Our video briefly explains how our system works and showcases NanoSLAM operating onboard a 44 g nano-drone. Video available here.
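For readers unfamiliar with SLAM, the toy Python sketch below illustrates the core idea behind graph-based SLAM: a loop-closure constraint reveals the accumulated odometry error, and a least-squares optimization distributes that error over the whole trajectory. This is a conceptual, position-only 2D example with made-up numbers, not the onboard NanoSLAM implementation.

```python
# Toy 2D pose-graph optimization (positions only, linear least squares).
# Conceptual illustration of how a loop closure corrects odometry drift;
# this is NOT the onboard NanoSLAM implementation.
import numpy as np

# Square trajectory: after four steps the drone should be back at the start.
# The odometry edges (relative displacements) carry a systematic drift.
true_odom = np.array([[1.0, 0.0], [0.0, 1.0], [-1.0, 0.0], [0.0, -1.0]])
drift = np.array([0.05, 0.02])                 # made-up per-step bias
noisy_odom = true_odom + drift

# Dead reckoning accumulates the drift.
poses = np.vstack([np.zeros(2), np.cumsum(noisy_odom, axis=0)])
print("end-point error before optimization:", np.linalg.norm(poses[-1]))

# Pose graph: 5 nodes, 4 odometry edges plus 1 loop closure
# (last node observed at the same place as the first).
n = len(poses)
edges = [(i, i + 1, noisy_odom[i]) for i in range(n - 1)]
edges.append((n - 1, 0, np.zeros(2)))          # loop closure: back at the origin

# Stack all constraints p_j - p_i = z into a linear system A x = b,
# with a strong prior on node 0 to fix the gauge freedom.
A = np.zeros((2 * (len(edges) + 1), 2 * n))
b = np.zeros(2 * (len(edges) + 1))
A[0:2, 0:2] = 100.0 * np.eye(2)                # anchor node 0 at the origin
for k, (i, j, z) in enumerate(edges):
    r = 2 * (k + 1)
    A[r:r + 2, 2 * j:2 * j + 2] = np.eye(2)
    A[r:r + 2, 2 * i:2 * i + 2] = -np.eye(2)
    b[r:r + 2] = z

x, *_ = np.linalg.lstsq(A, b, rcond=None)
corrected = x.reshape(n, 2)
print("loop-closure residual after optimization:",
      np.linalg.norm(corrected[-1] - corrected[0]))
```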
If you use NanoSLAM in an academic or industrial context, please cite the following publication:
- NanoSLAM: Enabling Fully Onboard SLAM for Tiny Robots, IEEE IoT Journal
@article{niculescu2023nanoslam,
title={NanoSLAM: Enabling Fully Onboard SLAM for Tiny Robots},
author={Niculescu, Vlad and Polonelli, Tommaso and Magno, Michele and Benini, Luca},
journal={IEEE Internet of Things Journal},
year={2023},
publisher={IEEE}
}
This work was developed using the following hardware setup:
- The commercial nano-drone platform Crazyflie 2.1
- The GAP9 parallel processor developed by GreenWaves Technologies and interfaced with our drone using the PCB introduced in this work
- Our custom deck provided in this repo (quad-tof-deck/), which features four VL53L5CX sensors. The tiny sensor boards attached to the deck are the same as in this project
- The commercial Flow-Deck v2
- Clone this repository:
git clone [email protected]:ETH-PBL/NanoSLAM.git
- Build and flash the STM32 MCU following the instructions from here
- Build and flash the GAP9 SoC following the instructions from here
Running the code on GAP9 is required every time the drone boots. After the steps above are performed, the mission can be started from the Crazyflie Client (a scripted alternative is sketched after the list):
- Connect to the drone using the Crazyflie Client (can be installed from here)
- Go to the Parameters tab
- Set the parameter cmds.ready to 1
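As a scripted alternative to the cfclient GUI, the same parameter can be set with the cflib Python library. The sketch below is an untested assumption of how this would look with our firmware; the radio URI is a placeholder you should adapt to your drone.

```python
# Start the NanoSLAM mission by setting cmds.ready over the radio link.
# Sketch only: the URI is a placeholder, adapt it to your Crazyflie's address.
import cflib.crtp
from cflib.crazyflie import Crazyflie
from cflib.crazyflie.syncCrazyflie import SyncCrazyflie

URI = 'radio://0/80/2M/E7E7E7E7E7'   # assumption: default Crazyflie radio address

cflib.crtp.init_drivers()
with SyncCrazyflie(URI, cf=Crazyflie(rw_cache='./cache')) as scf:
    # Equivalent to setting cmds.ready to 1 in the cfclient Parameters tab.
    scf.cf.param.set_value('cmds.ready', '1')
```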
We understand that replicating our setup is non-trivial, and we want to make our code accessible to more users. Therefore, we will soon release a version of NanoSLAM that runs on a computer, either in C or through a Python binding. This will make NanoSLAM easy to integrate into other projects or to port to embedded computers such as the NVIDIA Jetson family (e.g., Xavier). While the parallelization advantages offered by GAP9 will be lost, it should still be faster than many existing alternatives.