RGB-D SLAM for MAV Autonomous Indoor Navigation

Advances in robotics have enabled the autonomous execution of many dangerous and complicated tasks. In particular, micro aerial vehicles (MAVs) have attracted interest from fields such as surveillance, search and rescue, remote sensing, and photography. However, most such missions have been conducted outdoors, where GPS data is available for localization and control. Maneuvering an MAV autonomously indoors, where GPS is unavailable, remains a challenging problem, commonly formulated as Simultaneous Localization and Mapping (SLAM).

The purpose of this study is to develop an RGB-Depth (RGB-D) camera based system that achieves SLAM and to demonstrate the capability of MAVs to autonomously navigate and execute missions indoors. Our SLAM algorithm uses a feature-based approach for visual odometry, and refines the motion estimate with pose-graph optimization and loop closure. We construct the 3D map from the refined position estimates in the pose graph and the corresponding frames.
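The idea of refining visual odometry with loop closures can be illustrated with a minimal sketch. This is not the system described above (which operates on 6-DoF camera poses); it is a toy 1-D pose graph, solved by linear least squares, in which drifting odometry edges are corrected by a single loop-closure constraint. All numbers and names here are illustrative assumptions.

```python
import numpy as np

def optimize_pose_graph(odom, loops, n):
    """Least-squares optimization of a 1-D pose graph.

    odom:  list of (i, j, z) odometry edges, meaning x_j - x_i = z
    loops: list of (i, j, z) loop-closure edges of the same form
    n:     number of poses; pose 0 is anchored at the origin
    """
    edges = odom + loops
    A = np.zeros((len(edges) + 1, n))
    b = np.zeros(len(edges) + 1)
    for k, (i, j, z) in enumerate(edges):
        A[k, i] = -1.0   # each edge constrains the difference x_j - x_i
        A[k, j] = 1.0
        b[k] = z
    A[-1, 0] = 1.0       # anchor pose 0 at 0 to remove the gauge freedom
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    return x

# Odometry that overestimates each 1.0 m step as 1.1 m; a loop closure
# re-observing the start reports the true total displacement of 4.0 m.
odom = [(0, 1, 1.1), (1, 2, 1.1), (2, 3, 1.1), (3, 4, 1.1)]
loops = [(0, 4, 4.0)]
x = optimize_pose_graph(odom, loops, 5)
# The optimizer spreads the accumulated drift across all edges,
# pulling the final pose back from 4.4 toward the loop-closure value.
```

Real pose-graph back ends (e.g. g2o or GTSAM) solve the same kind of least-squares problem over SE(3) poses with iterative relinearization, but the drift-correcting role of the loop-closure edge is the same.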

In addition, we fuse the SLAM position estimate with the IMU, optical flow, and range finder position estimates, and deploy the system on an MAV for an autonomous mission. In the demonstration, the MAV flies around four poster walls, finds a predefined poster, and then lands.
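The fusion of a high-rate inertial prediction with intermittent SLAM position fixes is commonly done with a Kalman filter. The sketch below is a deliberately simplified 1-D illustration, not the actual multi-sensor filter used on the MAV: the state is position and velocity, IMU acceleration drives the prediction, and a SLAM position fix provides the correction. All noise levels and measurement values are assumed for illustration.

```python
import numpy as np

def predict(x, P, a, dt, q):
    """Propagate state [position, velocity] using measured acceleration a."""
    F = np.array([[1.0, dt], [0.0, 1.0]])
    B = np.array([0.5 * dt**2, dt])
    x = F @ x + B * a
    P = F @ P @ F.T + q * np.eye(2)   # process noise inflates uncertainty
    return x, P

def update(x, P, z, r):
    """Correct the state with a SLAM position measurement z (variance r)."""
    H = np.array([[1.0, 0.0]])        # we observe position only
    S = H @ P @ H.T + r               # innovation covariance
    K = (P @ H.T) / S                 # Kalman gain
    y = z - H @ x                     # innovation
    x = x + (K * y).ravel()
    P = (np.eye(2) - K @ H) @ P
    return x, P

x = np.zeros(2)                       # [position, velocity]
P = np.eye(2)
for _ in range(10):                   # ten IMU steps at 10 Hz
    x, P = predict(x, P, a=0.2, dt=0.1, q=1e-3)
x, P = update(x, P, z=0.12, r=0.05)   # a SLAM fix arrives
```

Position estimates from optical flow and the range finder would enter the same way, each as an additional update step with its own measurement model and noise; the filter weights each source by its uncertainty.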