Autonomous Quadcopter Navigation and Control

Robust Indoor Navigation and Object Detection using Stereo Camera and LiDAR

Abstract

This project involves the development of an autonomous navigation system for a quadcopter in simulated environments. The system integrates sensor fusion for localization, PID-based flight control, and dynamic path planning to enable the drone to traverse complex indoor environments. Built on the ROS 2 ecosystem and the Gazebo simulator, the system enables the quadcopter to navigate autonomously and move without collisions.

1. Introduction

Autonomous aerial navigation is a critical component for applications ranging from indoor inspection to search and rescue. Unlike ground-based platforms, quadcopters must manage stability across six degrees of freedom while perceiving obstacles in 3D space. This project focuses on implementing an autonomous flight stack that coordinates flight dynamics with high-level mission planning using ROS 2.

2. System Overview

Fig. 1. Quadcopter platform used for simulation and testing

3. Methodology

3.1 Environment and Flight Setup
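The flight controller described in the abstract is PID-based. As a minimal sketch of a single-axis loop, the example below holds altitude against toy unit-mass dynamics; the class, gains, and plant model are illustrative assumptions, not the project's tuned controller.

```python
class PID:
    """Single-axis PID controller (gains below are illustrative, not tuned values)."""

    def __init__(self, kp, ki, kd, output_limit=None):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.output_limit = output_limit
        self.integral = 0.0
        self.prev_error = None

    def step(self, setpoint, measurement, dt):
        error = setpoint - measurement
        self.integral += error * dt
        # Avoid a derivative kick on the first sample.
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        out = self.kp * error + self.ki * self.integral + self.kd * derivative
        if self.output_limit is not None:
            out = max(-self.output_limit, min(self.output_limit, out))
        return out

# Altitude hold toward a 1.5 m setpoint with toy unit-mass dynamics;
# gravity is assumed compensated by a feed-forward hover thrust.
pid = PID(kp=2.0, ki=0.1, kd=2.0, output_limit=10.0)
z, vz = 0.0, 0.0
for _ in range(3000):              # 30 s at 100 Hz
    accel = pid.step(1.5, z, 0.01)
    vz += accel * 0.01
    z += vz * 0.01
```

In the real flight stack one such loop runs per controlled axis, with the attitude loops nested inside position control.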

3.2 Stereo Camera Calibration
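Stereo calibration estimates each camera's intrinsics K and the relative pose (R, t) between the two cameras by minimizing reprojection error over detected target corners. A minimal NumPy sketch of the projection model and the error being minimized (the intrinsic values shown are placeholders, not our calibrated parameters):

```python
import numpy as np

# Pinhole projection model; calibration fits K, R, t so that projected 3D
# target points land on the detected 2D corners. Illustrative intrinsics only.
K = np.array([[525.0,   0.0, 320.0],
              [  0.0, 525.0, 240.0],
              [  0.0,   0.0,   1.0]])

def project(points_w, R, t):
    """Project Nx3 world points to pixel coordinates through K[R|t]."""
    cam = points_w @ R.T + t        # world frame -> camera frame
    uv = (K @ cam.T).T              # apply intrinsics
    return uv[:, :2] / uv[:, 2:3]   # perspective divide

def reprojection_rms(observed_px, points_w, R, t):
    """RMS pixel error between detected corners and reprojected points."""
    diff = observed_px - project(points_w, R, t)
    return float(np.sqrt((diff ** 2).mean()))
```

Calibration toolchains iterate over many target views, jointly refining both cameras' parameters until this error is minimized.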

3.3 Disparity Maps

Fig. 3. Feature Matching using a stereo camera
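For a rectified pair, a disparity map assigns each left-image pixel the horizontal offset of its best match on the same scanline in the right image; depth then follows as Z = f·B/d. A deliberately naive sum-of-absolute-differences block matcher sketches the idea (window size and disparity range are arbitrary; production pipelines use more robust matchers):

```python
import numpy as np

def disparity_sad(left, right, max_disp=8, win=2):
    """Naive SAD block matching over rectified grayscale images.

    For each left pixel, slide a (2*win+1)^2 window leftward along the
    right image's scanline and keep the disparity with the lowest cost.
    """
    h, w = left.shape
    disp = np.zeros((h, w), dtype=np.int32)
    for y in range(win, h - win):
        for x in range(win + max_disp, w - win):
            patch = left[y - win:y + win + 1, x - win:x + win + 1].astype(np.float32)
            costs = [np.abs(patch - right[y - win:y + win + 1,
                                          x - d - win:x - d + win + 1]).sum()
                     for d in range(max_disp)]
            disp[y, x] = int(np.argmin(costs))
    return disp
```

The per-pixel winner-take-all search is O(h·w·max_disp) and noisy on textureless regions, which is why semi-global and learned matchers are preferred in practice.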

3.4 Cartographer SLAM

Fig. 4. Cartographer SLAM and Disparity Map
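Cartographer builds its submaps as probability grids updated from range data. The mapping side of that process can be illustrated with a toy log-odds occupancy grid: cells traversed by a ray receive a miss update and the endpoint a hit update. The weights and the Bresenham ray trace below are a generic stand-in, not Cartographer's implementation.

```python
import numpy as np

# Illustrative log-odds weights; Cartographer derives its own from
# configured hit/miss probabilities.
LOG_ODDS_HIT, LOG_ODDS_MISS = 0.9, -0.4

def bresenham(x0, y0, x1, y1):
    """Integer cells on the line from (x0, y0) to (x1, y1), endpoints included."""
    cells, dx, dy = [], abs(x1 - x0), abs(y1 - y0)
    sx, sy = (1 if x1 > x0 else -1), (1 if y1 > y0 else -1)
    err, x, y = dx - dy, x0, y0
    while (x, y) != (x1, y1):
        cells.append((x, y))
        e2 = 2 * err
        if e2 > -dy:
            err -= dy
            x += sx
        if e2 < dx:
            err += dy
            y += sy
    cells.append((x1, y1))
    return cells

def integrate_scan(grid, origin, endpoints):
    """Update a log-odds grid: cells along each ray get a miss, endpoints a hit."""
    for ex, ey in endpoints:
        for cx, cy in bresenham(origin[0], origin[1], ex, ey)[:-1]:
            grid[cy, cx] += LOG_ODDS_MISS
        grid[ey, ex] += LOG_ODDS_HIT
    return grid
```

Thresholding the accumulated log-odds yields the free/occupied/unknown grid that the navigation layer consumes.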

3.5 Autonomous Navigation
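Given the occupancy grid from SLAM, navigation reduces to planning a collision-free path and tracking it. A compact A* search on a 4-connected grid with a Manhattan heuristic sketches the planning step (assumed here as one reasonable planner; it is not necessarily the exact algorithm used by the project's navigation stack):

```python
import heapq

def astar(grid, start, goal):
    """A* over a 2D occupancy grid (0 = free, 1 = occupied), 4-connected.

    `grid` is indexed as grid[y][x]; returns the path as a list of (x, y)
    cells from start to goal, or None if the goal is unreachable.
    """
    open_set = [(0, start)]          # (f-score, cell) min-heap
    g = {start: 0}
    parent = {}
    while open_set:
        _, cur = heapq.heappop(open_set)
        if cur == goal:
            path = [cur]
            while cur in parent:     # walk parents back to the start
                cur = parent[cur]
                path.append(cur)
            return path[::-1]
        x, y = cur
        for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if 0 <= nx < len(grid[0]) and 0 <= ny < len(grid) and not grid[ny][nx]:
                ng = g[cur] + 1
                if ng < g.get((nx, ny), float("inf")):
                    g[(nx, ny)] = ng
                    parent[(nx, ny)] = cur
                    f = ng + abs(goal[0] - nx) + abs(goal[1] - ny)  # Manhattan heuristic
                    heapq.heappush(open_set, (f, (nx, ny)))
    return None
```

In the full stack the resulting cell path would be smoothed into waypoints and handed to the PID flight controller for tracking.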