ICUAS'17 Paper Abstract

Paper ThA4.2

Papachristos, Christos (University of Nevada, Reno), Khattak, Shehryar (University of Nevada, Reno), Alexis, Kostas (University of Nevada, Reno)

Autonomous Exploration of Visually-Degraded Environments Using Aerial Robots

Scheduled for presentation during the "UAS Applications - IV" (ThA4), Thursday, June 15, 2017, 10:20−10:40, Lummus Island

2017 International Conference on Unmanned Aircraft Systems (ICUAS), June 13-16, 2017, Miami Marriott Biscayne Bay, Miami, FL, USA

Keywords: UAS Applications

Abstract

This paper presents a combined perception-system and planning-algorithm approach to the problem of autonomous aerial robotic navigation and exploration in visually degraded (dark), GPS-denied environments. A perception system comprising a synchronized near-infrared stereo camera pair, flashing LEDs, inertial sensors, and a 3D depth sensor is utilized to derive visual-inertial odometry and dense mapping in conditions of complete darkness. Exploiting this ability within the framework of a localizability-aware receding horizon exploration and mapping planner, the proposed approach ensures robotic autonomy in dark environments for which no prior knowledge exists. Experimental studies in a dark room of complex geometry, as well as in a city tunnel at night, were conducted to evaluate and verify the capabilities of the system and the proposed solution.
