ICUAS 2020 Paper Abstract

Paper ThB1.4

Astudillo Olalla, Armando (Universidad Carlos III de Madrid), Al-Kaff, Abdulla (Universidad Carlos III de Madrid), Madridano, Angel (Universidad Carlos III de Madrid), García Fernández, Fernando (Universidad Carlos III de Madrid), Martín Gómez, David (Universidad Carlos III de Madrid), de La Escalera, Arturo (Universidad Carlos III de Madrid)

Mono-LSDE: Lightweight Semantic-CNN for Depth Estimation from Monocular Aerial Images

Scheduled for presentation during the Regular Session "See and Avoid Systems II" (ThB1), Thursday, September 3, 2020, 16:00−16:20, Macedonia Hall

2020 International Conference on Unmanned Aircraft Systems (ICUAS), September 1-4, 2020 (Postponed from June 9-12, 2020), Athens, Greece

This information is tentative and subject to change. Compiled on September 25, 2020

Keywords: See-and-avoid Systems, Smart Sensors, UAS Applications

Abstract

In the last decade, with advances in autonomous technologies, Unmanned Aerial Vehicles (UAVs) have received significant attention across several applications. As the tasks performed by UAVs grow in complexity, obtaining information about the surrounding environment becomes a necessity. Estimating depth maps from monocular images plays a key role when working with small or micro UAVs, due to the Size, Weight, and Power (SWaP) constraints of these vehicles. Therefore, this paper proposes a lightweight Semantic Neural Network based on an Encoder-Decoder architecture to obtain a depth map from a monocular camera.

The proposed method has been tested in several complex-environment scenarios, and the obtained results show its robustness and efficiency under different weather and lighting conditions, demonstrating its suitability for real-time applications.
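As a rough illustration of the encoder-decoder pattern the abstract refers to, the sketch below shows the core idea for dense prediction tasks such as depth estimation: an encoder progressively shrinks the spatial resolution of the input image toward a compact bottleneck, and a decoder restores the original resolution to produce a per-pixel output map. This is a minimal toy example, not the authors' Mono-LSDE network; it uses fixed average pooling and nearest-neighbour upsampling where a real model would use learned convolutions, and all function names are illustrative.

```python
# Toy encoder-decoder sketch (NOT the Mono-LSDE architecture): the "encoder"
# halves the spatial resolution per step; the "decoder" doubles it back, so
# the output map has the same H x W as the input, as needed for depth maps.

def avg_pool2x2(img):
    """Encoder step: halve height and width by averaging 2x2 blocks."""
    h, w = len(img), len(img[0])
    return [[(img[y][x] + img[y][x + 1] + img[y + 1][x] + img[y + 1][x + 1]) / 4.0
             for x in range(0, w, 2)]
            for y in range(0, h, 2)]

def upsample2x2(img):
    """Decoder step: double height and width by nearest-neighbour repetition."""
    out = []
    for row in img:
        wide = [v for v in row for _ in (0, 1)]  # repeat each pixel horizontally
        out.append(wide)
        out.append(list(wide))                   # repeat the row vertically
    return out

def encoder_decoder(img, levels=2):
    """Shrink to a low-resolution bottleneck, then restore full resolution."""
    for _ in range(levels):
        img = avg_pool2x2(img)
    for _ in range(levels):
        img = upsample2x2(img)
    return img

# 8x8 "image": after two pooling and two upsampling steps, output is 8x8 again.
frame = [[float(x + y) for x in range(8)] for y in range(8)]
pred = encoder_decoder(frame, levels=2)
print(len(pred), len(pred[0]))  # prints: 8 8
```

In a learned version of this pipeline, each pooling step would be paired with convolutions that extract features, and the decoder would regress one depth value per pixel; keeping those convolutions small is what makes such a network attractive under SWaP constraints.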
