REDUAS 2019 Paper Abstract


Paper TuD24T2.4

Cocoma-Ortega, José Arturo (Instituto Nacional de Astrofísica, Óptica y Electrónica), Rojas-Perez, Leticia Oyuki (INAOE), Cabrera Ponce, Aldrich Alfredo (Instituto Nacional de Astrofisica, Optica y Electronica), Martinez-Carranza, Jose (Instituto Nacional de Astrofisica Optica y Electronica)

Overcoming the Blind Spot in CNN-Based Gate Detection for Autonomous Drone Racing

Scheduled for presentation during the Regular Session "Smart Sensors" (TuD24T2), Tuesday, November 26, 2019, 12:40−13:00, Room T2

2019 Workshop on Research, Education and Development of Unmanned Aerial Systems (RED UAS), November 25-27, 2019, Cranfield University, Cranfield, UK

This information is tentative and subject to change. Compiled on January 20, 2022

Keywords: Autonomy, Navigation, Micro- and Mini-UAS


In recent years, Autonomous Drone Racing has become a significant challenge owing to the problems involved in developing algorithms for autonomous navigation. One of the major problems is estimating the camera pose, and several approaches to this estimation can be found in the literature. In particular, it is possible to estimate the position from the detection of specific objects. However, performing object detection while navigating entails a blind-spot problem: when the camera gets closest to the object, the object falls partly or entirely outside the field of view. We propose a methodology that overcomes this blind spot in autonomous navigation based on CNN gate detection, performing pose estimation with a stochastic algorithm for distance estimation. We achieve over 95% in gate detection and a mean error of around 35 cm in 1D pose estimation within the blind-spot zone.
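To make the blind-spot problem concrete, the following is a minimal illustrative sketch (not the paper's method, which uses a CNN detector and a stochastic distance estimator) of how gate distance could be read off a detected bounding box with a pinhole-camera model. The focal length, gate height, and image height below are assumed values chosen for illustration only.

```python
# Illustrative pinhole-model sketch: 1D distance to a racing gate from the
# pixel height of its detected bounding box.
# All constants are assumed values, not taken from the paper.

FOCAL_LENGTH_PX = 460.0   # assumed camera focal length, in pixels
GATE_HEIGHT_M = 1.4       # assumed physical gate height, in metres
IMAGE_HEIGHT_PX = 480.0   # assumed image height, in pixels

def distance_from_bbox(bbox_height_px: float) -> float:
    """Distance (m) to the gate given its bounding-box height in pixels."""
    if bbox_height_px <= 0:
        raise ValueError("bounding-box height must be positive")
    return FOCAL_LENGTH_PX * GATE_HEIGHT_M / bbox_height_px

# Far from the gate, the estimate is informative:
print(distance_from_bbox(322.0))            # → 2.0 (metres)

# Close to the gate, the bounding box saturates at the image height, so the
# estimate floors and the gate then leaves the field of view entirely —
# this is the blind-spot zone the paper addresses:
print(distance_from_bbox(IMAGE_HEIGHT_PX))  # minimum recoverable distance
```

The saturation at `IMAGE_HEIGHT_PX` is exactly why detection-based pose estimation alone fails at close range, motivating the additional distance estimator described in the abstract.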


