ICUAS'17 Paper Abstract


Paper WeA5.4

Paredes Vallés, Federico (TU Delft), Magree, Daniel (Georgia Institute Of Technology), Johnson, Eric N. (Georgia Institute Of Technology)

Direct Feature Correspondence in Vision-Aided Inertial Navigation for Unmanned Aerial Vehicles

Scheduled for presentation during the "UAS Navigation - I" (WeA5), Wednesday, June 14, 2017, 11:00−11:20, San Marco Island

2017 International Conference on Unmanned Aircraft Systems, June 13-16, 2017, Miami Marriott Biscayne Bay, Miami, FL

This information is tentative and subject to change. Compiled on April 12, 2021

Keywords: Navigation, Sensor Fusion, Autonomy


This paper proposes a novel method for corresponding visual measurements to map points in a visual-inertial navigation system. The algorithm is based on minimizing the photometric error at sparse locations in the image, and gains robustness by eliminating the need for feature extraction during correspondence. The system is compared to a standard feature-extraction-based approach within a visual-inertial EKF formulation. High-fidelity simulation results show that the proposed method reduces the horizontal RMS error by increasing the number of features corresponded by the algorithm.
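The core idea of direct correspondence by photometric-error minimization can be illustrated with a minimal sketch. The function below is not the paper's algorithm (which operates on sparse locations within a visual-inertial EKF); it is an illustrative translation-only Gauss-Newton alignment of a small image patch, where the `photometric_align` name, patch size, and iteration count are assumptions chosen for the example:

```python
import numpy as np

def photometric_align(img_ref, img_cur, p0, patch=5, iters=20):
    """Illustrative sketch: estimate the translation d = (dy, dx) that
    aligns a reference patch centered at p0 with the current image by
    Gauss-Newton minimization of the photometric (intensity) error."""
    half = patch // 2
    y0, x0 = p0
    ref = img_ref[y0 - half:y0 + half + 1,
                  x0 - half:x0 + half + 1].astype(float)
    d = np.zeros(2)  # current translation estimate (dy, dx)
    for _ in range(iters):
        # sample the current patch at the nearest integer location
        yc, xc = int(round(y0 + d[0])), int(round(x0 + d[1]))
        cur = img_cur[yc - half:yc + half + 1,
                      xc - half:xc + half + 1].astype(float)
        # image gradients of the current patch (finite differences)
        gy, gx = np.gradient(cur)
        J = np.stack([gy.ravel(), gx.ravel()], axis=1)  # Jacobian w.r.t. d
        r = (cur - ref).ravel()                         # photometric residual
        step, *_ = np.linalg.lstsq(J, -r, rcond=None)   # Gauss-Newton step
        d += step
        if np.linalg.norm(step) < 1e-3:
            break
    return d
```

Because no features are extracted, the correspondence succeeds wherever the local intensity gradients are informative, which is the source of the robustness gain the abstract describes; a full system would additionally seed `d` from the inertially propagated state prediction.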


