ICUAS'17 Paper Abstract


Paper ThB2.2

Ramani, Aditya (The University of Texas at Arlington), Sevil, Hakki Erhan (The University of Texas at Arlington Research Institute (UTARI)), Dogan, Atilla (The University of Texas at Arlington)

Determining Intruder Aircraft Position Using Series of Stereoscopic 2-D Images

Scheduled for presentation during the "See-and-avoid Systems - I" (ThB2), Thursday, June 15, 2017, 14:05−14:25, Salon AB

2017 International Conference on Unmanned Aircraft Systems, June 13-16, 2017, Miami Marriott Biscayne Bay, Miami, FL, USA

This information is tentative and subject to change. Compiled on April 12, 2021

Keywords: See-and-avoid Systems


The aim of this study is to investigate methods for computing the position of an intruder aircraft relative to an observer aircraft equipped with onboard stereo cameras. To focus on relative position estimation rather than intruder detection through image processing, the first phase generates camera images from known relative position information. This process uses a simple pinhole camera model in which each camera is characterized by its focal length, angle of view, and resolution. The second phase develops two methods to estimate the relative position from the generated camera images. Both methods employ the epipolar geometry of stereo vision with two laterally separated cameras mounted on the observer aircraft. Various cases are run in a MATLAB/Simulink simulation environment, designed to evaluate the estimation methods across different aircraft trajectories, camera separations, and camera resolutions. Simulation results show that the relative position can be estimated along arbitrary trajectories of both aircraft as long as the intruder aircraft is visible to both cameras. Estimation accuracy degrades as the relative distance between the aircraft increases; a larger lateral separation appears to improve accuracy, while image resolution appears to have little to no impact.
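The two phases described in the abstract can be illustrated with a minimal sketch: a pinhole projection that generates pixel coordinates from a known relative position, and a standard rectified-stereo triangulation that inverts it using the disparity between two laterally separated cameras. This is a generic textbook formulation, not the authors' specific estimation methods; all parameter names (focal_px, baseline_m, etc.) and values are illustrative assumptions.

```python
def project_pinhole(X, Y, Z, focal_px, cx, cy):
    """Phase 1 (image generation): project a 3-D point in the camera frame
    (Z forward, X right, Y down) to pixel coordinates with a pinhole model."""
    u = focal_px * X / Z + cx
    v = focal_px * Y / Z + cy
    return u, v

def triangulate_stereo(u_left, v_left, u_right, focal_px, baseline_m, cx, cy):
    """Phase 2 (estimation): recover the relative position from matched pixels
    in two rectified cameras separated laterally by baseline_m. The disparity
    (u_left - u_right) shrinks with range, which is why accuracy degrades as
    the relative distance grows and improves with a larger baseline."""
    disparity = u_left - u_right            # pixels
    Z = focal_px * baseline_m / disparity   # range along the optical axis
    X = (u_left - cx) * Z / focal_px        # lateral offset
    Y = (v_left - cy) * Z / focal_px        # vertical offset
    return X, Y, Z

# Illustrative round trip: an intruder 200 m ahead, seen by cameras 2 m apart.
f, cx, cy, B = 800.0, 320.0, 240.0, 2.0
uL, vL = project_pinhole(10.0, -5.0, 200.0, f, cx, cy)       # left camera
uR, _  = project_pinhole(10.0 - B, -5.0, 200.0, f, cx, cy)   # right camera
print(triangulate_stereo(uL, vL, uR, f, B, cx, cy))           # → (10.0, -5.0, 200.0)
```

In a real pipeline the pixel coordinates would be quantized to the sensor resolution, which introduces the discretization error whose effect the simulation cases evaluate.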


