REDUAS 2019 Paper Abstract

Paper MoD13T1.1

Stamenkovich, Joseph (Virginia Tech), Maalolan, Lakshman (Virginia Tech), Patterson, Cameron (Virginia Tech)

Formal Assurances for Autonomous Systems without Verifying Application Software

Scheduled for presentation during the Regular Session "Autonomy" (MoD13T1), Monday, November 25, 2019, 11:40–12:00, Room T1

2019 Workshop on Research, Education and Development of Unmanned Aerial Systems (RED UAS), November 25-27, 2019, Cranfield University, Cranfield, UK

This information is tentative and subject to change. Compiled on April 15, 2024

Keywords: Reliability of UAS, Security, Levels of Safety

Abstract

Our ability to ensure software correctness is especially challenged by autonomous systems. In particular, the use of artificial intelligence can cause unpredictable behavior when encountering situations that were not included in the training data. We describe an alternative to static analysis and conventional testing that monitors and enforces formally specified properties describing a system's physical state. All external inputs and outputs are monitored by multiple parallel automata synthesized from guards specified as linear temporal logic (LTL) formulas capturing application-specific correctness, safety, and liveness properties. Unlike conventional runtime verification, adding guards does not impact application software performance since the monitor automata are implemented in configurable hardware. In order to remove all dependencies on software, input/output controllers and drivers may also be implemented in configurable hardware. A reporting or corrective action may be taken when a guard is triggered. This architecture is consistent with the guidance prescribed in ASTM F3269-17, Methods to Safely Bound Behavior of Unmanned Aircraft Systems Containing Complex Functions. The monitor and input/output subsystem's minimal and isolated implementations are amenable to model checking since all components are independent finite state machines. Because this approach makes no assumptions about the root cause of deviation from specifications, it can detect and mitigate: malware threats; sensor and network attacks; software bugs; sensor, actuator and communication faults; and inadvertent or malicious operator errors. We demonstrate this approach with rules defining a virtual cage for a commercially available drone.
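To make the monitoring idea concrete, the following is a minimal software sketch of one guard from the virtual-cage demonstration: a two-state automaton for the LTL safety property G(inside_cage), which latches once the drone's reported position leaves the permitted envelope. This is purely illustrative; the paper's monitors are synthesized into configurable hardware rather than run as software, and the class names, cage bounds, and API below are hypothetical.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Cage:
    """Axis-aligned virtual cage: the envelope the drone must stay inside."""
    x_min: float; x_max: float
    y_min: float; y_max: float
    z_min: float; z_max: float

    def contains(self, x: float, y: float, z: float) -> bool:
        return (self.x_min <= x <= self.x_max and
                self.y_min <= y <= self.y_max and
                self.z_min <= z <= self.z_max)

class CageMonitor:
    """Monitor automaton for the safety property G(inside_cage).

    Two states: OK (property has held on every sample so far) and
    TRIPPED (a violation was observed and is latched). A hardware
    monitor would sample inputs every clock cycle; here step()
    consumes one position sample per call and returns whether the
    guard has fired.
    """
    def __init__(self, cage: Cage):
        self.cage = cage
        self.tripped = False  # latches once the guard fires

    def step(self, x: float, y: float, z: float) -> bool:
        if not self.cage.contains(x, y, z):
            self.tripped = True  # reporting/corrective action would be taken here
        return self.tripped
```

Because the monitor is an independent finite state machine with a fixed, small state space, the same structure that makes it cheap to implement in hardware also makes it amenable to model checking, as the abstract notes.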
