As hardware platforms mature and offer greater onboard compute capacity, Small Unmanned Aerial Systems (sUAS) are increasingly capable of operating as fully integrated, cooperative inspection systems. A variety of lightweight sensing payloads is emerging for efficient multi-modal data collection. Deep learning algorithms applied to these sensor data significantly reduce the burden on system operators and enable the fusion of data from multiple sources for enhanced decision making.
The Air Force Civil Engineer Center (AFCEC) and TORC Robotics are developing a Rapid Airfield Damage Assessment System (RADAS) that uses simultaneous data streams from multiple sUAS and ground sensors for computer-aided condition assessment and planning of airfield repair. Operators, aided by intelligent algorithms, remotely monitor incoming data and use software tools to identify a Minimum Airfield Operating Surface (MAOS).
Recent developments by AFCEC and TORC use deep learning algorithms to eliminate the bottleneck of human-in-the-loop interpretation of multiple simultaneous data sources. These advances provide a supervised autonomous workflow comprising: (a) identification of damage from multiple incoming sUAS video streams, (b) automated tasking of decisions based on those data, and (c) adjustment of decisions as additional information arrives.
Preliminary results demonstrate a significant reduction in airfield assessment time, increased assessment accuracy, and removal of humans from danger during the inspection process. This work is part of the RADAS program funded by AFCEC.