Real-time computational needs of a multisensor feature-based range estimation method
3 September 1993
Raymond E. Suorsa, Banavar Sridhar, Terrence W. Fong
Abstract
The computer vision literature describes many methods to perform obstacle detection and avoidance for autonomous or semi-autonomous vehicles. Methods may be broadly categorized into field-based techniques and feature-based techniques. Field-based techniques have the advantage of a regular computational structure at every pixel throughout the image plane. Feature-based techniques are much more data-driven, in that computational complexity increases dramatically in regions of the image populated by features. It is widely believed that to run computer vision algorithms in real time, a parallel architecture is necessary. Field-based techniques lend themselves to easy parallelization due to their regular computational needs. However, we have found that field-based methods are sensitive to noise and have traditionally been difficult to generalize to arbitrary vehicle motion. Therefore, we have sought techniques to parallelize feature-based methods. This paper describes the computational needs of a parallel feature-based range-estimation method developed by NASA Ames. Issues of processing-element performance, load balancing, and data-flow bandwidth are addressed along with a performance review of two architectures on which the feature-based method has been implemented.
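The contrast drawn in the abstract between field-based and feature-based computation can be made concrete with a small scheduling sketch. The Python fragment below is illustrative only and is not the NASA Ames implementation; the function names, the per-feature cost model, and the greedy least-loaded assignment are assumptions introduced here. It shows why a field-based method can balance load simply by splitting the image into equal bands, while a feature-based method must distribute work according to where features actually fall.

```python
# Illustrative sketch (hypothetical, not the paper's implementation):
# contrasting static partitioning for field-based methods with dynamic
# load balancing for feature-based methods.

def partition_field_based(image_height, num_pes):
    """Field-based: per-pixel cost is uniform, so equal row bands balance load."""
    rows_per_pe = image_height // num_pes
    return [(pe * rows_per_pe, (pe + 1) * rows_per_pe) for pe in range(num_pes)]

def partition_feature_based(feature_costs, num_pes):
    """Feature-based: cost is concentrated where features are, so assign each
    feature to the currently least-loaded processing element (greedy balancing)."""
    loads = [0.0] * num_pes
    assignment = [[] for _ in range(num_pes)]
    # Handling the most expensive features first improves the greedy balance.
    for idx, cost in sorted(enumerate(feature_costs), key=lambda t: -t[1]):
        pe = min(range(num_pes), key=lambda p: loads[p])
        assignment[pe].append(idx)
        loads[pe] += cost
    return assignment, loads

# Example: eight features with uneven costs spread over four processing elements.
assignment, loads = partition_feature_based([5, 1, 1, 4, 2, 2, 3, 1], 4)
```

The sketch also hints at the data-flow issue the paper raises: because feature locations change from frame to frame, the assignment (and hence the image data each processing element needs) must be recomputed at frame rate, unlike the fixed bands of the field-based case.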
© (1993) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Raymond E. Suorsa, Banavar Sridhar, and Terrence W. Fong "Real-time computational needs of a multisensor feature-based range estimation method", Proc. SPIE 1956, Sensor Fusion and Aerospace Applications, (3 September 1993); https://doi.org/10.1117/12.155096
CITATIONS
Cited by 1 scholarly publication.
KEYWORDS
Image processing
Detection and tracking algorithms
Sensor fusion
Aerospace engineering
IRIS Consortium
Sensors
Image fusion