A model-based multi-sensor fusion framework has previously been developed that supports improved target recognition
by fusing target signature information obtained from sensor imagery [1], [2]. Image-based signature features, however,
are not the only source of information that may be exploited to advantage by a target recognition system. This paper
presents a review of the key features of the model-based fusion framework and shows how it can be expanded to
support information derived from imaging sensors as well as data from other non-imaging sources. The expanded
model-based multi-source framework supports not only the combination of image data, such as Synthetic Aperture
Radar (SAR) and electro-optical (EO), but also various types of non-image data that may be derived from those, or
other sensor measurements. The paper illustrates the flexibility of the model-based framework by describing the
combination of spatial information from an imaging sensor with scattering characteristics derived from polarimetric
phase history data. The multi-source fusion is achieved by relating signature features to specific structural elements on
the 3-D target geometry. The 3-D model is used as a sensor neutral, view independent, common reference for the
combination of multi-source information.
Synthetic aperture radar (SAR) is an all-weather sensor that has provided breakthrough remote sensing capabilities for
both civilian and military applications. SAR differs from other real-aperture sensors in that it achieves fine resolution
using signal processing techniques that are based on certain assumptions about the relative dynamics between the sensor
and the scene. When these assumptions are violated, the quality of the SAR imagery degrades, impacting its
interpretability.
This paper describes the development of a simulation testbed for evaluating the effects of SAR-specific error sources on
image quality, including effects that originate with the sensor (e.g. system noise, uncompensated motion), as well as
effects that originate in the scene (e.g. target motion, wind-blown trees). The simulation generates synthetic video
phase history and can accommodate a variety of sensor collection trajectories, acquisition geometries, and image
formation options. The simulation approach will be described, example outputs will be shown, and initial results
relating simulation inputs to image quality measures will be presented.
The MSTAR automatic target recognition (ATR) system recognizes targets by matching features predicted from a CAD model against features extracted from the unknown signature. In addition to generating signature features with high fidelity, the online Predictor in the MSTAR system must provide information that assists in efficient search of the hypothesis space and accounts for uncertainties in the prediction process. In this paper, we describe two capabilities implemented in the MSTAR Predictor to support this process. The first exploits the inherent traceback between predicted features and the CAD model that is integral to the Predictor to enable component-wise scoring of candidate hypotheses. The second is the generation of probability density functions that characterize the fluctuation of amplitudes in the predicted signatures. The general approach for both of these is described, and example results are presented.
We analyze the use of the beta distribution for the statistical characterization of the radar cross-section (RCS) of a complex target. The analysis begins by generalizing a complex target as a set of component scatterers, each with a constant component RCS and a phase characterized by a uniform random variable. From this set of target-based component scatterers, estimates of the moments of the implied probability density function (pdf) of the RCS response of the full target are gathered and used to fit a beta distribution. Two distinct methods of fitting the beta distribution are compared against the results of Monte Carlo analysis over a variety of component scatterer sets. This comparison leads to estimates of the accuracy of each method of generating moments for the fitting of the beta distribution and, further, to the characterization of pathological cases for the use of the beta distribution in modeling complex target RCS. Resulting methods for modeling the RCS of a complex target are discussed in the context of model-based SAR ATR applications.
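The moment-matching procedure sketched in this abstract can be illustrated with a short Monte Carlo example. Note this is a minimal sketch of the general technique, not the paper's implementation: the component amplitudes, sample count, and the choice of the coherent upper bound as the beta support are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical component scatterer amplitudes (illustrative, not from the paper).
amps = np.array([1.0, 0.8, 0.5, 0.3])

def rcs_samples(amps, n=100_000, rng=rng):
    """Monte Carlo RCS of constant-amplitude scatterers with i.i.d.
    uniform random phases: sigma = |sum_k a_k * exp(j*phi_k)|^2."""
    phases = rng.uniform(0.0, 2.0 * np.pi, size=(n, len(amps)))
    field = (amps * np.exp(1j * phases)).sum(axis=1)
    return np.abs(field) ** 2

def beta_fit_moments(x, upper):
    """Method-of-moments beta fit on [0, upper]: match the sample mean
    and variance of the normalized RCS to the beta distribution's moments."""
    u = x / upper                          # map RCS onto [0, 1]
    m, v = u.mean(), u.var()
    common = m * (1.0 - m) / v - 1.0
    return m * common, (1.0 - m) * common  # shape parameters (alpha, beta)

sigma = rcs_samples(amps)
sigma_max = amps.sum() ** 2                # coherent (in-phase) upper bound on RCS
alpha, beta = beta_fit_moments(sigma, sigma_max)
```

For independent uniform phases the expected RCS is the incoherent sum of the component cross-sections (here `(amps**2).sum()`), which provides a quick sanity check on the simulated samples.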
Fundamental to the model-based paradigm of an Automatic Target Recognition (ATR) system is an accurate representation (a model) of the physical objects to be recognized. Detailed CAD models of targets of interest can be created using photographs, blueprints, and other intelligence sources. When created this way, the target CAD models are necessarily specific to a particular realization of the vehicle (namely, the serial number of the vehicle from which the CAD model was validated). Under realistic battlefield conditions, variations across targets of the same type (e.g., T72) may be quite drastic and may manifest themselves as significant differences in the sensor signatures. Given this variability between targets of the same type, the example CAD model, or 'exemplar' model, may not provide an adequate representation of the vehicle across the entire class. This paper discusses the development of class models for use in a model-based ATR for synthetic aperture radar (SAR). It documents the propagation of variability information into feature uncertainty, and comments on the performance of class models in the Moving and Stationary Target Acquisition and Recognition (MSTAR) model-based ATR system.
We examine the use of mean squared error matching metrics in support of model-based automatic target recognition under the Moving and Stationary Target Acquisition and Recognition (MSTAR) program. The utility of this type of matching metric is first examined in terms of target discriminability on a 5-class problem, using live signature data collected under the MSTAR program and candidate target signature features predicted from the MSTAR signature feature prediction (MSTAR Predict) module. Analysis is extended to include the exploitation of advanced model-based candidate target signature feature prediction capabilities of MSTAR Predict, made possible by the use of probability distribution functions to characterize target return phenomenology. These capabilities include the elimination of on-pose scintillation effects from predicted target signature features and the inclusion of target pose uncertainty and intra-class target variability into predicted target signature features. Results demonstrating the performance advantages supported by these capabilities are presented.
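The core of the matching metric described above can be sketched in a few lines: score each candidate target hypothesis by the mean squared error between its predicted feature vector and the features extracted from the unknown signature, and select the lowest-error class. This is a minimal illustration of MSE matching in general; the class names and feature values are hypothetical, not MSTAR data.

```python
import numpy as np

# Hypothetical predicted signature features per candidate class
# (illustrative values, not from the MSTAR Predict module).
predicted = {
    "T72":  np.array([0.9, 0.2, 0.7]),
    "BMP2": np.array([0.4, 0.8, 0.1]),
}

def mse(a, b):
    """Mean squared error between two feature vectors."""
    return float(np.mean((a - b) ** 2))

def best_match(extracted, predicted):
    """Score every hypothesis by MSE against the extracted features
    and return the lowest-error class along with all scores."""
    scores = {cls: mse(extracted, feats) for cls, feats in predicted.items()}
    return min(scores, key=scores.get), scores

extracted = np.array([0.85, 0.25, 0.65])   # features from the unknown signature
label, scores = best_match(extracted, predicted)
# label == "T72"
```

In the full system described in the abstract, the predicted features would additionally carry probability distributions that account for scintillation, pose uncertainty, and intra-class variability, rather than the point values used in this sketch.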
ERIM has developed a highly accurate method of simulating ocean backgrounds and targets in the infrared called Ship and Ocean Surface Image Simulation (SOSIS). This package provides realistic IR images of the ocean surface and horizon as a function of sea state, atmospheric conditions, sensor properties, and viewing geometry. SOSIS uses as inputs a thermal target model, an ocean surface model, and an environmental model. It samples these three models using a system of ray tracing and rough surface scattering to account for the interactions of the ocean surface, target, and the environment. At each ray-surface intersection, SOSIS takes into account small surface roughness, bidirectional reflectance distribution functions (BRDFs), solar glint, and polarization. The imagery from the SOSIS package is used for target signature prediction, weapons design, automatic target recognition (ATR) development, IR search and track (IRST) testing, ocean-horizon signatures, and many other functions.