Ted J. Broida
Proceedings Volume Sensor Fusion II: Human and Machine Strategies (1990). https://doi.org/10.1117/12.969981
This paper addresses some of the performance issues encountered in the use of multiple sensors for surveillance and tracking in aerospace and defense applications. These problems generally involve detecting the presence of an unknown number of objects of interest (often referred to as "targets"), and estimating their position and motion from periodic measurements (tracking). Measurements z_k = z(t_k) are received from one or more sensors at times t_k and are classified (labelled) as arising either from one of the objects currently being tracked, from a new or previously undetected object, or from a false alarm or clutter. Typically, measurement models involve nonlinear functions of true object kinematics in additive zero-mean white noise, z(t_k) = h[x(t_k)] + n(t_k). Estimates of object kinematics x (e.g. position, velocity, acceleration) are formed from each labelled measurement sequence {z_k}, with the objective of maintaining an accurate and complete awareness of the external environment.

In addition to additive measurement noise, a number of uncertainties are present:
(1) objects can maneuver between measurements ("random" acceleration);
(2) some measurements (threshold crossings) are due to noise alone (false alarms) or to non-zero-mean interference with unknown spatial and temporal covariance ("clutter"), and can be misclassified as arising from an object of interest;
(3) some measurements actually from an object of interest can be misclassified as being from a different object, or as noise or clutter;
(4) object detection is not guaranteed, so a sensor can "observe" a region containing an object, whether or not it is being tracked, yet fail to detect it (P_D < 1); and
(5) there are errors in knowledge of the relative position and attitude of different sensors, particularly if the sensors are moving independently (different platforms).

The functions of data association (labelling measurements from different sensors, at different times, that correspond to the same object or feature) and data fusion (combining measurements from different times and/or different sensors) are required in one form or another in essentially all multiple sensor fusion applications: the first determines what information should be fused; the second performs the fusion. This paper presents approaches for quantifying the performance of these functions in the surveillance and tracking application.

First, analytical techniques are presented that bound or approximate the fused kinematic estimation performance of multiple sensor tracking systems in the absence of association errors. These bounds and approximations are based on several extensions of standard Kalman filter covariance analysis procedures, and allow modeling of a wide range of sensor types and of arbitrary, time-varying geometries, both sensor-to-sensor and sensor-to-object. Arbitrarily many sensors can be used, with varying update intervals, measurement accuracies, and detection performance. In heavy clutter or false alarm backgrounds it is often impossible to determine which (if any) of the measurements near a target track actually arise from the target, which degrades tracking accuracy. This degradation can be estimated (but not bounded) with an approximate covariance analysis of the Probabilistic Data Association Filter (PDAF).
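To make the covariance-analysis idea concrete, the following is a minimal sketch, not the paper's actual procedure: a covariance-only ("Riccati") recursion for a single constant-velocity target tracked by two sensors with different update intervals, measurement accuracies, and detection probabilities. All sensor parameters are illustrative assumptions, and the P_D-scaled update (discounting each update's covariance reduction by the detection probability) is one common approximation for missed detections.

```python
import numpy as np

dt = 0.1                            # analysis time step [s]
F = np.array([[1.0, dt],            # constant-velocity kinematics:
              [0.0, 1.0]])          # state x = [position, velocity]
q = 0.5                             # assumed random-acceleration intensity
Q = q * np.array([[dt**3/3, dt**2/2],
                  [dt**2/2, dt     ]])
H = np.array([[1.0, 0.0]])          # both sensors measure position only

# Two hypothetical sensors: update interval [steps], noise covariance, PD.
sensors = [dict(every=round(0.5/dt), R=np.array([[4.0]]), pd=0.9),
           dict(every=round(1.0/dt), R=np.array([[1.0]]), pd=0.7)]

P = np.diag([100.0, 25.0])          # initial kinematic uncertainty
for k in range(1, 601):             # 60 s of tracking
    P = F @ P @ F.T + Q             # propagate uncertainty between updates
    for s in sensors:
        if k % s["every"] == 0:
            # Standard covariance update, with the covariance reduction
            # scaled by PD to approximate missed detections (PD < 1).
            S = H @ P @ H.T + s["R"]
            K = P @ H.T @ np.linalg.inv(S)
            P = P - s["pd"] * (K @ S @ K.T)

print("steady-state position sigma:", np.sqrt(P[0, 0]))
```

Because no actual measurements enter the recursion, a result like this can be computed offline for any assumed sensor suite and geometry, which is what makes covariance analysis attractive as a performance-prediction tool.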
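The PDAF degradation can be approximated in the same covariance-only framework. One common device, assumed here purely for illustration, is a scalar information-reduction factor q2 in (0, 1] that discounts the covariance reduction of each update as clutter density grows; q2 = 1 recovers the clean-measurement recursion, and the result is an estimate of the degradation, not a bound.

```python
import numpy as np

dt, q2 = 0.5, 0.6                   # update interval; assumed clutter discount
F = np.array([[1.0, dt], [0.0, 1.0]])
Q = 0.5 * np.array([[dt**3/3, dt**2/2], [dt**2/2, dt]])
H = np.array([[1.0, 0.0]])
R = np.array([[1.0]])

def steady_sigma(info_factor, n=200):
    """Steady-state position sigma under a discounted covariance update."""
    P = np.diag([100.0, 25.0])
    for _ in range(n):
        P = F @ P @ F.T + Q
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        P = P - info_factor * (K @ S @ K.T)   # PDAF-style discounted update
    return float(np.sqrt(P[0, 0]))

print("clean  :", steady_sigma(1.0))   # no clutter
print("clutter:", steady_sigma(q2))    # degraded accuracy (an estimate)
```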
Next, data association performance is quantified in terms of error probability, for the case of closely spaced objects (CSOs) with minimal clutter and for the case of isolated objects in a heavy clutter or false alarm background. These probabilities can be applied to data acquired by any sensor, based on measurement and track accuracies described by error covariance matrices. For example, in many applications a track established by one sensor is used to cue another sensor; in the presence of CSOs and/or clutter backgrounds, this approach can be used to estimate the probability of successful acquisition of the desired target by the second sensor.
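As an illustration of the cueing case, the sketch below estimates a handover success probability under simple assumed conditions. The combined cue-plus-measurement covariance S, the second sensor's detection probability, and the uniform clutter density are all hypothetical values, and "success" is defined conservatively as the target being detected, falling inside a chi-square acquisition gate, and being the only return in that gate.

```python
import numpy as np
from scipy.stats import chi2

S = np.array([[25.0, 5.0],      # assumed cue + measurement error covariance
              [ 5.0, 16.0]])    # in the second sensor's 2-D frame
pd = 0.9                        # second sensor's detection probability
lam = 1e-3                      # assumed clutter density [returns / unit area]
g = chi2.ppf(0.99, df=2)        # gate threshold: 99% containment, 2 dof

p_gate = chi2.cdf(g, df=2)                     # P(target in gate | detected)
area = np.pi * g * np.sqrt(np.linalg.det(S))   # area of the elliptical gate
p_clean = np.exp(-lam * area)                  # P(no clutter point in gate),
                                               # assuming Poisson clutter
p_acquire = pd * p_gate * p_clean
print(f"P(successful handover) ~= {p_acquire:.3f}")
```

Widening the gate raises p_gate but admits more clutter, so the product exhibits the acquisition trade-off the abstract alludes to: handover performance depends jointly on track accuracy, detection probability, and the background density.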