The Air Force Research Laboratory (AFRL) is studying the application and utility of various ground-based and space-based
optical sensors for improving surveillance of space objects in both Low Earth Orbit (LEO) and Geosynchronous
Earth Orbit (GEO). This information can be used to improve our catalog of space objects and will be helpful in the
resolution of satellite anomalies. At present, ground-based optical and radar sensors provide the bulk of remotely sensed
information on satellites and space debris, and will continue to do so into the foreseeable future. However, in recent
years, the Space-Based Visible (SBV) sensor was used to demonstrate that a synthesis of space-based visible data with
ground-based sensor data could provide enhancements to information obtained from any one source in isolation. The
incentives for space-based sensing include improved spatial resolution due to the absence of atmospheric effects and
cloud cover and increased flexibility for observations. Though ground-based optical sensors can use adaptive optics to
somewhat compensate for atmospheric turbulence, cloud cover and absorption are unavoidable. With recent advances in
technology, we are in a far better position to consider what might constitute an ideal system to monitor our surroundings
in space. This work has begun at the AFRL using detailed optical sensor simulations and analysis techniques to explore
the trade space involved in acquiring and processing data from a variety of hypothetical space-based and ground-based
sensor systems. In this paper, we briefly review the phenomenology and trade space aspects of what might be required in
order to use multiple band-passes, sensor characteristics, and observation and illumination geometries to increase our
awareness of objects in space.
In order to understand the phenomenology of optimum data acquisition and analysis and to
develop an understanding of capabilities, field measurements of multiband, polarimetric data can
substantially assist in developing a methodology to collect and to exploit feature signatures.
In 1999, Duggin showed that images obtained with an 8-bit camera used as a polarimeter could
yield information additional to that contained in a radiometric (S0) image. It should be noted that
Walraven and Curran had performed some very fine experiments almost two decades earlier,
using photographic film, and North performed careful polarimetric measurements of the
skydome using a four-lens polarimetric film camera and convex mirror in 1997. There have been
a number of papers dealing with polarimetric field measurements since that time. Recently,
commercial color cameras have become available that have 12-bit depth per channel. Here, we
perform radiometric and chromatic calibrations and examine the possible use of a Nikon D200
10.2-megapixel, three-channel, 12-bit-per-channel camera fitted with a zoom lens as a potential field
imaging polarimeter. We show that there are still difficulties in using off-the-shelf technology for
field applications, but list some reasons why we need to address these challenges, in order to
understand the phenomenology of data collection and analysis metrics for multiple data streams.
The increasing availability of multispectral, hyperspectral, and multisensor imagery during the past decade has motivated rapid growth in image fusion research for remote sensing applications. While the goal of image fusion methods is generally to obtain more information from the combination of multiple images than could be obtained from the individual images, the measure of how well fused images actually achieve this goal is still largely subjective. Furthermore, in the
selection of image data, and the analytical procedures to process these data, we make a sequence of implicit assumptions that need to be reviewed. Metrics are used to specify the image characteristics necessary to perform specified tasks. New metrics are necessary to characterize the performance of image fusion techniques, and also to determine the extent to which these techniques may provide more useful information than could be derived from non-fused imagery. Without metrics, we cannot predict what data we need, or how to collect and analyze it. There is currently no metric that encompasses both spatial and spectral resolution characteristics. A metric describing the quality of polarimetric imagery is an example of the larger problem of metrics required to specify the necessary characteristics of fused,
multidimensional image data. Since polarimetric imagery is based upon the differences of image pairs obtained with the polarizer oriented orthogonally about the optic axis, misregistration introduces a false clutter that degrades the information content of polarimetric imagery, so that any polarimetric image quality characteristic will depend upon registration accuracy. A General Image Quality Equation (GIQE) is a multivariate regression of the image quality metric against the independent imaging parameters, such as registration in the case of polarimetric imagery. We need a GIQE for polarimetric images in which one regression term describes misregistration, and we need image quality metrics for polarimetric, multidimensional, and fused multidimensional image data. In this paper we consider what metrics are needed to design and collect data, what general principles should guide the collection and analysis of data, and polarimetric imagery as a simple example of a type of image fusion and analysis.
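The false clutter that misregistration introduces into a differenced image pair can be illustrated with a minimal numerical sketch. The scene below is a synthetic, fully unpolarized pattern (all values are assumptions for illustration), so a registered pair should difference to exactly zero; a one-pixel shift alone produces nonzero "clutter":

```python
import numpy as np

# Synthetic unpolarized scene: both orthogonal polarizer orientations see
# the same radiance, so a registered pair gives S1 = I0 - I90 = 0 everywhere.
x = np.linspace(0.0, 1.0, 64)
scene = np.outer(np.sin(4 * np.pi * x) + 2.0, np.cos(4 * np.pi * x) + 2.0)
I0 = 0.5 * scene
I90 = 0.5 * scene

S1_registered = I0 - I90                          # true signal: identically zero
S1_misregistered = I0 - np.roll(I90, 1, axis=1)   # 1-pixel shift before differencing

clutter_registered = S1_registered.std()          # 0.0: no false clutter
clutter_misregistered = S1_misregistered.std()    # nonzero: clutter from the shift alone
```

Because the misregistration clutter scales with the local scene gradient, it is structured like the scene itself, which is why it degrades feature discrimination rather than averaging out as random noise would.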
KEYWORDS: Polarimetry, Sensors, Point spread functions, Polarizers, Target detection, Spatial resolution, Signal to noise ratio, Reflectivity, Calibration, Linear polarizers
The discrimination of scene elements in polarimetric and in non-polarimetric images is governed by both environmental and instrumental factors. These factors consist of systematic elements, which are dealt with by means of appropriate calibration, and random errors. In the case of imaging polarimetry, the Stokes parameter images are calculated from images obtained with orthogonal orientations of the linear polarizer about the optic axis. For the Stokes images to contain significant information, the orthogonal, registered image pairs from which the Stokes images S1 and S2 are calculated must be significantly different. Misregistration of the orthogonal input images also impacts the correlation of the resulting Stokes image to scene elements. The system MTF, sampling pattern, and geometry further impact the discrimination of features in the scene. These factors are discussed. The effects of systematic and random error sources on the discriminability of resolved targets from a clutter background are considered in depth. While the issue of spatially unresolved target detection is considered, it does not form a major component of this discussion. The intent of these considerations of the physics and phenomenology of imaging polarimetry is to progress towards the predictive modeling of target discriminability, which will aid in sensor design and mission parameter optimization.
The automated, or semi-automated, analysis of scene elements in a clutter background is more complex in polarimetric imaging than in conventional imaging. This is largely because misregistration of the orthogonal images used to calculate the Stokes parameter images introduces an artificial clutter. Further, there is little reported information on polarimetric image clutter. We present representative findings from an analysis of polarimetric image data, obtained over various backgrounds with various geometries, and examine the manner in which systematic and random variations impact feature discrimination.
The factors governing the extraction of useful information from polarimetric images depend upon the image acquisition and analytical methodologies being used, and upon systematic and environmental variations present during the acquisition process. The acquisition process generally occurs with foreknowledge of the analysis to be used. Broadly, interactive image analysis and automated image analysis are two different procedures: in each case, there are technical challenges. Imaging polarimetry is more complex than other imaging methodologies, and produces an increased dimensionality. However, there are several potential broad areas of interactive (manual) and automated remote sensing in which imaging polarimetry can provide useful additional information. A review is presented of the factors controlling feature discrimination, of metrics that are used, and of some proposed directions for future research.
The integration and calibration of a hyperspectral imaging polarimeter is described. The system was designed to exploit subtle spectral details in visible and near-IR hyperspectral polarimetric images. All of the system components were commercial-off-the-shelf. This device uses a tunable liquid crystal filter and 16-bit cooled CCD camera. The challenges of calibrating a hyperspectral polarimeter are discussed.
A brief history of radar development is followed by an indication of the relevance of lidar to the ranging and detection of targets. Initially, radiated laser power is discussed: peak power of 100 kilowatts with a diode-pumped solid-state laser appears feasible. Frequency control appears possible with atomic standards controlling the high-power laser. Optical characterization of the polarization properties of laser light on targets is being pursued, along with the available options. The coherence length of laser radiation still poses a problem over ranges beyond one hundred meters. Target identification is enhanced using polarization with the aid of higher-resolution focal plane arrays. Coherence applications appear feasible in the near future.
Focal plane wideband IR imagery will be compared with visual wideband focal plane digital imagery of a camouflaged B-52 bomber. Extreme enhancement is possible using digital polarized imagery. The experimental observations will be compared with theoretical calculations and modeling results for both specular and shadowed areas, to allow extrapolation to the synthesis of the optical polarization signatures of other aircraft. The relationship of both the specular and the shadowed areas to surface structure, orientation, specularity, roughness, shadowing, and the complex index of refraction will be illustrated. The imagery was obtained in two plane-polarized directions. Many aircraft locations were measured, as well as the sky background.
Relatively little work has been performed to investigate the potential of polarization techniques to provide contrast enhancement in natural scenes. Historically, this has been because film is less accurate radiometrically than digital CCD FPA sensing devices. Such enhancement is additional to that provided by between-band differences for multiband data. In the mid-1990s, Kodak developed several digital imaging cameras intended for professional photographers. The variant we used produced images in the green, red, and near IR, simulating CIR film. However, the application of linear drivers to read the data from the camera into the computer resulted in a device which can be used as a portable multiband imaging polarimeter. Here we present examples to examine the potential of digital image acquisition as a quantitative method to obtain new information on natural landscapes, additional to that obtained by multiband or even hyperspectral imaging methods.
There is evidence that polarimetric contrast differences in images are band-dependent. Some previous evidence is reviewed. We fabricated and tested a hyperspectral imaging polarimeter that we reported previously. This device uses a tunable liquid crystal filter and a 16-bit camera, and is designed to work in the visible and in the near-IR spectral region. We present here some examples of imagery collected with this sensor, and show how these data may be used to provide superior target discrimination if used selectively, with the appropriate algorithms.
There is evidence that polarimetric as well as intensity contrast differences in images are band-dependent. Some previous evidence is reviewed. We are currently fabricating and testing a hyperspectral imaging polarimeter to assess and to take advantage of subtle spectral detail in hyperspectral polarimetric images. This device uses a tunable liquid crystal filter and a 16-bit camera. New and varied calibration challenges have arisen and are discussed. We consider it important to present the problems as well as the successes.
KEYWORDS: Cameras, Calibration, Polarimetry, Reflectivity, Optical filters, Polarization, Digital cameras, Near infrared, Linear polarizers, Light sources and illumination
Digital cameras can be used as imaging polarimeters, as previously reported. We compare the radiometric characteristics of the three-band digital cameras with eight- and ten-bit radiometric precision that we have used as imaging polarimeters. We also discuss preliminary calibrations of a hyperspectral imaging polarimeter based upon a 16-bit camera and an LCD tunable filter. Examples are shown to illustrate the camera radiometric characteristics and the types of calibration procedure needed for imaging polarimetry in natural and in artificial light. Procedures using primary and secondary calibration standards are discussed. The impacts of radiometric fall-off with increasing field angle, and of the band-dependent and intensity-dependent asymmetries experienced with the hyperspectral polarimeter, are discussed. We present some examples of the detail afforded in artificial illumination by cameras offering 8-bit and 12-bit radiometric depth. The importance of multiband polarimetric images, which has been stressed before, is again demonstrated for artificially illuminated scenarios.
Focal plane wideband infrared digital polarization imagery will be compared with visual wideband focal plane digital imagery of a camouflaged C-130 aircraft to show the extreme enhancement possible using digital imagery. The experimental observations will be compared with theoretical calculations and modeling results for both specular and shadowed areas. The relationship of both the specular and the shadowed areas to surface structure, orientation, specularity, roughness, shadowing, and the complex index of refraction will be illustrated.
The imagery was obtained in four plane-polarized directions, with axes oriented vertically, horizontally, and at plus and minus 45 degrees to the vertical. Nine locations on the aircraft were chosen (tail, fuselage, wing, and propeller), as well as five sky locations to establish the sky background. Both sunlit and shadowed locations were examined. The direction of the dominant plane of polarization was obtained, but not the existence of circular polarization, which requires a quarter-wave plate to resolve temporal coherence. Unpolarized radiation exists in the imagery, but its coherence is not evident without a phase-resolving element.
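From four such orientations, the dominant plane of polarization follows from the standard relation ψ = ½·arctan2(S2, S1). A minimal sketch with assumed single-pixel intensities (illustrative values, not measurements from the aircraft imagery):

```python
import numpy as np

def polarization_angle_deg(I0, I45, I90, I135):
    """Dominant plane of polarization from four-orientation measurements."""
    S1 = I0 - I90
    S2 = I45 - I135
    return np.degrees(0.5 * np.arctan2(S2, S1))

# Light polarized at +45 deg: the +45 channel is maximal and I0 == I90.
psi = polarization_angle_deg(I0=0.5, I45=1.0, I90=0.5, I135=0.0)
```

The arctan2 form resolves the quadrant ambiguity that a plain arctangent of S2/S1 would leave; as the text notes, circular polarization is still inaccessible without a quarter-wave plate.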
In 1997 and in 1998, EMERGE obtained multi-altitude digital image data over several sites, including Oneida County Airport, using a calibrated Kodak DCS 460 CIR camera. This work formed part of a larger study. During a graduate research project, we examined two multi-altitude color infrared digital image sets: one was obtained under partly cloud-shadowed conditions, and the other was obtained an hour later, under cloud-free conditions. In each case, we analyzed the uncorrected images obtained at each altitude, as well as the same images corrected for the bandpass-dependent lens fall-off with field angle. The digital radiance obtained at each altitude over selected vegetation and over other targets was used to deduce the normalized difference vegetation index (NDVI). The digital radiance and the NDVI for both the raw and the corrected images were plotted as a function of altitude. It was possible to see the impact of atmospheric differences between acquisitions, and to study the effects of lens fall-off correction, as well as the effects of cloud shadow and sun-ground-sensor geometry on the NDVI. We report only part of the study. The dependence of digital radiance and NDVI on radial distance from the image center, and on the radial distance times the sine and cosine of the azimuth of each region of interest with respect to the perpendicular to the solar plane, is mentioned; however, these results form a data set too large to include in its entirety here. In concurrent studies, described in these proceedings, we also analyzed multi-altitude data over forest and over agricultural targets, and studied the effects of site location in the image, altitude, and cloud shadow on the contrast between scene elements. The reported results are based on one of only a very few multi-altitude studies, and have implications for all other imaging sensors.
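The NDVI referred to above is the standard normalized difference of near-IR and red radiance. A minimal sketch with illustrative values (not radiances from the EMERGE data):

```python
import numpy as np

# NDVI from near-IR and red digital radiance (values are illustrative).
def ndvi(nir, red):
    return (nir - red) / (nir + red)

nir = np.array([0.50, 0.45, 0.40])   # vegetation reflects strongly in the NIR
red = np.array([0.08, 0.10, 0.30])   # chlorophyll absorbs in the red
v = ndvi(nir, red)                   # higher values indicate denser, healthier vegetation
```

Because NDVI is a ratio of a difference to a sum, multiplicative effects common to both bands cancel, but band-dependent effects such as lens fall-off or differential atmospheric path do not, which is why the fall-off correction and altitude dependence matter.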
We describe work performed to calibrate digital cameras for band-dependent lens fall-off. We also report studies of the effects of altitude on digital camera data due to the change in size and content of the ground instantaneous field of view (GIFOV), and due to the changing atmospheric path with altitude. We report measurements of calibration targets, and of trees and crops. We discuss the variation of signal with view geometry and field angle. We show that correction for the band-dependent lens fall-off improves the appearance of images, and the uniformity of derived vegetation indices across images. We report on the impact of cloud shadow on vegetation index, and on the implications for flying height.
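A hedged sketch of the fall-off correction described above, assuming a simple cos^n model of fall-off with field angle: n = 4 for an ideal lens, and a per-band fitted exponent in practice. The exponent 4.2 below is an illustrative assumption, not a value fitted in the study:

```python
import numpy as np

def correct_falloff(dn, theta_rad, n):
    """Divide out the modeled fall-off to recover the on-axis-equivalent signal."""
    return dn / np.cos(theta_rad) ** n

theta = np.radians(20.0)                     # region of interest 20 deg off-axis
dn_measured = 100.0 * np.cos(theta) ** 4.2   # synthetic "measured" digital number
dn_corrected = correct_falloff(dn_measured, theta, n=4.2)
```

Since the exponent differs between bands, applying a single correction to all bands would bias band ratios such as NDVI toward the image edges; a per-band exponent keeps the derived indices uniform across the frame.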
Recent work has shown that the use of a calibrated digital camera fitted with a rotating linear polarizer can facilitate the study of Stokes parameter images across a wide dynamic range of scene radiance values. Here, we show images of Macbeth color chips, Spectralon gray-scale targets, and Kodak gray cards. We also consider a static aircraft mounted on a platform against a clear-sky background. We show that the contrast in polarization is greater than that in intensity, and that polarization contrast increases as intensity contrast decreases. We also show that there is great variation in polarization within and between each of the bandpasses; this variation is comparable in magnitude to the variation in intensity.
KEYWORDS: Polarization, Cameras, Polarimetry, Optical filters, Near infrared, Vegetation, Reflectivity, Linear polarizers, Digital cameras, Digital imaging
Many measurements of polarization have been made with non-imaging polarimeters, in the laboratory, of the sky, and of the ground. These measurements can be interpreted only when subsidiary information enables identification of the surface under study. Some measurements have been made with imaging polarimeters based upon film, but these were limited in radiometric accuracy by the medium, or by a lack of sensitometry. Some investigators fabricated a polarimeter from vidicon cameras, but that study was also limited in radiometric fidelity. With the advent of digital cameras with a linear focal plane radiometric response, and software retaining this linearity in extracting the image from the camera, greater radiometric accuracy has been achieved. We report here measurements of polarization that we show to be related to scene radiance. The radiance levels cover a wide dynamic range and facilitate the study of low radiance levels previously inaccessible, in general, to measurement using an imaging device. We also include data from previous measurements with non-imaging devices and show that they are compatible with data collected using a digital camera. There is an inverse linear relationship between the logarithm of the polarization in recorded radiance and the logarithm of the recorded radiance, in data obtained with both imaging and non-imaging polarimeters.
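The inverse linear log-log relationship reported above can be illustrated with a synthetic straight-line fit. The slope of −1 used here is an assumption for illustration only; the abstract establishes the form of the relation, not this particular slope:

```python
import numpy as np

# Synthetic data lying exactly on a log-log line of slope -1.
radiance = np.logspace(0.0, 3.0, 20)   # recorded radiance, arbitrary units
polarization = 50.0 / radiance         # percent polarization, by construction

# Straight-line fit in log-log space recovers the assumed slope.
slope, intercept = np.polyfit(np.log10(radiance), np.log10(polarization), 1)
```

Fitting in log-log space is the natural choice here because a power-law relation between polarization and radiance appears as a straight line whose slope is the exponent.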
Relatively little work has been performed to investigate the potential of polarization techniques to provide contrast enhancement in natural scenes. Largely, this is because film is less accurate radiometrically than digital CCD FPA sensing devices. Such enhancement is additional to that provided by between-band differences for multiband data. Recently, Kodak has developed several digital imaging cameras which were intended for professional photographers. The variant we used produced images in the green, red and near IR, simulating CIR film. However, the application of linear drivers to read the data from the camera into the computer has resulted in a device which can be used as a multiband imaging polarimeter. Here we examine the potential of digital image acquisition as a potential quantitative method to obtain new information additional to that obtained by multiband or even hyperspectral imaging methods. We present an example of an active on-going research program.
Relatively little work has been performed to investigate the potential of polarization techniques to provide contrast enhancement information for vegetation mapping, and for vegetation condition assessment. Largely, this is because film is less accurate radiometrically than digital FPA sensing devices. Since polarization studies necessitate the differencing of images obtained with a linear polarizer rotated about the optic axis of a camera between sequential exposures, and since some of the differences are small, film has generally lacked the radiometric accuracy needed to reliably record such differences. Kodak has developed a high-spatial-resolution camera, and the development of linear drivers to read the data from the camera into the computer has resulted in a device which can be used as a multiband imaging polarimeter. Here we examine the potential of digital image acquisition as a quantitative method to obtain new information uncorrelated with that obtained by more conventional multiband imaging methods. Such information can potentially be used to form more sensitive vegetation indices, to differentiate species, and to penetrate the canopy. We present promising examples from an active, on-going research program.
The computer code SENSAT, developed for radiometric investigations in remote sensing, was extended to include two statistical clutter models of infrared background and the prediction of target detection probability. The first model is based on the standard deviation of scene clutter estimated from scene data; the second is based on the power spectral density of different classes of IR background as a function of temporal or spatial frequency. The overall code consists of modules describing the optoelectronic sensor (optics, detector, signal processor), a radiative transfer code (MODTRAN) to include atmospheric effects, and a scene module consisting of target and background. The scene is evaluated one pixel at a time; however, a sequence of pixels can be simulated by varying the range, view angle, atmospheric condition, or clutter level. The target consists of one or two subpixel surface elements, and the remaining part of the pixel represents background. Multiple paths, e.g. sun-ground-target-sensor, can also be selected. An expert system, based upon the IDL language, provides user-friendly input menus, performs consistency checks, and submits the required MODTRAN and SENSAT runs. A sample case of the detection probability of a sub-pixel target in a cluttered marine background is discussed.
Model calculations of upwelling spectral radiances at aircraft and satellite altitudes have been made to assess the capability of different current and planned sensors to extract information on the atmospheric aerosols. The visible and near infrared channels on the AVHRR, CZCS, and SeaWiFS satellite sensors were used, as well as hypothetical multichannel instruments covering 400 - 1000 nm with bandwidths of 100, 20, or 10 nm. The sensitivity to the aerosol and environmental properties increased as the bandwidth of the channel decreased.