A dual-wavelength circular scanner with collinear transmit and receive axes has been developed for use in the SEAHAWK bathymetric lidar. The scanner optics consist of an achromatic prism pair located concentrically within an 11.3-inch diameter dual-zone holographic optical element (HOE). This scanner achieves coaligned green and infrared beams at a 20° off-nadir scan angle when using a 50 W dual-wavelength laser (30 W @ 532 nm and 20 W @ 1064 nm) as the transmitter. The main engineering challenges in achieving the design were minimizing the optical pointing error between the four optical axes (two transmit and two receive) and developing a rugged prism pair design sufficient to withstand the high laser power. The design proved sensitive to fabrication and alignment errors, so success depended on analyzing optical and mechanical tolerances, acknowledging fabrication limitations, measuring critical optical components, tailoring the design to the as-built components, and utilizing a custom alignment fixture featuring a digital autocollimator. Final measurements of the deployed scanner indicate an optical pointing error of less than 0.06° (about 1 mrad) in cone half-angle.
SEAHAWK is a high-performance, low-SWAP LIDAR for real-time topographic and bathymetric 3D mapping applications. Key attributes include real-time waveform and point cloud processing, real-time calculation of total propagated uncertainty (TPU), a novel co-located green and infrared transceiver architecture based on a 12” circular scanner with holographic optical element (HOE), an ultra-compact Cassegrain telescope, a custom detector architecture with dynamic load modulation (DLM), and analog-to-digital converters providing improved resolution, dynamic range, and sensitivity. SEAHAWK’s design yields higher sea-surface detection percentages than other circular scanning LIDARs and thereby enables more robust sea-surface correction strategies. The real-time point clouds provide sensor operators with immediate, actionable intelligence about data quality while the aircraft remains on-station.
The Asia-Pacific ocean region is one of the areas where airborne lidar is a promising tool for depth measurement. The anticipated efficiency of a laser bathymetry survey of a coastal zone in the region varies with the optical characteristics of the water. Near-shore waters in open areas of several countries (Philippines, Indonesia, Taiwan, and the east coast of South Korea) may be described as Class II in the Jerlov classification 1 (turbid tropical-subtropical water), while water properties in internal seas are described as Classes 1 to 9 (coastal waters of increasing turbidity); the optical characteristics of the coastal waters of the East China Sea are beyond the Jerlov classification. In this paper, the applicability of the CZMIL (Coastal Zone Mapping and Imaging Lidar) 2, 3 system developed by Optech for lidar bathymetry in the Asia-Pacific region is considered. The Optech CZMIL has several attributes that significantly improve seafloor detectability in shallow and, in particular, turbid waters, namely a high-energy laser, a short system response function, increased receiver sensitivity, and high point density. The system capability was tested in a relatively turbid area of the Gulf Coast of Mississippi. The maximum depth achievable with the CZMIL system is estimated theoretically for various countries, accounting for the spatial and seasonal variability of the inherent optical properties of near-shore water.
Bathymetric lidar has been widely used for ocean floor mapping. By identifying two distinctive return peaks, one from the water surface and the other from the bottom, the water depth can be estimated. In addition to bathymetry, it is also possible to estimate the optical properties of the water by analyzing the lidar return waveform. Only the few systems (e.g., Optech’s SHOALS and CZMIL systems) that have good radiometric calibration demonstrate the capability to produce the water’s inherent optical properties and bottom reflectance. As the laser pulse propagates through the water, it is scattered by the water constituents. The directional distribution of scattered radiant power is determined by the volume scattering function. Only the backscattering within a very narrow solid angle around the 180° scattering angle travels back to the detector. During the two-way travel it experiences the same optical interactions (absorption and scattering) with the water constituents. Thus, the lidar return waveform between the surface and bottom peaks contains information about the vertical distribution of the water attenuation coefficient and the backscattering coefficient, encoded in the rate of change of the return power. One challenge is how to estimate the inherent attenuation from the apparent attenuation. In this research we propose a technique to estimate the true water attenuation coefficient from the total system attenuation. We use a lidar waveform simulator that solves the irradiance distribution on the beam cross-section using an analytical Fourier transform of the radiance based on a single-scattering approximation.
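As a minimal sketch of the underlying idea (not the authors' Fourier-transform simulator or inversion), the apparent (system) attenuation can be estimated from the rate of decay of the return power between the surface and bottom peaks. The synthetic waveform, parameter values, and function names below are illustrative assumptions.

```python
import numpy as np

C_WATER = 2.25e8   # approximate speed of light in water, m/s
DT = 1e-9          # 1 ns sampling interval

def system_attenuation(waveform, surface_idx, bottom_idx, dt=DT, c_w=C_WATER):
    """Fit ln(P) vs. depth over the volume-backscatter segment; -slope/2
    approximates the one-way system attenuation coefficient (1/m)."""
    seg = np.arange(surface_idx + 5, bottom_idx - 5)   # stay clear of both peaks
    depth = 0.5 * c_w * (seg - surface_idx) * dt       # two-way time -> depth
    slope, _ = np.polyfit(depth, np.log(waveform[seg]), 1)
    return -0.5 * slope

# Synthetic example: surface peak, decaying volume backscatter, bottom peak.
n = np.arange(400)
depth = np.clip(0.5 * C_WATER * (n - 50) * DT, 0.0, None)
wf = (1e-4                                             # noise floor
      + np.exp(-0.5 * (n - 50) ** 2 / 2.0)             # surface return
      + 0.3 * np.exp(-2 * 0.20 * depth)                # water column (k = 0.20)
      + 0.1 * np.exp(-0.5 * (n - 150) ** 2 / 2.0))     # bottom return

print(f"estimated system attenuation ~ {system_attenuation(wf, 50, 150):.2f} 1/m")
```

The estimate recovered this way is the apparent attenuation; relating it to the inherent attenuation coefficient is the harder problem the abstract addresses.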
Airborne bathymetric lidar (Light Detection and Ranging) systems record, at the photocathode, the photoelectrons of a returned laser pulse along the optical path (range and angle) at high rates, typically every nanosecond. The collected measurement of a single pulse, recorded as a time series, is called a waveform. Based on the calibration of the lidar system, the return signal is converted into units of received power. This converted value from the lidar waveform data is used to
compute an estimate of the reflectance from the returned backscatter, which contains environmental information from
along the optical path. This concept led us to develop a novel tool to visualize lidar data in terms of the returned
backscatter, and to use this as a data analysis and editing tool. The full lidar waveforms along the optical path, from laser
points collected in the region of interest (ROI), are voxelized into a 3D image cube. This allows lidar measurements to
be analyzed in three orthogonal directions simultaneously. The laser pulse return (reflection) from the seafloor is visible
in the waveform as a pronounced "bump" above the volume backscatter. Floating or submerged objects in the water may
also be visible. Similarly, forest canopies and tree branches can be identified in the 3D voxelization. This paper discusses
the possibility of using this unique three-orthogonal volume visualizing tool to extract environmental information for
carrying out rapid environmental assessments over forests and water.
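A minimal voxelization sketch follows (an assumed interface, not the tool described above): each waveform sample is placed along its optical path and accumulated into a 3D grid, which can then be sliced along the three orthogonal axes. Refraction at the air-water interface and other real-world corrections are ignored here.

```python
import numpy as np

def voxelize(origins, directions, waveforms, bounds, res, dt=1e-9, c=3e8):
    """origins: (N,3) beam origins; directions: (N,3) unit vectors;
    waveforms: (N,M) samples; bounds: (3,2) min/max per axis; res: voxel size (m)."""
    bounds = np.asarray(bounds, float)
    mins = bounds[:, 0]
    shape = np.ceil((bounds[:, 1] - mins) / res).astype(int)
    cube = np.zeros(shape)
    count = np.zeros(shape)
    sample_range = 0.5 * c * dt * np.arange(waveforms.shape[1])   # one-way range per sample
    for o, d, wf in zip(origins, directions, waveforms):
        pts = o + sample_range[:, None] * d                       # (M,3) sample positions
        idx = ((pts - mins) / res).astype(int)
        ok = np.all((idx >= 0) & (idx < shape), axis=1)
        np.add.at(cube, tuple(idx[ok].T), wf[ok])                 # accumulate power
        np.add.at(count, tuple(idx[ok].T), 1)
    return np.divide(cube, count, out=np.zeros_like(cube), where=count > 0)
```

Slicing the returned cube along each axis yields the three orthogonal views of the returned backscatter described above, in which the seafloor "bump", submerged objects, or canopy structure can be inspected.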
CZMIL is an integrated lidar-imagery system and software suite designed for highly automated generation of physical and environmental information products for coastal zone mapping in the framework of the US Army Corps of Engineers (USACE) National Coastal Mapping Program (NCMP). This paper presents the results of CZMIL system validation in turbid water conditions along the Gulf Coast of Mississippi and in relatively clear water conditions in Florida in late spring 2012. Results of the USACE May-October 2012 mission in Green Bay, WI and Lake Erie are also presented. The system performance tests show that CZMIL successfully achieved 7-8 m depths in Mississippi with Kd = 0.46 m⁻¹ (Kd is the diffuse attenuation coefficient) and up to 41 m in Florida when Kd = 0.11 m⁻¹. Bathymetric accuracy of CZMIL was measured by comparing CZMIL depths with multi-beam sonar data from Cat Island, MS and from off the coast of Fort Lauderdale, FL. Validation demonstrated that CZMIL meets USACE specifications (two standard deviations, 2σ, ~30 cm). To measure topographic accuracy we made direct comparisons of CZMIL elevations to GPS-surveyed ground control points and vehicle-based lidar scans of topographic surfaces. Results confirmed that CZMIL meets the USACE topographic requirements (2σ, ~15 cm). Upon completion of the Green Bay and Lake Erie mission, 89 flights comprising 2,231 flightlines had been flown. Total aircraft engine time, which does not include all transit/ferry flights, was 441 hours, with 173 hours on survey flightlines. The 4.8 billion laser shots and 38.6 billion digitized waveforms covered over 1,025 miles of shoreline.
CZMIL is an integrated lidar-imagery sensor system and software suite designed for the highly automated generation of physical and environmental information products for mapping the coastal zone. This paper presents the results of CZMIL system validation in turbid water conditions on the Gulf Coast of Mississippi and in relatively clear water conditions in Florida in late spring 2012. The system performance tests show that CZMIL successfully achieved 7-8 m depths in Mississippi, where Kd = 0.46 m⁻¹ (Kd is the diffuse attenuation coefficient), and up to 41 m in Florida, where Kd = 0.11 m⁻¹. Using a seven-segment array for topographic mode and the shallow-water zone, CZMIL generated high-resolution products at a maximum pulse rate of 70 kHz, and at 10 kHz in the deep-water zone. The diffuse attenuation coefficient, bottom reflectance, and other environmental parameters for the entire multi-km² area were estimated through fusion of the lidar and CASI-1500 hyperspectral camera data.
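As an illustrative check (my assumption, not an analysis from the papers above), the product of Kd and maximum surveyed depth is a common figure of merit for bathymetric lidars; the snippet below computes it for the depths and attenuation coefficients reported in these validation results.

```python
# Kd (1/m) and approximate maximum surveyed depth (m) reported above.
results = {"Mississippi (turbid)": (0.46, 8.0),
           "Florida (clear)":      (0.11, 41.0)}

for site, (kd, d_max) in results.items():
    print(f"{site:22s}  Kd*Dmax ~ {kd * d_max:.1f}")
```

Both sites fall in the range of roughly 3.5 to 4.5 attenuation lengths, consistent with a high-energy bathymetric lidar.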
In the atmospheric correction of the CASI hyperspectral image, we found that the dominant factor is downward scattering of the direct solar beam into exactly the direction that is reflected at the water surface toward the sensor. The downward scattering angle was calculated using navigation data, viewing geometry, and solar ephemeris. One benefit of this approach is that it avoids the limitations of the dark-pixel method: because the scattering angle is computed from geometry alone, it is free of the difficulties encountered with the dark-pixel approach. In this paper, we illustrate the computational procedure and show examples of marine remote sensing data.
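A purely geometric sketch of this kind of calculation is shown below (an assumed formulation, not the authors' exact procedure): the angle by which the direct solar beam must be scattered so that, after specular reflection at a flat water surface, it reaches the sensor. Angles are in degrees; azimuths are measured clockwise from north.

```python
import numpy as np

def unit_from_zenith_azimuth(zenith_deg, azimuth_deg, downward=True):
    """Unit vector in an east-north-up frame for a given zenith/azimuth."""
    z, a = np.radians(zenith_deg), np.radians(azimuth_deg)
    sign = -1.0 if downward else 1.0
    return np.array([np.sin(z) * np.sin(a), np.sin(z) * np.cos(a), sign * np.cos(z)])

def downward_scattering_angle(sun_zen, sun_az, view_zen, view_az):
    sun = unit_from_zenith_azimuth(sun_zen, sun_az, downward=True)
    # Direction of light that specularly reflects into the sensor: the mirror of
    # the surface-to-sensor direction about the horizontal water surface.
    to_sensor = unit_from_zenith_azimuth(view_zen, view_az, downward=False)
    spec_in = to_sensor * np.array([1.0, 1.0, -1.0])
    return np.degrees(np.arccos(np.clip(np.dot(sun, spec_in), -1.0, 1.0)))

# Example geometry (illustrative values only).
print(downward_scattering_angle(sun_zen=35, sun_az=150, view_zen=20, view_az=0))
```

In practice the sun position would come from the solar ephemeris and the view direction from the navigation and boresight data, as described above.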
We extend data fusion from the pixel level to the more semantically meaningful blob level, using the mean-shift algorithm to form labeled blobs having high similarity in the feature domain and connectivity in the spatial domain. We have also developed Bhattacharyya Distance (BD) and rule-based classifiers, and have implemented these higher-level data fusion algorithms in the CZMIL Data Processing System. Applying these new algorithms to recent SHOALS and CASI data at Plymouth Harbor, Massachusetts, we achieved improved benthic classification accuracies over those produced with either single sensor or with pixel-level fusion strategies. These results appear to validate the hypothesis that classification accuracy may be generally improved by adopting higher spatial and semantic levels of fusion.
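For reference, a minimal sketch of a Bhattacharyya-distance classifier over blob feature statistics is given below, using the standard closed form for multivariate Gaussians. The class prototypes, feature values, and the nearest-class assignment rule are illustrative assumptions, not the classifiers developed for the DPS.

```python
import numpy as np

def bhattacharyya_distance(mu1, cov1, mu2, cov2):
    """Bhattacharyya distance between two multivariate Gaussian distributions."""
    mu1, mu2 = np.asarray(mu1, float), np.asarray(mu2, float)
    cov = 0.5 * (np.asarray(cov1, float) + np.asarray(cov2, float))
    diff = mu2 - mu1
    term1 = 0.125 * diff @ np.linalg.solve(cov, diff)
    term2 = 0.5 * np.log(np.linalg.det(cov) /
                         np.sqrt(np.linalg.det(cov1) * np.linalg.det(cov2)))
    return term1 + term2

# A blob is assigned the class whose prototype is nearest in Bhattacharyya distance.
classes = {"sand":     (np.array([0.35, 0.30]), np.eye(2) * 0.01),
           "seagrass": (np.array([0.10, 0.18]), np.eye(2) * 0.02)}
blob_mu, blob_cov = np.array([0.32, 0.28]), np.eye(2) * 0.015
label = min(classes, key=lambda c: bhattacharyya_distance(blob_mu, blob_cov, *classes[c]))
print(label)   # -> sand
```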
Integration of a bathymetric lidar and imaging spectrometer in CHARTS presented the challenge of developing new
algorithms and software for combining these two types of data. To support this development, we conducted several
field campaigns to collect airborne and in-situ data of the water column and seafloor. This work, sponsored by the Office of Naval Research (ONR), led to development of the Rapid Environmental Assessment (REA) processor. REA can be used to produce seafloor reflectance images from both sensors, as well as classification maps of the seafloor.
Ultimately, REA became the prototype software for CZMIL, and the CZMIL Data Processing System (DPS) has been
produced as a continuous refinement of REA. Here, we describe the datasets collected and illustrate results achieved
with the REA software.
A significant challenge in the CZMIL program was to develop a topographic/bathymetric lidar delivering high spatial
resolution 3D data in shallow, turbid waters, without sacrificing performance in deeper waters.
To support analysis of the trade space inherent in the design process, we developed a waveform simulator capable of
predicting CZMIL waveforms by varying parameters of the physical design and environmental properties of the seafloor
and water column.
Here, we describe the predicted performance of the proposed hardware and algorithms for generating seafloor point
clouds in a number of simulated environments.
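A greatly simplified waveform model in the same spirit is sketched below (not the CZMIL simulator): a Gaussian surface return, exponentially attenuated volume backscatter, and a bottom return attenuated by the two-way path. All parameter values and names are illustrative assumptions used only to show how environmental properties shape the waveform.

```python
import numpy as np

def simulate_waveform(depth_m, k_sys=0.3, bottom_refl=0.15, pulse_ns=2.0,
                      n_samples=512, dt_ns=1.0, c_w=0.225):   # c_w in m/ns
    t = np.arange(n_samples) * dt_ns
    t_surf = 50.0
    t_bot = t_surf + 2.0 * depth_m / c_w              # two-way travel time to bottom
    z = np.clip(0.5 * c_w * (t - t_surf), 0.0, depth_m)
    surface = 1.0 * np.exp(-0.5 * ((t - t_surf) / pulse_ns) ** 2)
    column = 0.2 * np.exp(-2.0 * k_sys * z) * (t > t_surf) * (t < t_bot)
    bottom = bottom_refl * np.exp(-2.0 * k_sys * depth_m) \
             * np.exp(-0.5 * ((t - t_bot) / pulse_ns) ** 2)
    return t, surface + column + bottom + 1e-4        # small noise floor

t, wf = simulate_waveform(depth_m=12.0, k_sys=0.25, bottom_refl=0.2)
print(f"{len(wf)} samples; surface peak at {t[np.argmax(wf)]:.0f} ns")
```

Varying depth, attenuation, and bottom reflectance in such a model shows how quickly the bottom return sinks toward the noise floor, which is the trade space the CZMIL simulator was built to explore in far greater fidelity.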
KEYWORDS: Cameras, CZMIL, Imaging systems, Calibration, Camera shutters, Digital cameras, CCD cameras, Spatial resolution, LIDAR, Signal to noise ratio
The Coastal Zone Mapping and Imaging Lidar (CZMIL) is a multi-sensor airborne system with dedicated data fusion
software producing 3D images and maps of environmental parameters describing the beach, seafloor and water
column. To reduce overall program development risk, a commercial off-the-shelf (COTS) imaging spectrometer and
digital metric camera are used. These imagers are installed on the same mounting plate as the lidar so as to share
navigation data from a single inertial measurement unit (IMU). In this paper, we discuss the capabilities of the passive
imagers as they relate to spatial and spectral requirements of the U.S. Army Corps of Engineers (USACE) mission,
and illustrate the anticipated data coverage based on the expected deployment mode.
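As a rough illustration of how coverage follows from the deployment geometry, the sketch below computes swath width and ground sample distance for a pushbroom imager from altitude, field of view, pixel count, speed, and line rate. All numeric values are assumptions for illustration, not CZMIL or USACE specifications.

```python
import math

def pushbroom_coverage(altitude_m, fov_deg, n_pixels, speed_mps, line_rate_hz):
    """Swath width and cross-/along-track ground sample distance (flat terrain)."""
    swath = 2.0 * altitude_m * math.tan(math.radians(fov_deg / 2.0))
    gsd_cross = swath / n_pixels
    gsd_along = speed_mps / line_rate_hz
    return swath, gsd_cross, gsd_along

swath, gx, gy = pushbroom_coverage(400, 40.0, 1500, 60.0, 30.0)
print(f"swath ~ {swath:.0f} m, cross-track GSD ~ {gx:.2f} m, along-track ~ {gy:.2f} m")
```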
We have developed a combined atmospheric-oceanographic spectral optimization solution that decomposes measured airborne radiance data from the passive spectrometer into environmental parameters of interest. In this model, we hold the depth measurements from the lidar as fixed constraints, thereby gaining a degree of freedom in the solution and extending the solution to deeper waters than can be achieved with passive data alone. In this paper, we illustrate results of the data processing procedure and assess the accuracy of the estimated inherent optical properties (IOPs) through comparison to in-situ measurements.
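A minimal sketch of the depth-constrained idea follows (not the CZMIL optimization model): a simple shallow-water reflectance form, R(λ) = R_deep(λ)(1 − e^(−2Kd(λ)H)) + (ρ_b(λ)/π) e^(−2Kd(λ)H), is fit to an observed spectrum with the depth H held fixed at the lidar value, so only the water and bottom parameters remain free. The spectral shapes, scale factors, and wavelengths are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import least_squares

wavelengths = np.array([440., 490., 532., 560., 620.])
basis_kd   = np.array([0.08, 0.06, 0.07, 0.09, 0.30])    # assumed Kd spectral shape
basis_rb   = np.array([0.10, 0.14, 0.18, 0.20, 0.22])    # assumed bottom albedo shape
basis_deep = np.array([0.015, 0.012, 0.010, 0.008, 0.004])

def model(params, depth_m):
    a, b, c = params                      # scale factors on the three basis shapes
    kd, rb, r_deep = a * basis_kd, b * basis_rb, c * basis_deep
    att = np.exp(-2.0 * kd * depth_m)
    return r_deep * (1.0 - att) + (rb / np.pi) * att

lidar_depth = 6.0                                             # fixed lidar constraint
observed = model([1.3, 0.8, 1.1], lidar_depth) + 1e-4 * np.random.randn(5)

fit = least_squares(lambda p: model(p, lidar_depth) - observed,
                    x0=[1.0, 1.0, 1.0], bounds=(0.0, 5.0))
print("recovered scale factors:", np.round(fit.x, 2))
```

Fixing the depth removes one unknown from the optimization, which is what allows the spectral solution to remain well posed in deeper water, as described above.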
KEYWORDS: Data fusion, Data modeling, Reflectivity, LIDAR, Signal attenuation, Image fusion, CZMIL, Image classification, Double positive medium, 3D modeling
CZMIL will simultaneously acquire lidar and passive spectral data. These data will be fused to produce enhanced
seafloor reflectance images from each sensor, and combined at a higher level to achieve seafloor classification. In the
DPS software, the lidar data will first be processed to solve for depth, attenuation, and reflectance. The depth
measurements will then be used to constrain the spectral optimization of the passive spectral data, and the resulting water
column estimates will be used recursively to improve the estimates of seafloor reflectance from the lidar. Finally, the
resulting seafloor reflectance cube will be combined with texture metrics estimated from the seafloor topography to
produce classifications of the seafloor.
Range measurements in CZMIL1,2 are accomplished with signal processing techniques applied to green lidar waveforms.
In the design phase of the project, we developed software to simulate waveforms for CZMIL, and have used these
simulated waveforms to design ranging algorithms and test their accuracies. Our results indicate that the topographic ranging accuracy to hard targets should be on the order of 2 cm. In this paper, we discuss the simulations, algorithms, and results.
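A minimal ranging sketch is shown below (a generic sub-sample peak estimator, not the CZMIL production algorithm): the return peak is located to sub-sample precision with a three-point parabolic fit around the waveform maximum and the two-way time is converted to range.

```python
import numpy as np

C_AIR = 2.99708e8     # approximate speed of light in air, m/s
DT = 1e-9             # 1 ns sample spacing

def peak_range(waveform, dt=DT, c=C_AIR):
    """Sub-sample peak location via parabolic interpolation, converted to range."""
    i = int(np.argmax(waveform))
    y0, y1, y2 = waveform[i - 1], waveform[i], waveform[i + 1]
    frac = 0.5 * (y0 - y2) / (y0 - 2.0 * y1 + y2)    # parabolic vertex offset
    return 0.5 * c * (i + frac) * dt                 # two-way time -> one-way range

# Synthetic hard-target return centered between samples 199 and 200.
t = np.arange(400)
wf = np.exp(-0.5 * ((t - 199.6) / 2.0) ** 2)
print(f"range ~ {peak_range(wf):.3f} m")             # ~ 29.9 m
```

Against noiseless synthetic returns such an estimator recovers the peak position to a small fraction of a sample, which is consistent in spirit with centimeter-level ranging at nanosecond sampling.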
The Data Processing System (DPS) of the Coastal Zone Mapping and Imaging Lidar (CZMIL) has been designed to
automatically produce a number of novel environmental products through the fusion of Lidar, spectrometer, and camera
data in a single software package. These new products significantly transcend use of the system as a bathymeter, and
support use of CZMIL as a complete coastal and benthic mapping tool. The DPS provides a spinning globe capability for
accessing data files; automated generation of combined topographic and bathymetric point clouds; a fully-integrated
manual editor and data analysis tool; automated generation of orthophoto mosaics; automated generation of reflectance
data cubes from the imaging spectrometer; a coupled air-ocean spectral optimization model producing images of
chlorophyll and CDOM concentrations; and a fusion based capability to produce images and classifications of the
shallow water seafloor. Adopting a multitasking approach, we expect to achieve computation of the point clouds, DEMs,
and reflectance images at a 1:1 processing-to-acquisition ratio.
Estimation of water column optical properties and seafloor reflectance (532 nm) is demonstrated using recent SHOALS data collected at Fort Lauderdale, Florida (November 2003). To facilitate this work, the first radiometric calibrations of SHOALS were performed. These calibrations permit a direct normalization of recorded data by converting digitized counts at the output of the SHOALS receivers to input optical power. For estimation of environmental parameters, this normalization is required to compensate for the logarithmic compression of the signals and the finite frequency bandpass of the detector/amplifier. After normalization, the SHOALS data are used to estimate the backscattering coefficient, the beam attenuation coefficient, the single-scattering albedo, the VSF asymmetry, and seafloor reflectance by fitting simulated waveforms to actual waveforms measured by the SHOALS APD and PMT receivers. The resulting estimates of these water column optical properties are compared to in-situ measurements acquired at the time of the airborne data collections. Images of green laser bottom reflectance are also presented and compared to reflectance estimated from simultaneously acquired passive spectral data.
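As a purely illustrative example of what compensating for logarithmic compression involves (the actual SHOALS calibration curves are not given here), the sketch below inverts an assumed log-amplifier response, counts = gain · log10(P / P_ref), to recover input optical power from digitized counts.

```python
import numpy as np

def counts_to_power(counts, gain=50.0, p_ref=1e-9):
    """Hypothetical log-amp inversion: returns optical power in watts."""
    return p_ref * np.power(10.0, np.asarray(counts, float) / gain)

print(counts_to_power([0, 50, 100, 150]))   # 1e-9, 1e-8, 1e-7, 1e-6 W
```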
For the past two decades, hydrographic surveyors have used Optech's bathymetric laser technology to accurately measure water depths and to describe the geometry of the shallow-water seafloor. Recently, we have demonstrated the potential to produce bottom images from estimates of SHOALS-1000T green laser reflectance, and to map spatial variations in the optical properties of the water column, by analyzing time-resolved waveforms. We have also performed the electronic and geometric integration of an imaging spectrometer into SHOALS, and have developed a first generation of software which provides for the exploitation of the combined laser and hyperspectral data within a fusion paradigm. In this paper, we discuss relevant sensor and data fusion issues, and present recent 3D benthic mapping results.
When using airborne passive spectral data for underwater imaging, inversion of radiative transfer models is often based on the simplifying assumptions that spectral diffuse attenuation and path radiance do not vary horizontally across the project area. We have developed a technique which does not require these assumptions. In it, we use a combination of SHOALS pseudoreflectance images and passive spectral images to manually identify areas of homogeneous bottom type. At these locations, we use the measured depths from the SHOALS bathymeter to estimate the spectral diffuse attenuation coefficients, the additive path radiance of the water column, and the bottom radiance (or reflectance) at each homogeneous patch. The parameters estimated at these patches are then used as control points in the interpolation of surfaces for each parameter. In this paper, we show early results using this approach to solve the radiative transfer equations for calibrated radiance data acquired with a casi-2 imaging spectrometer, and describe our procedure for producing SHOALS pseudoreflectance images.
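A minimal per-patch sketch of this kind of estimation follows, using a simplified single-band form rather than the authors' full radiative-transfer solution: over a patch of homogeneous bottom, the at-sensor signal is modeled as L(z) = L_deep(1 − e^(−2Kd·z)) + L_b·e^(−2Kd·z), where L_deep lumps the path and deep-water-column radiance and L_b is the bottom radiance, and is fit against the lidar-measured depths z. All numbers are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import curve_fit

def patch_model(z, kd, l_deep, l_b):
    """Simplified single-band shallow-water model for one homogeneous patch."""
    att = np.exp(-2.0 * kd * z)
    return l_deep * (1.0 - att) + l_b * att

# Synthetic "homogeneous patch": lidar depths and one spectral band of radiance.
depths = np.array([1.0, 2.5, 4.0, 6.0, 8.5, 11.0])
radiance = patch_model(depths, 0.18, 2.6, 5.0) + 0.01 * np.random.randn(6)

params, _ = curve_fit(patch_model, depths, radiance, p0=[0.1, 2.0, 4.0])
print("Kd, L_deep, L_bottom ~", np.round(params, 3))
```

Parameters estimated at several such patches can then serve as control points for interpolating attenuation, path radiance, and bottom radiance surfaces across the scene, as described above.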