The increasing prevalence of nuisance benthic algal blooms in freshwater systems has led to water quality monitoring programs based on the presence and abundance of algae. Large blooms of the nuisance filamentous alga Cladophora glomerata have become common in the waters of the Upper Clark Fork River in western Montana. To aid in the understanding of algal growth dynamics, unoccupied aerial vehicle (UAV)-based hyperspectral images were gathered at three field sites along the length of the river throughout the growing season of 2021. Select regions within images covering the spectral range of 400 to 850 nm were labeled based on a combination of professional judgment and spectral profiles, then used to train a random forest classifier to identify benthic algal growth across several classes, including benthic growth dominated by Cladophora (Clado), benthic growth dominated by growth forms other than Cladophora (non-Clado), and areas below a visually detectable threshold of benthic growth (bare substrate). After classification, images were stitched together to produce spatial distribution maps of each river reach, and the average percent cover was calculated for each reach; classification accuracy was approximately 99% relative to manually labeled images. Results of this analysis showed strong variability across each reach, both temporally (up to 40%) and spatially (up to 46%), indicating that UAV-based imaging with high spatial resolution could augment, and therefore improve, traditional measurement techniques that are spatially limited, such as spot sampling.
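A minimal sketch of the classification step described above, assuming labeled pixel spectra have already been extracted from the hyperspectral cubes; the band count, class encoding, and scikit-learn-based pipeline are illustrative assumptions rather than the authors' actual implementation.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# X: (n_pixels, n_bands) reflectance spectra spanning roughly 400-850 nm
# y: integer labels, e.g. 0 = bare substrate, 1 = Clado, 2 = non-Clado
rng = np.random.default_rng(0)
X = rng.random((3000, 200))           # placeholder spectra
y = rng.integers(0, 3, size=3000)     # placeholder class labels

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_train, y_train)
print("held-out accuracy:", accuracy_score(y_test, clf.predict(X_test)))

# Percent cover for one classified reach: fraction of pixels in each class
y_map = clf.predict(X)                # stand-in for a full classified image
labels, counts = np.unique(y_map, return_counts=True)
print({int(k): round(100 * c / counts.sum(), 1) for k, c in zip(labels, counts)})
```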
A low-cost multispectral imager is described for routine monitoring of nuisance Cladophora algae and blue-green algae in narrow rivers that are not spatially resolved by satellites. The goal is to identify algal blooms and estimate the chlorophyll a (chl a) and phycocyanin content from a network of low-cost imagers that can be mounted on bridges, trees, or other convenient objects at key river locations. The preliminary design uses Raspberry Pi cameras and computers with bandpass filters at 568, 671, 700, and 825 nm, based on data gathered with an airborne hyperspectral imager on the Upper Clark Fork River in southwestern Montana, USA. This paper summarizes the initial design, calibration measurements, and preliminary reflectance data.
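As an illustration of the kind of per-pixel computation such a four-band imager could support, the sketch below combines reflectance images at 568, 671, 700, and 825 nm into simple band-ratio indices; the specific index forms and thresholds are generic examples, not the calibrated retrieval algorithms developed for this system.

```python
import numpy as np

def reflectance_indices(r568, r671, r700, r825):
    """Each argument is a 2-D reflectance image from one bandpass filter."""
    eps = 1e-6
    # Red-edge / red difference ratio, widely used as a chl-a proxy in water
    ndci_like = (r700 - r671) / (r700 + r671 + eps)
    # Red-edge / green ratio as a rough stand-in for a phycocyanin-sensitive index
    pc_proxy = r700 / (r568 + eps)
    # High 825 nm reflectance can flag surface scums or exposed material
    nir_flag = r825 > 0.15
    return ndci_like, pc_proxy, nir_flag

# Example with synthetic 100 x 100 reflectance images
frames = [np.random.rand(100, 100) for _ in range(4)]
ndci, pc, flag = reflectance_indices(*frames)
```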
Harmful and nuisance algal blooms are becoming a greater concern to public health, riparian ecosystems, and recreational uses of inland waterways. Algal bloom proliferation has increased in the Upper Clark Fork River in western Montana, USA, due to a combination of warming water temperatures, naturally high phosphorus levels, and an influx of contaminants through anthropogenic nitrogen enrichment along its banks. To improve understanding of bloom dynamics, such as algal biomass, a UAV-based hyperspectral imaging system was deployed to monitor several locations along the Upper Clark Fork River. Image data were collected across the spectral range of 400 to 1000 nm with 2.1 nm spectral resolution during two field sampling campaigns in 2021. Included are methods to estimate chlorophyll a standing crops using regression analysis of salient wavelength bands, before and after separating the pigments according to growth form. Estimates of total chlorophyll a standing crops generated through a brute-force analysis are compared to in-situ data, resulting in a maximum r-squared of 0.62 for estimating filamentous plus epiphytic chlorophyll a. Estimates of total and epilithic pigment standing crops are also included. The salient wavelength bands used to estimate these pigments were then used as the basis for creating a low-cost imaging system for identifying algal blooms.
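The brute-force band search can be pictured as trying every band pair as a simple predictor and scoring it against the in-situ chlorophyll a measurements; the sketch below, which assumes a band-ratio predictor scored by Pearson correlation, is only one plausible form of such a search, not the exact analysis reported here.

```python
import numpy as np
from itertools import combinations
from scipy.stats import pearsonr

def best_band_pair(spectra, chl_a):
    """spectra: (n_samples, n_bands) reflectance; chl_a: (n_samples,) in-situ values."""
    best_pair, best_r2 = None, -np.inf
    for i, j in combinations(range(spectra.shape[1]), 2):
        ratio = spectra[:, i] / (spectra[:, j] + 1e-6)   # simple two-band index
        r, _ = pearsonr(ratio, chl_a)
        if r * r > best_r2:
            best_pair, best_r2 = (i, j), r * r
    return best_pair, best_r2

# Synthetic example: 40 samples, 288 bands (~400-1000 nm at ~2.1 nm spacing)
rng = np.random.default_rng(1)
spectra = rng.random((40, 288))
chl_a = rng.random(40) * 100
pair, r2 = best_band_pair(spectra, chl_a)
print("best band pair:", pair, "r^2:", round(r2, 2))
```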
In recent years, lidar-based remote sensing has been used to detect and classify flying insects, based on the fact that oscillating wings produce a modulated return signal; oscillations from other objects, such as helicopters or drones, might also be detected in a similar manner. Several groups have successfully used machine learning to classify insects in laboratory settings, but data processing in field studies is still performed manually. Compared to laboratory studies, field studies pose additional challenges, such as non-stationary background clutter and high class imbalance. The models we used for detection and classification were the common boosting algorithm AdaBoost, a hybrid sampling/boosting algorithm RUSBoost, and a neural network with a single hidden layer. Previously, we found that the best performances came from the neural network and AdaBoost. In this paper, we test machine learning models that were trained on field data collected at Hyalite Creek against other, unlabeled field data; in doing so, we demonstrate each model's ability to detect insects in data from new, unseen environments. We use labels created by a domain expert to manually check how many of the predicted images actually contained insects.
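The detection models named above can be sketched with standard library implementations; the feature vectors, labels, and hyperparameters below are placeholders, and RUSBoost has a comparable implementation in the imbalanced-learn package that is omitted here.

```python
import numpy as np
from sklearn.ensemble import AdaBoostClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import precision_score, recall_score

rng = np.random.default_rng(0)
X = rng.random((2000, 32))                  # per-image feature vectors (placeholder)
y = (rng.random(2000) < 0.05).astype(int)   # heavy class imbalance: few insect images

ada = AdaBoostClassifier(n_estimators=100, random_state=0).fit(X, y)
mlp = MLPClassifier(hidden_layer_sizes=(16,), max_iter=500, random_state=0).fit(X, y)

for name, model in [("AdaBoost", ada), ("single-hidden-layer NN", mlp)]:
    pred = model.predict(X)
    print(name,
          "precision:", round(precision_score(y, pred, zero_division=0), 3),
          "recall:", round(recall_score(y, pred, zero_division=0), 3))
```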
KEYWORDS: MATLAB, Field programmable gate arrays, Simulink, LIDAR, Feature extraction, Neural networks, Digital signal processing, Machine learning, Algorithm development
Real-time monitoring of insects has important applications in entomology, such as managing agricultural pests and monitoring species populations—which are rapidly declining. However, most monitoring methods are labor intensive, invasive, and not automated. Lidar-based methods are a promising, non-invasive alternative, and have been used in recent years for various insect detection and classification studies. In a previous study, we used supervised machine learning to detect insects in lidar images that were collected near Hyalite Creek in Bozeman, Montana. Although the classifiers we tested successfully detected insects, the analysis was performed offline on a laptop computer. For the analysis to be useful in real-time settings, the computing system needs to be an embedded system capable of computing results in real-time. In this paper, we present work-in-progress towards implementing our software routines in hardware on a field programmable gate array.
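One way to picture what gets moved into hardware is the fixed-point forward pass of the single-hidden-layer network; the word length, activation function, and layer sizes in the sketch below are illustrative assumptions, not the project's actual FPGA parameters.

```python
import numpy as np

FRAC_BITS = 8  # Q8.8 fixed-point format (assumed word length, for illustration)

def to_fixed(x):
    """Quantize floating-point values to Q8.8 integers."""
    return np.round(x * (1 << FRAC_BITS)).astype(np.int32)

def fixed_forward(x_fx, w1_fx, b1_fx, w2_fx, b2_fx):
    # Hidden layer: multiply-accumulate, rescale back to Q8.8, then ReLU
    h = (x_fx @ w1_fx) >> FRAC_BITS
    h = np.maximum(h + b1_fx, 0)
    # Output layer: single detection score
    return ((h @ w2_fx) >> FRAC_BITS) + b2_fx

# Placeholder weights for a 32-input, 16-hidden-unit, 1-output network
rng = np.random.default_rng(0)
w1, b1 = rng.normal(size=(32, 16)), rng.normal(size=16)
w2, b2 = rng.normal(size=(16, 1)), rng.normal(size=1)
x = rng.normal(size=(1, 32))

score = fixed_forward(to_fixed(x), to_fixed(w1), to_fixed(b1),
                      to_fixed(w2), to_fixed(b2))
print("insect score (converted back from Q8.8):", score / (1 << FRAC_BITS))
```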
Optical remote sensing systems are often used to gather imagery of scenes containing partially polarized light. Partially polarized reflection or emission will affect the detected response if the sensor system has intentional or unintentional polarization sensitivity. As the use of optical remote sensing systems becomes more widespread, the factors affecting the response of these systems need to be better understood. In this paper, we present the results of polarization response measurements of six hyperspectral imaging systems manufactured by Resonon Inc. The imagers included in this study cover wavelengths from approximately 350 nm to 1700 nm, with various spectral sampling rates. Efforts are ongoing to model and compensate for the observed response.
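A common way to quantify such a response, sketched below, is to record each band's signal while a linear polarizer rotates in front of a uniform source and fit a cosine model of the form S(theta) = a + b*cos(2*(theta - theta0)); the ratio b/a then serves as a per-wavelength polarization sensitivity. This is a generic reduction, not necessarily the measurement procedure used for the Resonon imagers.

```python
import numpy as np
from scipy.optimize import curve_fit

def malus_model(theta_deg, a, b, t0_deg):
    # Signal vs. polarizer angle for a sensor with linear polarization sensitivity
    return a + b * np.cos(2 * np.radians(theta_deg - t0_deg))

# Synthetic measurement for one spectral band: polarizer rotated in 10-degree steps
angles = np.arange(0.0, 360.0, 10.0)
signal = 1000 + 30 * np.cos(2 * np.radians(angles - 40))
signal = signal + np.random.default_rng(0).normal(0, 2, angles.size)

(a, b, t0), _ = curve_fit(malus_model, angles, signal, p0=[signal.mean(), 10.0, 0.0])
print("polarization sensitivity b/a =", round(abs(b) / a, 4))
```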
A method of monitoring produce freshness with hyperspectral imaging and machine learning is described as a way to reduce food waste in grocery stores. The method relies on hyperspectral reflectance images in the visible–near-infrared spectral range from 387.12 to 1023.5 nm with a 2.12-nm spectral resolution. The images were recorded in a laboratory with the imager viewing produce samples illuminated by broadband halogen lights, but we also recorded the illumination spectra of lights found in a variety of grocery stores and discuss their implications. A convolutional neural network was used to perform freshness classification for potatoes, bananas, and green peppers. Additionally, a genetic algorithm (GA) was used to determine the wavelengths carrying the most useful information for age classification, with an eye toward a future multispectral imager. Hyperspectral images were processed to explore the use of RGB images, GA-selected multispectral images, and full-spectrum hyperspectral images. The GA-based feature selection method outperformed RGB images for all tested produce, outperformed hyperspectral imagery for bananas, and matched hyperspectral imagery performance for green peppers. This feature selection method is being used to develop a low-cost multispectral imager for use in monitoring produce in grocery stores.
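The GA-based band selection can be pictured as evolving binary masks over the spectral bands, with fitness given by the cross-validated accuracy of a classifier restricted to the selected bands; the operators, classifier, and population sizes in the sketch below are illustrative choices, not the paper's configuration.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.random((300, 300))             # 300 samples x 300 bands (placeholder spectra)
y = rng.integers(0, 3, 300)            # freshness classes (placeholder labels)

def fitness(mask):
    """Cross-validated accuracy using only the bands selected by the mask."""
    if mask.sum() == 0:
        return 0.0
    clf = LogisticRegression(max_iter=500)
    return cross_val_score(clf, X[:, mask.astype(bool)], y, cv=3).mean()

pop = rng.random((20, X.shape[1])) < 0.05   # start with ~5% of bands selected
for gen in range(10):
    scores = np.array([fitness(ind) for ind in pop])
    parents = pop[np.argsort(scores)[-10:]]          # keep the best half
    children = parents.copy()
    cut = X.shape[1] // 2
    children[:, :cut] = parents[::-1, :cut]          # one-point crossover
    mutate = rng.random(children.shape) < 0.01       # bit-flip mutation
    children ^= mutate
    pop = np.vstack([parents, children])

best = pop[np.argmax([fitness(ind) for ind in pop])]
print("selected band indices:", np.flatnonzero(best))
```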
Hyperspectral imaging is a powerful remote sensing tool capable of capturing rich spectral and spatial information. Although the origins of hyperspectral imaging are in terrestrial remote sensing, new applications are emerging rapidly. Owing to its non-destructive nature, hyperspectral imaging has become a useful tool for monitoring produce ripeness. This paper describes a process that uses a visible near-infrared (VNIR) hyperspectral imager from Resonon, Inc., coupled with machine learning algorithms, to assess the ripeness of various produce items. The images were converted to reflectance across a spectral range of 387.12 nm to 1023.5 nm, with a spectral resolution of 2.12 nm. A convolutional neural network was used to perform age classification for potatoes, bananas, and green peppers. Additionally, a genetic algorithm was used to determine the wavelengths carrying the most useful information for age classification. Experiments were run using RGB images, full-spectrum hyperspectral images, and the genetic algorithm feature selection method. Results showed that the genetic algorithm-based feature selection method outperforms RGB images for all tested produce, outperforms hyperspectral imagery for bananas, and matches hyperspectral imagery performance for green peppers. This feature selection method is being used to develop a low-cost multispectral imager for use in monitoring produce in grocery stores.
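For the classification step itself, a 1-D convolutional network over per-pixel spectra is one plausible form; the tf.keras sketch below uses placeholder data and layer sizes, and the authors' actual network may operate on image patches with a different architecture.

```python
import numpy as np
import tensorflow as tf

n_bands, n_classes = 300, 3
X = np.random.rand(500, n_bands, 1).astype("float32")   # spectra as 1-D signals
y = np.random.randint(0, n_classes, 500)                 # placeholder age classes

model = tf.keras.Sequential([
    tf.keras.layers.Conv1D(16, 7, activation="relu", input_shape=(n_bands, 1)),
    tf.keras.layers.MaxPooling1D(2),
    tf.keras.layers.Conv1D(32, 5, activation="relu"),
    tf.keras.layers.GlobalAveragePooling1D(),
    tf.keras.layers.Dense(n_classes, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(X, y, epochs=5, batch_size=32, verbose=0)
```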
As the applications of hyperspectral imaging rapidly diversify, the need for accurate radiometric calibration of these imaging systems is becoming increasingly important. When performing radiometric measurements, the polarization response of the imaging system can be of particular interest if the scene contains partially polarized objects. For example, when imaging a scene containing water, surface reflections from the water will be partially polarized, possibly affecting the response of the imaging system. In this paper, the polarization response of a Resonon, Inc. visible near-infrared (VNIR) hyperspectral imaging system is assessed across a spectral range of 400 nm to 1000 nm, with a spectral resolution of 2.1 nm. Efforts are currently underway to correct for the observed polarization response of the imaging system.