Crop pest management has vast economic consequences vital to agriculture, natural lands, and even public health. In this work, we contribute a new low-cost UGV (unmanned ground vehicle) solution to this precision agriculture problem. The UGV is built from two commercial off-the-shelf (COTS) electric bikes (bi-eBike) driven by ROS (Robot Operating System) compatible control electronics. Our bi-eBike system offers a mobile platform that can serve as a mobile sensing service carrying multiple sensors, such as multispectral cameras and microwave scanners, as well as a mobile actuation/application service carrying actuators such as UV-C light insecticide units, beneficial-bug dispensers, and growth stimulant spreaders or sprayers. By mapping and imaging plants in the field, farmers can treat individual plants instead of treating the entire field, reducing both their costs and negative environmental impact. This smart bi-eBike (SBB) system can be supplemented with photovoltaic (solar) panels and a UAV (unmanned aerial vehicle) landing/charging pad. Thus, the SBB can be expected to achieve a long operational duration (10 hours or more) and large acreage coverage when UAVs are used for variability mapping for site-specific treatment. This paper describes the system-level concept, subsystem designs and integration, vehicle control electronics, autonomous navigation architecture, and preliminary experimental results.
The normalized difference vegetation index (NDVI) has been commonly used for vegetation monitoring tasks such as water stress detection, crop yield assessment, and evapotranspiration estimation. However, the influence of spatial resolution on individual-tree-level NDVI derived from unmanned aerial vehicles (UAVs) is poorly understood. Therefore, in this research, the effects of the spatial resolution of UAV imagery are investigated using high-resolution multispectral images. A temporal sequence of UAV multispectral imagery was collected over an experimental pomegranate field, capturing variations across the whole 2019 growing season, at the USDA-ARS (U.S. Department of Agriculture, Agricultural Research Service) San Joaquin Valley Agricultural Sciences Center in Parlier, California, USA. The NDVI distribution of individual trees was generated at the 60 m, 90 m, and 120 m spatial resolutions. Experimental results indicated how the spatial resolution of UAV imagery affects the NDVI values of individual trees.
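The index itself, and one mechanism by which spatial resolution changes tree-level values, can be sketched as follows. This is a minimal numpy illustration with made-up reflectance values and a hypothetical block-averaging `coarsen` helper, not the study's processing chain: averaging the bands before computing NDVI generally differs from averaging per-pixel NDVI.

```python
import numpy as np

def ndvi(nir, red, eps=1e-9):
    """Per-pixel NDVI = (NIR - Red) / (NIR + Red)."""
    nir, red = nir.astype(float), red.astype(float)
    return (nir - red) / (nir + red + eps)

def coarsen(band, f):
    """Simulate a lower spatial resolution by f x f block averaging."""
    h, w = (band.shape[0] // f) * f, (band.shape[1] // f) * f
    return band[:h, :w].reshape(h // f, f, w // f, f).mean(axis=(1, 3))

# Hypothetical 2x2 reflectance patches (mixed canopy and soil pixels)
nir = np.array([[0.50, 0.60], [0.40, 0.55]])
red = np.array([[0.10, 0.20], [0.15, 0.10]])

fine = ndvi(nir, red)                            # NDVI at native resolution
coarse = ndvi(coarsen(nir, 2), coarsen(red, 2))  # NDVI after coarsening
print(fine.mean(), coarse[0, 0])                 # the two generally differ
```

Because NDVI is a nonlinear ratio, the mean of per-pixel NDVI and the NDVI of block-averaged bands diverge wherever canopy and soil pixels mix, which is exactly the tree-level effect the study measures.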
Evapotranspiration (ET) estimation is an important agricultural research topic in many regions because of water scarcity, a growing population, and climate change. ET can be analyzed as the sum of evaporation from the soil and transpiration from the crops to the atmosphere. Accurate estimation and mapping of ET are necessary for crop water management. One traditional method is to use the crop coefficient (Kc) and reference ET (ETo) to estimate actual ET. With the advent of satellite technology, remote sensing images can provide spatially distributed measurements. Satellite images are used to calculate the Normalized Difference Vegetation Index (NDVI), and the relation between NDVI and Kc is used to generate a new Kc. The spatial resolution of multispectral satellite images, however, is in the range of meters, which is often not enough for crops with clumped canopy structures, such as trees and vines. Moreover, the frequency of satellite overpasses is not high enough to meet research or water management needs. Unmanned aerial vehicles (UAVs) can help mitigate these spatial and temporal challenges: compared with satellite imagery, the spatial resolution of UAV images can be as high as centimeter-level. In this study, a regression model was developed using Deep Stochastic Configuration Networks (DeepSCNs). Actual evapotranspiration was estimated and compared with lysimeter data in an experimental pomegranate orchard. The UAV imagery provided a spatial, tree-by-tree view of the ET distribution.
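The traditional estimate described above can be written as ETa = Kc × ETo, with Kc inferred from NDVI. A minimal sketch, assuming a hypothetical linear NDVI-Kc relation (the coefficients below are illustrative, not the calibrated values from this study):

```python
def kc_from_ndvi(ndvi, a=1.25, b=-0.15):
    """Hypothetical linear NDVI-Kc relation; a and b must be calibrated per crop."""
    return a * ndvi + b

def actual_et(ndvi, eto):
    """Actual ET (mm/day) from the crop coefficient and reference ET."""
    return kc_from_ndvi(ndvi) * eto

# e.g. a mid-season canopy pixel with NDVI = 0.70 and ETo = 6.0 mm/day
eta = actual_et(0.70, 6.0)
print(round(eta, 2))  # -> 4.35
```

Applied per pixel or per tree crown to centimeter-level UAV NDVI, the same relation yields the spatially distributed, tree-by-tree ET maps described above.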
Soil-borne plant-parasitic nematodes exist in many soils, and some can cause annual yield losses of 15 to 20 percent. Walnut has high economic value, and most edible walnuts in the US are produced in the fertile soils of the California Central Valley. Soil-dwelling nematode parasites are a significant threat, causing severe root damage and reducing walnut yields. Early detection of plant-parasitic nematodes is critical for designing management strategies. In this study, we proposed the use of a new low-cost proximate radio-frequency three-dimensional sensor, the "Walabot," together with machine learning classification algorithms. This pocket-sized device, unlike remote sensing tools such as unmanned aerial vehicles (UAVs), is not limited by flight time and payload capability. It can work flexibly in the field and provide information more promptly and accurately than UAVs or satellites. Walnut leaves from trees with different nematode infestation levels were placed on the sensor to test whether the Walabot can detect small changes in infestation level. Hypothetically, the waveforms generated by different signals may be useful for estimating the damage caused by nematodes. Scikit-learn classification algorithms, such as neural networks (trained with the Adam optimizer), random forests, and Gaussian processes, were applied for data processing. Results so far show that the Walabot predicted nematode infestation levels with an accuracy of 72%.
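The classification step can be sketched with scikit-learn on synthetic data. The feature matrix below is made up for illustration; the real inputs would be features extracted from the Walabot's RF waveforms, and only one of the named classifiers (random forest) is shown.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
# Hypothetical data: 60 leaf samples x 8 waveform features, 3 infestation levels
X = rng.normal(size=(60, 8))
y = rng.integers(0, 3, size=60)
X[:, 0] += y  # make one feature weakly informative of the label

Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(Xtr, ytr)
acc = clf.score(Xte, yte)
print(f"held-out accuracy: {acc:.2f}")
```

Swapping in `MLPClassifier` or `GaussianProcessClassifier` requires only changing the estimator line, which is how the classifiers named above would be compared on a common held-out split.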
In the last decade, technologies of unmanned aerial vehicles (UAVs) and small imaging sensors have improved significantly in terms of equipment cost, operation cost, and image quality. These low-cost platforms provide flexible access to high-resolution visible and multispectral images. As a result, many studies have been conducted on applications in precision agriculture, such as water stress detection, nutrient status detection, and yield prediction. Different from traditional low-resolution satellite images, high-resolution UAV-based images allow much more freedom in image post-processing. For example, the very first procedure in post-processing is pixel classification, or image segmentation, for extracting regions of interest (ROI). With very high resolution, it becomes possible to classify pixels in a UAV-based image, yet it is still a challenge to conduct pixel classification using traditional remote sensing features such as vegetation indices (VIs), especially considering various changes during the growing season such as light intensity, crop size, and crop color. The development of deep learning provides a general framework to solve this problem. In this study, we proposed to use deep learning methods to conduct image segmentation. We created our dataset of pomegranate trees by flying an off-the-shelf commercial camera at 30 meters above the ground around noon, throughout the whole growing season from the beginning of April to the middle of October 2017. We then trained and tested two convolutional-network-based methods, U-Net and Mask R-CNN, using this dataset, and compared their performance on our aerial images of pomegranate trees.
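For contrast with the deep models, the traditional VI-based pixel classification mentioned above can be sketched as a simple excess-green threshold. This is a minimal numpy baseline with made-up pixel values and an illustrative threshold, not the authors' pipeline:

```python
import numpy as np

def excess_green_mask(rgb, thresh=0.1):
    """Label vegetation pixels via ExG = 2g - r - b on chromaticity coordinates."""
    rgb = rgb.astype(np.float64)
    s = rgb.sum(axis=-1, keepdims=True) + 1e-9   # avoid division by zero
    r, g, b = np.moveaxis(rgb / s, -1, 0)        # normalized channels
    return (2 * g - r - b) > thresh

# One green (canopy-like) pixel and one gray (soil-like) pixel
img = np.array([[[40, 120, 30], [100, 100, 100]]], dtype=np.uint8)
print(excess_green_mask(img))  # -> [[ True False]]
```

Such fixed-threshold VI rules are exactly what break down under seasonal changes in illumination and crop color, which is the motivation for training U-Net and Mask R-CNN instead.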
Many studies have shown that hyperspectral measurements can help monitor crop health status, such as water stress, nutrition stress, and pest stress. However, applications of hyperspectral cameras or scanners are still very limited in precision agriculture. The resolution of satellite hyperspectral images is too low to provide information at the desired scale, while field spectrometers and aerial hyperspectral cameras offer fairly high resolution but are too expensive for growers to afford. In this study, we investigated whether the low-cost hyperspectral scanner SCIO can serve as a crop monitoring tool to provide crop health information for decision support. In an onion test site, three irrigation levels and four types of soil amendment were randomly assigned to 36 plots, with three replicates for each treatment combination. Each month, three onion plant samples were collected from the test site, and fresh weight, dry weight, root length, shoot length, etc. were measured for each plant. Meanwhile, three spectral measurements were made for each leaf of each sample plant using both a field spectrometer and the hyperspectral scanner. We applied dimension reduction methods to extract low-dimensional features. Based on the dataset of these features and their labels, several classifiers were built to infer the field treatment of the onions. Tests on a validation dataset (25 percent of the total measurements) showed that this low-cost hyperspectral scanner is a promising tool for crop water stress monitoring, though its performance is worse than that of the Apogee field spectrometer: the traditional field spectrometer yields the best accuracy, above 80%, whereas the best accuracy of the SCIO is around 50%.
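The dimension reduction step can be sketched as a PCA projection computed via SVD. This is a minimal numpy version with random placeholder spectra; the band count, sample count, and number of retained components below are illustrative, not the study's settings.

```python
import numpy as np

def pca_features(spectra, k=3):
    """Project high-dimensional spectra onto their top-k principal components."""
    X = spectra - spectra.mean(axis=0)            # center each band
    _, _, Vt = np.linalg.svd(X, full_matrices=False)
    return X @ Vt[:k].T                           # scores in the top-k subspace

rng = np.random.default_rng(1)
spectra = rng.normal(size=(36, 331))  # placeholder: 36 samples x 331 bands
feats = pca_features(spectra, k=3)
print(feats.shape)  # -> (36, 3)
```

The resulting low-dimensional feature vectors, paired with the treatment labels, are what the downstream classifiers are trained on.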
Thermal cameras have recently been widely used in small unmanned aerial systems (sUAS). They translate thermal radiation into visible images and temperature measurements of a target object. Thermal imaging has great potential in agricultural applications: it can be used for estimating soil water status, scheduling irrigation, estimating almond yields, estimating water stress, and evaluating crop maturity. Although their ability to measure temperature is valuable, there are still concerns about uncooled thermal cameras. Unstable outdoor environmental factors can cause serious measurement drift during flight missions, and post-processing such as mosaicking may introduce further measurement errors. To address these two concerns, we conducted three experiments to establish best practices for thermal image collection. In this paper, the thermal cameras used are ICI 9640 P-Series models, which are common in many study areas; an Apogee MI-220 is used as the ground-truth reference. In the first experiment, we measured how long the thermal camera needs to warm up to reach (or approach) thermal equilibrium and produce accurate data. Second, different view angles were set up to determine whether the view angle affects the camera's measurements. Third, we examined whether stitching the thermal images in Agisoft PhotoScan has any effect on the temperature data.
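The first experiment's stabilization criterion can be sketched as follows. This assumes a hypothetical rule (the camera counts as warmed up once all subsequent readings stay within a tolerance of the Apogee reference); the readings and tolerance below are made up for illustration.

```python
import numpy as np

def warmup_index(cam_temps, ref_temp, tol=0.5):
    """Index of the first reading after which the camera stays within
    tol degC of the reference, or None if it never stabilizes."""
    ok = np.abs(np.asarray(cam_temps, dtype=float) - ref_temp) <= tol
    for i in range(len(ok)):
        if ok[i:].all():
            return i
    return None

# Hypothetical 1-minute readings during warm-up; Apogee reference at 25.0 degC
cam = [22.1, 23.4, 24.2, 24.7, 24.9, 25.1, 25.0]
print(warmup_index(cam, 25.0))  # -> 3
```

With readings logged once per minute, the returned index converts directly into a recommended warm-up time before starting a flight mission.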