KEYWORDS: RGB color model, Data modeling, Vegetation, Education and training, Performance modeling, Multispectral imaging, Object detection, Near infrared, Cameras, Shadows
Conventional agriculture relies heavily on herbicides for weed control. Smart farming, particularly through the use of mechanical weed control systems, has the potential to reduce herbicide usage and its associated negative environmental impact. The growing accessibility of multispectral cameras raises the question of whether their added expense justifies the advantages they offer. In this study, we compare weed and crop detection performance between RGB and multispectral VIS-NIR imaging data. To this end, we created and annotated a multispectral instance segmentation dataset for sugar beet crop and weed detection. We trained Mask R-CNN models on the RGB images and on images composed of different vegetation indices calculated from the multispectral data. The outcomes are thoroughly analysed and compared across various scenarios. Our findings indicate that the use of vegetation indices can significantly improve weed detection performance in many situations.
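As a concrete illustration, the sketch below shows how a vegetation index such as NDVI could be computed from multispectral bands before being fed to a detector. It is a minimal NumPy example; the channel indices and the image loader are hypothetical placeholders, since the abstract does not specify the camera's band layout.

```python
import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray, eps: float = 1e-8) -> np.ndarray:
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red).

    Vegetation reflects strongly in the near infrared and absorbs red
    light, so NDVI highlights plant pixels against soil and shadow.
    """
    nir = nir.astype(np.float32)
    red = red.astype(np.float32)
    return (nir - red) / (nir + red + eps)

# Hypothetical usage: `bands` is an (H, W, C) multispectral cube; the
# red and NIR channel positions depend on the specific camera used.
# bands = load_multispectral_image("scene.tif")   # placeholder loader
# index_image = ndvi(bands[..., 3], bands[..., 2])
```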
Deep learning techniques are commonly used to tackle a variety of computer vision problems, including recognition, segmentation, and classification from RGB images. With a diverse range of sensors available, industry-specific datasets are acquired to address specific challenges. These datasets span multiple modalities, meaning the images have different channel counts and pixel values with different interpretations. Applying deep learning methods to achieve optimal results on such multimodal data is a complicated procedure. One feasible approach to enhancing classification performance in this scenario is data fusion. Data fusion aims to use all the information available from all sensors and integrate it to obtain an optimal outcome. This paper investigates early fusion, intermediate fusion, and late fusion in deep learning models for bulky waste image classification. A multimodal dataset consisting of RGB, hyperspectral near-infrared (NIR), thermography, and terahertz images of bulky waste is used to train and evaluate the models. The results show that, for this dataset, multimodal sensor fusion can enhance classification accuracy compared to a single-sensor approach, with late fusion performing best at an accuracy of 0.921 on our test data, ahead of intermediate and early fusion.
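To make the late-fusion setup concrete, below is a minimal decision-level fusion sketch in PyTorch. The modality names and the probability-averaging rule are illustrative assumptions; the abstract does not prescribe a specific combination rule or architecture.

```python
import torch
import torch.nn as nn

class LateFusionClassifier(nn.Module):
    """Decision-level (late) fusion: one classifier per modality,
    combined only after each produces class scores."""

    def __init__(self, modality_models: dict[str, nn.Module]):
        super().__init__()
        # Each entry maps a modality name (e.g. "rgb", "nir", "thermo",
        # "terahertz") to a network emitting class logits for that input.
        self.models = nn.ModuleDict(modality_models)

    def forward(self, inputs: dict[str, torch.Tensor]) -> torch.Tensor:
        # Average the per-modality class probabilities; majority voting
        # or learned weights are common alternatives at this stage.
        probs = [torch.softmax(self.models[name](x), dim=1)
                 for name, x in inputs.items()]
        return torch.stack(probs, dim=0).mean(dim=0)
```

By contrast, early fusion would concatenate the raw channels into one input tensor, and intermediate fusion would merge per-modality feature maps inside a shared network before the classification head.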
The choice of an appropriate illumination design is one of the most important steps in creating a successful machine vision system for automated inspection tasks. In a popular technique, multiple inspection images are captured under angularly varying illumination directions over the hemisphere, which yields a set of images referred to as an illumination series. However, most existing approaches are restricted in that they use rather simple patterns, such as point- or sector-shaped illumination patterns on the hemisphere. In this paper, we present an illumination technique which reduces the effort of capturing inspection images for each reflectance feature by using linear combinations of basis light patterns over the hemisphere as feature-specific illumination patterns. The key idea is to encode linear functions for feature extraction as angular-dependent illumination patterns, and thereby to compute linear features of the scene's reflectance field directly in the optical domain. In the experimental part, we evaluate the proposed illumination technique on the problem of optical material-type classification of printed circuit boards (PCBs).
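The optical-domain computation can be illustrated with a small numerical sketch: because illumination contributions add linearly at the sensor, a linear feature f = wᵀr of the reflectance series can be obtained from just two exposures whose light patterns realize the positive and negative parts of w. The shapes and random data below are purely illustrative, not the authors' actual setup.

```python
import numpy as np

# Assume `reflectance` holds, per pixel, the scene response r_k to each
# of K basis light directions on the hemisphere (shape: H x W x K), and
# `w` is the K-dim weight vector of a linear feature f = w^T r.
rng = np.random.default_rng(0)
H, W, K = 4, 4, 16
reflectance = rng.random((H, W, K))
w = rng.standard_normal(K)

# Conventional route: capture K images, then weight them in software.
feature_digital = reflectance @ w

# Optical route: light adds linearly, so split w into its non-negative
# parts, realize each part as one hemisphere illumination pattern, and
# take the difference of only two captured images.
w_pos, w_neg = np.clip(w, 0, None), np.clip(-w, 0, None)
img_pos = reflectance @ w_pos  # one exposure under pattern w_pos
img_neg = reflectance @ w_neg  # one exposure under pattern w_neg
feature_optical = img_pos - img_neg

assert np.allclose(feature_digital, feature_optical)
```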