This PDF file contains the front matter associated with SPIE Proceedings volume 11397, including the Title Page, Copyright information and Table of Contents
Access to the requested content is limited to institutions that have purchased or subscribe to SPIE eBooks. You are receiving this notice because your organization may not have SPIE eBooks access.* (*Shibboleth/Open Athens users: please sign in to access your institution's subscriptions.) To obtain this item, you may purchase the complete book in print or electronic format on SPIE.org.
Pattern-projection-based 3D sensors are widely used for contactless, non-destructive optical 3D shape measurements. In previous work, we presented 3D measurement systems based on stereo matching between two cameras with GOBO-projected aperiodic fringe patterns. In this contribution, we demonstrate a method to optimize the projection patterns for high measurement robustness, i.e., high completeness of the resulting point cloud with a low probability of outliers. To calculate the 3D coordinates of an object point by triangulation, a pixel correspondence between the two cameras must be found. The search for such pixel correspondences can be divided into two parts: a coarse correspondence search and a sub-pixel-accurate refinement. The former is responsible for the completeness and correctness of the 3D result, while the quality of the latter determines the accuracy. The correctness of the correspondence search depends on the ability of the projection pattern to uniquely encode each point on the measurement object. If the pattern is highly self-similar, the points are not well distinguishable from each other and there is a high probability of mismatches during the correspondence search. We introduce a mathematical measure to evaluate the self-similarity of a GOBO-projected fringe pattern. This measure operates on patterns that we simulate with a simplified 1D model. Based on this measure and its derivatives, we developed an algorithm to optimize the fringe patterns. We compare results achieved with unoptimized and optimized fringe patterns.
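One way to picture such a self-similarity measure is to score a simulated 1D pattern by its largest off-peak autocorrelation; a minimal sketch follows. The pattern model (a few sinusoids with random phases), the shift range, and the correlation-based score are illustrative assumptions, not the paper's exact measure.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 1D pattern model standing in for a GOBO-projected
# aperiodic fringe sequence: a few sinusoids with random phases.
x = np.arange(512)
pattern = sum(np.cos(2 * np.pi * f * x / 512 + rng.uniform(0, 2 * np.pi))
              for f in (7, 11, 17))

def self_similarity(p, min_shift=5):
    """Largest normalized autocorrelation at any shift >= min_shift.

    Lower values mean the pattern encodes positions more uniquely,
    so mismatches during the correspondence search become less likely.
    """
    p = (p - p.mean()) / p.std()
    best = 0.0
    for s in range(min_shift, len(p) // 2):
        c = abs(np.corrcoef(p[:-s], p[s:])[0, 1])
        best = max(best, c)
    return best

score = self_similarity(pattern)
```

An optimizer in this spirit would perturb the pattern parameters to drive the score down, making every window of the pattern as distinguishable as possible.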
To provide a holistic and complete dataset of sheet-bulk metal-formed parts, optical acquisition with a fringe projection system is useful. By combining several sensors with varying measuring ranges, the workpiece is captured with a spatial resolution adapted to the forming zone and the geometric requirements. The sensors are registered by a two-dimensional point calibration and registration process, which has already been presented in [1]. In this article, the calibration procedure is transferred to a measurement setup with three fringe projection sensors of different measurement resolution. A calibration plate with dot patterns is applied to the positioning unit, which in this setup is a high-precision hexapod. This extended calibration plate allows continuous control of the registration quality, as the sensors are able to capture the calibration plate at any time. To determine the positioning accuracy of the hexapod, a procedure following DIN EN ISO 10360-3 with three spheres mounted on the hexapod was investigated. For this purpose, the hexapod was fixed on a Zeiss UPMC 1200 coordinate measuring machine and moved in all six degrees of freedom. After each movement, the spheres were measured with a probe, which allows the absolute positioning precision to be calculated. Comparison of the measurements with the specified translation and rotation movements of the hexapod shows a positioning precision of better than 10 µm in the positioning directions x and y.
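The positioning-precision evaluation described above amounts to comparing commanded hexapod moves against CMM-measured positions. A minimal sketch, with invented placeholder values in place of the paper's measurement data:

```python
import numpy as np

# Hypothetical data: commanded hexapod translations and the positions
# measured by the CMM probe after each move (values invented purely
# for illustration; they are not the paper's results).
commanded = np.array([[10.0, 0.0, 0.0],
                      [0.0, 10.0, 0.0],
                      [10.0, 10.0, 0.0]])                  # mm
measured = commanded + np.array([[0.004, -0.002, 0.001],
                                 [-0.003, 0.005, -0.002],
                                 [0.006, -0.004, 0.003]])  # mm

# Per-axis positioning error, converted to micrometres
errors_um = np.abs(measured - commanded) * 1000.0
max_xy_error = errors_um[:, :2].max()   # worst error in x and y
```

With the placeholder numbers above, the worst in-plane deviation is 6 µm, i.e., within the sub-10 µm precision reported for x and y.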
In this work, we developed a multimodal imaging system for real-time applications by integrating 2D image sensors in different spectral ranges, as well as a polarization camera, into a high-speed optical 3D sensor. To generate the multimodal image data, a pixel-level alignment of 2D images in different modalities to the 3D data is realized by applying projection matrices to each point in the 3D point cloud. To calculate the projection matrices for each 2D image sensor, a calibration procedure is proposed for the extrinsic calibration of arbitrarily positioned image sensors. The final imaging system delivers multimodal video data with one-megapixel resolution at a frame rate of 30 Hz. As application examples, we demonstrate the estimation of vital signs and the detection of human body parts with this imaging system.
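The pixel-level alignment step can be sketched as applying a 3x4 projection matrix to each 3D point and dividing by depth. The intrinsics K and extrinsics [R|t] below are invented placeholders, not the calibrated values of the described system:

```python
import numpy as np

# Assumed pinhole intrinsics and identity extrinsics, for illustration only
K = np.array([[1000.0, 0.0, 512.0],
              [0.0, 1000.0, 384.0],
              [0.0, 0.0, 1.0]])
R = np.eye(3)
t = np.zeros((3, 1))
P = K @ np.hstack([R, t])            # 3x4 projection matrix

points = np.array([[0.1, -0.05, 1.0],
                   [0.0, 0.0, 2.0]])                 # 3D points (metres)
homog = np.hstack([points, np.ones((len(points), 1))])
proj = (P @ homog.T).T
pixels = proj[:, :2] / proj[:, 2:3]  # perspective divide -> pixel coords
```

Each 2D modality gets its own matrix P from the extrinsic calibration, so every 3D point can look up its color, thermal, or polarization value at the projected pixel.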
Optical 3D measurement using active pattern projection is well known for its high precision and high 3D point density. Recently, increasing the reconstruction frame rate and the number of active sensors in simultaneous, continuous operation, as used in sensor networks, has become more important. Traditionally, light modulators such as LCoS, DMD, or GOBO (GOes Before Optics) have been used, which generate the projected pattern by blocking the light in the dark areas of the pattern. To further increase the measurement speed and/or the number of time-sequential, continuously active sensors, brighter light sources must be chosen to achieve sufficiently short exposure times. Alternatively, as we show in this paper, a more efficient pattern modulator can be used. By applying an optical freeform element to generate an aperiodic sinusoidal fringe pattern, up to 100 % of the available light can be utilized. In our prototype, we show how to employ a freeform element moved in a linear bearing to create a compact, low-cost, high-speed projection unit. Furthermore, to reduce the computational burden of processing numerous simultaneous image streams, we have implemented the rectification step of the 3D reconstruction pipeline on the sensor module's field-programmable gate array (FPGA). Both approaches enable us to use structured light sensors for continuous high-speed 3D measurement tasks in industrial quality control. The presented prototype utilizes a single irritation-free near-infrared (NIR) LED to illuminate and reconstruct a measurement field of approximately 300 mm × 300 mm at a measurement distance of 500 mm.
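The aperiodic sinusoidal fringe sequence can be pictured as a chirped sinusoid whose local period varies across the field, with the moving freeform element translating the pattern between frames. This is only a conceptual sketch under assumed chirp coefficients; the paper's actual freeform design is not specified here:

```python
import numpy as np

# Normalized coordinate across the projection field
x = np.linspace(0.0, 1.0, 1024)

def fringe(shift):
    """Intensity of one projected frame for a given element shift.

    The chirped phase (coefficients 20 and 15 are illustrative, not from
    the paper) makes the fringe period spatially varying, i.e. aperiodic.
    Because the modulation is purely sinusoidal, no light is blocked.
    """
    phase = 2 * np.pi * (20.0 * (x + shift) + 15.0 * (x + shift) ** 2)
    return 0.5 + 0.5 * np.cos(phase)   # intensity normalized to [0, 1]

# Eight frames of the time-sequential pattern, one per element position
frames = np.stack([fringe(s) for s in np.linspace(0.0, 0.05, 8)])
```

Unlike a DMD or GOBO wheel that discards light in dark regions, this modulation redistributes it, which is what allows nearly all of the LED output to reach the scene.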
The use of interferometry for very high accuracy areal mapping of the flatness of optical surfaces is a well-known, high-performance tool. However, it is difficult to make a high-performance measurement over the full area of a diffuse or scattering surface, such as a machined metal or the metal powder used in manufacturing. This challenge is generally addressed with scanning methods using laser lines, confocal or other point scanners, or mechanical means such as coordinate measuring machines. Data collection with these methods can be very slow, due in part to the flexibility they provide to measure a large change in contour, which is not critical to most flatness applications. Previously, we reported on a moiré method for diffuse or powder surfaces that is not unlike the Ronchi test used in the grinding of glass. This paper discusses some of the challenges of making this type of measurement and suggests a new hybrid method that trades off the capability of providing profiles over relatively large measurement ranges for very high sensitivity to errors from the ideal flat surface.
This paper introduces a novel uniaxial fringe projection profilometry (FPP) method called active shape from projection defocus profilometry (ASPDP), which utilizes sharpness analysis of binary fringe patterns to quantify the defocus level. Compared with previous uniaxial FPP methods, ours is the first to utilize a pinhole defocus model to account for three-dimensional reconstruction. Since a defocused fringe pattern can be modeled as the original pattern convolved with a point spread function (PSF), the pixel-wise defocus level can be quantified from the PSF kernel using temporal Fourier analysis. In this research, calibration is achieved using a mechanical translation device, and the defocus-depth relationship is established by rational polynomial fitting. The experiment demonstrates that this method can provide an accurately reconstructed 3D geometry without shadows.
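The core idea, a binary fringe blurred by a PSF whose width tracks depth, can be sketched as follows. The Gaussian PSF, the sigma values, and the spectral sharpness metric are illustrative assumptions; they stand in for, but are not, the paper's pinhole defocus model and temporal Fourier analysis:

```python
import numpy as np

x = np.arange(256)
binary = (np.sin(2 * np.pi * x / 16) > 0).astype(float)  # binary fringe, 16 cycles

def defocus(img, sigma):
    """Blur the fringe with a Gaussian PSF of width sigma (assumed model)."""
    k = np.exp(-0.5 * (np.arange(-15, 16) / sigma) ** 2)
    return np.convolve(img, k / k.sum(), mode="same")

def sharpness(img):
    """Fraction of spectral energy above the fundamental fringe frequency.

    Higher harmonics of the binary (square-wave) fringe survive only when
    the pattern is close to focus, so this ratio drops as defocus grows.
    """
    spec = np.abs(np.fft.rfft(img - img.mean())) ** 2
    return spec[20:].sum() / spec.sum()

s_near = sharpness(defocus(binary, 0.8))   # mild blur: near focus
s_far = sharpness(defocus(binary, 4.0))    # strong blur: far from focus
```

Inverting a calibrated sharpness-versus-depth curve (the paper uses rational polynomial fitting) then yields a depth estimate per pixel without needing a second viewing axis.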
Three-dimensional (3D) topology data obtained from different optical metrology techniques tend to exhibit local disagreements, which may lead to incorrect judgements by inspectors, especially in precision metrology scenarios. This research explores statistical methods to provide a functional similarity score. The investigation is conducted using two statistical methods (Pearson's correlation coefficient and image distance), two optical techniques (structured light and focus variation microscopy), and two application scenarios (metal additive printing and ballistic forensic examination). Experimental results show the promise of using statistical tools to assist binary matching/non-matching decisions even if the 3D topology data are obtained from different optical techniques.
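Of the two statistical methods named, Pearson's correlation coefficient is straightforward to sketch on height maps. The synthetic surfaces below are invented stand-ins for data from two instruments (one pair matching up to noise, one pair unrelated):

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative stand-ins for the same surface measured by two techniques:
# topography_b is topography_a plus small instrument noise.
topography_a = rng.normal(size=(64, 64))
topography_b = topography_a + 0.1 * rng.normal(size=(64, 64))
topography_c = rng.normal(size=(64, 64))     # an unrelated surface

def pearson(a, b):
    """Pearson correlation coefficient between two height maps."""
    return np.corrcoef(a.ravel(), b.ravel())[0, 1]

match_score = pearson(topography_a, topography_b)       # near 1
nonmatch_score = pearson(topography_a, topography_c)    # near 0
```

A threshold on such a score is one way to support the binary matching/non-matching decision; in practice the maps must first be registered and resampled onto a common grid.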
The implementation and generation of synthetic data for testing algorithms in optical metrology are often difficult to reproduce. In this work, we propose a framework for the generation of reproducible synthetic surface data. We present two case studies using the Code Ocean platform, which is based on Docker and Linux container technologies to turn source code repositories into executable images. i) We simulate interference fringe images as acquired by a Michelson interferometric system, with reflectivity that changes due to surface topography and roughness. ii) We simulate phase maps from rough isotropic surfaces, where the phase data is simultaneously corrupted by noise and phase dislocations. This method relies on Gaussian-Laplacian pyramids to preserve surface features at different scales. The proposed framework enables reproducible surface data simulations, which could increase the impact of algorithm development in optical metrology.
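For the first case study, the basic fringe-formation step can be sketched with the standard two-beam interference law for a Michelson setup, where a height map h modulates the intensity as I = a + b·cos(4πh/λ). The tilted-plane topography, wavelength, and contrast values are illustrative assumptions:

```python
import numpy as np

wavelength = 0.633e-6                 # metres (HeNe wavelength, assumed)
y, x = np.mgrid[0:128, 0:128]
height = 2e-9 * x + 1e-9 * y          # illustrative tilted-plane topography (m)

a, b = 0.5, 0.4                       # background level and fringe modulation
# Two-beam interference: the factor 4*pi/lambda reflects the double pass
# of the measurement arm in a Michelson interferometer.
intensity = a + b * np.cos(4 * np.pi * height / wavelength)
```

A full simulation in the proposed framework would add roughness-dependent reflectivity and detector noise on top of this ideal fringe image; pinning the whole pipeline inside a container image is what makes the result reproducible.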
It has become customary to calibrate a camera-projector pair in a structured light (SL) system as a stereo-vision setup. The 3D reconstruction is carried out by triangulation from the detected point on the camera sensor and its correspondence on the projector DMD. There are several algebraic formulations to obtain the coordinates of the 3D point, especially in the presence of noise. However, it is not clear which triangulation approach is best. In this study, we aimed to determine the most suitable triangulation method for SL systems in terms of accuracy and execution time. We assess different strategies for the case in which both coordinates in the projector are known (point-point correspondence) and the case in which only one coordinate on the DMD is known (point-line correspondence). We also introduce the idea of estimating the second projector coordinate with epipolar constraints. We carried out simulations and experiments to evaluate the differences between the triangulation methods, considering the phase-depth sensitivity of the system. Our results show that under suboptimal phase-depth sensitivity conditions, the triangulation method does influence the overall accuracy. Therefore, the system should be arranged for optimal phase-depth sensitivity so that any triangulation method ensures the same accuracy.
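One classic point-point triangulation strategy (one candidate among the formulations compared, not necessarily the paper's preferred one) is the midpoint method: intersect the camera and projector viewing rays by taking the midpoint of the shortest segment between them. The two-ray geometry below is invented for the demonstration:

```python
import numpy as np

def midpoint_triangulation(o1, d1, o2, d2):
    """Midpoint of the shortest segment between two viewing rays.

    Each ray is parameterized as o + s*d. Solving the 2x2 normal
    equations gives the parameters s, t of the mutually closest points.
    """
    d1, d2 = d1 / np.linalg.norm(d1), d2 / np.linalg.norm(d2)
    b = o2 - o1
    a11, a12, a22 = d1 @ d1, d1 @ d2, d2 @ d2
    denom = a11 * a22 - a12 ** 2          # zero only for parallel rays
    s = (a22 * (d1 @ b) - a12 * (d2 @ b)) / denom
    t = (a12 * (d1 @ b) - a11 * (d2 @ b)) / denom
    return 0.5 * ((o1 + s * d1) + (o2 + t * d2))

# Camera at the origin looking along +z; projector offset along x, with
# rays constructed to intersect exactly at (0, 0, 1) (geometry invented)
p = midpoint_triangulation(
    np.array([0.0, 0.0, 0.0]), np.array([0.0, 0.0, 1.0]),
    np.array([0.2, 0.0, 0.0]), np.array([-0.2, 0.0, 1.0]))
```

With noisy correspondences the rays no longer intersect, and this is where the algebraic formulations (midpoint, linear DLT, and others) start to differ, which is the comparison the study carries out.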
A universal pixel-by-pixel distortion-free camera calibration technique is described. All camera lenses cause image distortion. An LCD flat panel can be used as an active calibration panel for camera calibration. Each sensor pixel has its own ray vector in space and can be calibrated with the LCD. A set of phase-shifting fringes (PSF) can be used to establish the mapping relationship between the LCD pixels and the sensor pixels. For primary camera calibration, a virtual sensor can be created on the LCD and a set of inverse mapping parameters (IMP) for each virtual pixel can be determined. The captured images can then be rectified by resampling with the IMP. The output images will be distortion-free, with zero geometric distortion and zero chromatic aberration. For advanced camera calibration, all pixel ray vectors in space can be calibrated. A virtual sensor can be created on any expected planar or curved surface in space, and the IMP can be determined accordingly. After image rectification or 3D reconstruction, for every 2D pixel or 3D point in the cloud, the mean error will be 0 and the standard deviation of the error will be 1/1,000 of the pixel pitch or smaller. The cameras can then be used as non-contact 2D/3D rulers. This distortion-free calibration technique can be applied to any cameras and projectors, no matter how complex their lens structures may be. Interactive and comprehensive intensity calibrations can be made between the LCD, cameras, and projectors.
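The LCD-to-camera mapping via phase-shifting fringes can be sketched with the standard four-step decoding formula: four fringe images shifted by 90 degrees yield the wrapped phase at each camera pixel, which encodes the corresponding LCD coordinate. The fringe period and the noise-free simulated images below are illustrative, not the paper's setup:

```python
import numpy as np

x = np.arange(640)
true_phase = 2 * np.pi * x / 64.0   # phase pattern displayed by the LCD

# Four fringe images with 90-degree phase shifts, as a camera would see
# them (ideal, noise-free simulation for illustration)
shifts = [0.0, np.pi / 2, np.pi, 3 * np.pi / 2]
images = [0.5 + 0.4 * np.cos(true_phase + s) for s in shifts]

# Standard 4-step phase-shifting formula: the background and modulation
# terms cancel, leaving only the wrapped phase at every pixel.
wrapped = np.arctan2(images[3] - images[1], images[0] - images[2])
```

After phase unwrapping (and a second fringe set rotated 90 degrees for the other axis), each camera pixel knows which LCD pixel it observes, which is the dense correspondence the inverse mapping parameters are fitted to.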
One of the most challenging applications of optical metrology methods is the measurement of large parts. These are parts that may be several meters in size, such as those used in large machinery or structures such as vehicles, but that still require a reliable measurement to achieve optimal performance and operational characteristics. A typical resolution in such applications may only be tenths of a millimeter, but the relationship of one point to another needs to be known over many meters to this same resolution. Factors such as changes in the environment, warpage of the part, and even drift in any holding fixtures can all complicate obtaining a reliable measurement of a large part. This paper explores both the methods and the challenges typical in such applications, with examples of potential means to mitigate these challenges.
Remote sensing of the Earth makes it possible to obtain medium- and high-spatial-resolution information and hyperspectral measurements from spacecraft. This paper studies the percolation processes in the underlying surface of sites where household and industrial waste is distributed. An algorithm for constructing a percolation threshold is proposed, using the method of labeling clusters. A mathematical model of the percolation process has been developed. This technique is directly related to the physicochemical processes in the landfill, which can be estimated from space monitoring data. The purpose of the work is to develop and propose a methodology for assessing the percolation parameters of landfills from space images. The results of the proposed algorithm are shown using the example of the Salaryevo solid household and industrial waste landfill (Leninsky district of the Moscow region).
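The cluster-labeling approach to a percolation threshold can be sketched on a random binary grid: a cell is "occupied" above some density, and the system percolates when an occupied cluster spans the grid. The BFS labeling and the trial densities below are a generic textbook stand-in, not the paper's specific model of landfill imagery:

```python
import numpy as np
from collections import deque

rng = np.random.default_rng(2)

def percolates(grid):
    """True if an occupied cluster connects the top row to the bottom row.

    Plain BFS flood fill; a minimal stand-in for a cluster-labeling method.
    """
    rows, cols = grid.shape
    seen = np.zeros_like(grid, dtype=bool)
    q = deque((0, c) for c in range(cols) if grid[0, c])
    for _, c in list(q):
        seen[0, c] = True
    while q:
        r, c = q.popleft()
        if r == rows - 1:
            return True
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr, nc] and not seen[nr, nc]:
                seen[nr, nc] = True
                q.append((nr, nc))
    return False

def perc_fraction(p, trials=20, size=32):
    """Fraction of random grids at occupation density p that percolate."""
    return np.mean([percolates(rng.random((size, size)) < p)
                    for _ in range(trials)])

low, high = perc_fraction(0.4), perc_fraction(0.75)   # below/above threshold
```

Sweeping the density p and locating where this fraction jumps from near 0 to near 1 gives an empirical percolation threshold; in the paper's setting, the occupancy grid would come from classified space-image pixels rather than random draws.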