Fringe projection profilometry (FPP) has been widely used in various fields because of its high precision and efficiency. However, for highly reflective object surfaces, the fringe images captured by traditional FPP contain overexposed areas, which leads to errors in the absolute phase and in the three-dimensional (3D) reconstruction results. A robust 3D reconstruction technique based on a ternary polarization-state coding strategy is proposed to address these errors. In this method, polarization coding patterns are used instead of light-intensity coding patterns. The S1 image is then calculated from the polarization images captured by the camera, and the Stokes parameter is used to segment the fringe codewords without a specially designed threshold. The proposed method enhances the stability of the fringes, improves their quality, and reduces the number of fringe patterns required. Experimental results verify that the proposed method is more robust than the conventional method when measuring objects with highly reflective surfaces. The results show that the proposed method can suppress the effect of highlight areas in the reconstruction of highly reflective surfaces, obtain accurate absolute phases, and reliably reconstruct the 3D point cloud.
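A minimal sketch of the codeword segmentation step, assuming the camera provides intensity images behind 0° and 90° analyzer orientations (the names `i_0`, `i_90` and the optional dead band are illustrative and not taken from the paper): the sign of S1 = I0 − I90 separates the codes without an intensity threshold.

```python
import numpy as np

def ternary_codewords(i_0, i_90, dead_band=None):
    """Segment ternary polarization codewords from two analyzer images.

    i_0, i_90 : intensity images behind 0-deg and 90-deg analyzers.
    The Stokes parameter S1 = I0 - I90 changes sign with the projected
    polarization state, so its sign (not an intensity threshold) separates
    the codewords. `dead_band` optionally marks near-zero pixels as the
    third (unmodulated / background) code.
    """
    s1 = i_0.astype(np.float64) - i_90.astype(np.float64)
    codes = np.where(s1 > 0, 1, -1)
    if dead_band is not None:
        codes[np.abs(s1) < dead_band] = 0
    return s1, codes
```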
Optical measurement and perception technology is widely used in the field of smart energy. Accurate calibration of the camera's intrinsic and extrinsic parameters is essential for applying an optical system to three-dimensional (3D) reconstruction and geometric measurement. The mainstream camera calibration methods, such as Zhang's and Tsai's, estimate the distortion parameters together with the camera's intrinsic parameters. For long-focal-length, narrow-field-of-view cameras with little perspective distortion, this coupled estimation can yield inaccurate calibration parameters and increase computation time. To improve the calibration accuracy of long-focal-length cameras, we propose an efficient, noniterative calibration method based on the relationship between the vanishing-point coordinates and the coefficient of the first-order single-parameter division distortion model, i.e., on a model that separates the radial distortion from the projective mapping. Using the subpixel corner coordinates and the corresponding 3D checkerboard points, the spatial point correspondence is solved to obtain the homography matrix and complete the calculation of the camera's intrinsic parameters. Our work has potential applications in photovoltaic troubleshooting and intelligent inspection, and it may also contribute to the practical application of such sensors in intelligent energy.
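For reference, a minimal sketch of the first-order single-parameter division distortion model mentioned above; the vanishing-point-based estimation of the coefficient `k` is not reproduced here, and the function signature is only illustrative.

```python
import numpy as np

def undistort_division(points_px, center, k):
    """Map distorted pixel coordinates to undistorted ones with the
    single-parameter division model: p_u = c + (p_d - c) / (1 + k * r_d^2),
    where r_d is the distance of the distorted point from the distortion
    center c, and k is the first-order division coefficient.
    """
    center = np.asarray(center, dtype=np.float64)
    p = np.asarray(points_px, dtype=np.float64) - center
    r2 = np.sum(p**2, axis=1, keepdims=True)
    return center + p / (1.0 + k * r2)
```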
Multicamera systems are commonly applied to large field-of-view (FOV) measurements. However, in some scenes two cameras may have non-overlapping FOVs, and traditional binocular calibration methods cannot directly calibrate such cameras. To solve this problem, our study proposes a calibration method for binocular cameras with non-overlapping FOVs based on planar mirrors. Exploiting the reflection property of an optical planar mirror, the proposed method can use the same target to calibrate non-overlapping cameras, thereby overcoming the limitation that such cameras cannot observe a common target. Experimental results show that the maximum RMS error does not exceed 0.53 mm. The proposed method is therefore effective, its measurement procedure is simpler and more universal than those of other methods, and it is applicable to a wide range of measurements.
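A small sketch of the geometric relation such mirror-based calibration builds on, assuming the mirror plane is parameterized by a unit normal `n` and distance `d` (the paper's own estimation of these plane parameters is not shown).

```python
import numpy as np

def mirror_transform(n, d):
    """4x4 homogeneous reflection about the plane n . X = d (n a unit normal).

    Reflecting the target points (equivalently, the virtual camera seen in
    the mirror) uses X' = (I - 2 n n^T) X + 2 d n; chaining this reflection
    with standard pose estimation relates the two non-overlapping cameras.
    """
    n = np.asarray(n, dtype=np.float64).reshape(3)
    n = n / np.linalg.norm(n)
    R = np.eye(3) - 2.0 * np.outer(n, n)   # Householder reflection matrix
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = 2.0 * d * n
    return T
```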
Fringe projection technology is often used for three-dimensional (3D) measurement, but highlight (specular) surfaces are difficult to measure. Polarization systems are usually used to remove highlights: polarizing filters can eliminate the highlights in an image, but they may also make the image too dark and degrade the measurement accuracy, while ensuring the accuracy in other ways increases the operational complexity of the polarization system. A 3D measurement method based on the camera intensity response function under polarization is proposed. By establishing the camera's intensity response function under the polarization system, the method avoids the complicated polarized bidirectional reflectance distribution function model and directly and quantitatively calculates the required angle between the transmission axes of the two polarizing filters. It is then combined with an image fusion algorithm to generate the optimal fringe pattern. Experimental results demonstrate that this method effectively eliminates the effects of highlights in the image: the blurred transition area between black and white fringes is reduced, the edge information of the fringes is correctly restored, and the high signal-to-noise ratio and contrast of the image are retained even with the polarizing filters in place.
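As a rough illustration of the angle calculation, the textbook relation between the two polarizers' transmission axes is Malus's law, I(θ) = I_max·cos²θ; the paper additionally folds the camera's intensity response into this, so the sketch below (with a hypothetical `attenuation` target) shows only the underlying relation.

```python
import numpy as np

def required_analyzer_angle(attenuation):
    """Angle (degrees) between the two polarizers' transmission axes that
    attenuates the polarized (specular) component to `attenuation` times its
    maximum, via Malus's law I(theta) = I_max * cos^2(theta)."""
    return np.degrees(np.arccos(np.sqrt(np.clip(attenuation, 0.0, 1.0))))

# e.g. keeping 10% of the specular intensity:
# required_analyzer_angle(0.10)  -> about 71.6 degrees
```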
Accurately obtaining the phase is critical in a digital fringe projection system. When a coded fringe pattern is projected in a complex environment, stray light can affect the projected light intensity; as a result, the fringe intensity captured by the camera may be corrupted, degrading the measurement precision. Hence, we propose a polarization-based technology for strongly suppressing stray light. By processing the fringe images in the frequency domain, this technology effectively suppresses the interference of stray light with the three-dimensional (3D) imaging system. The method is verified experimentally, and the results show that it can accurately capture the 3D profile of real-world targets under stray-light interference.
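The abstract does not specify the filter design; the sketch below is one plausible frequency-domain cleanup, assuming the fringes have a known horizontal carrier frequency and that the stray-light contribution is slowly varying (both assumptions, not claims from the paper).

```python
import numpy as np

def suppress_stray_light(img, f_carrier, half_width):
    """Keep only a band of horizontal spatial frequencies around the fringe
    carrier `f_carrier` (cycles per image width) plus the DC term of a 2D
    fringe image, discarding the rest as stray-light / background content.
    """
    img = img.astype(np.float64)
    h, w = img.shape
    F = np.fft.fftshift(np.fft.fft2(img))
    u = np.fft.fftshift(np.fft.fftfreq(w)) * w        # horizontal frequency axis
    keep_cols = np.abs(np.abs(u) - f_carrier) <= half_width
    mask = np.zeros((h, w), dtype=bool)
    mask[:, keep_cols] = True
    mask[h // 2, w // 2] = True                       # retain DC (mean level)
    return np.real(np.fft.ifft2(np.fft.ifftshift(F * mask)))
```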
In a structured-light three-dimensional measurement system, calibration is crucial for measurement accuracy. However, conventional projector calibration methods involve complicated procedures, so a novel projector calibration method is developed. The key concept is to establish the relationship between projector coordinates and world coordinates using a projected checkerboard, which allows the projector to be calibrated in the same way as a camera. However, when the projected checkerboard is captured, its edges form highlight regions of varying intensity that affect corner extraction and lower the accuracy of the projector calibration. Therefore, the projected checkerboard is captured by a CCD camera equipped with a polarizer set to the optimal polarization angle. A subpixel corner extraction algorithm based on homography mapping is then used to extract the subpixel coordinates of the corners. Finally, the projector is calibrated with the same method as the camera. In the experiments, the validity of the method is verified via reprojection.
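A minimal OpenCV-based sketch of the "calibrate the projector like a camera" idea: corners detected in the camera image are transferred into projector pixel coordinates through a homography, after which `cv2.calibrateCamera` can be run on the projector's point correspondences. The function and argument names are illustrative, not the paper's.

```python
import cv2
import numpy as np

def corners_in_projector(cam_corners, cam_ref_pts, proj_ref_pts):
    """Map checkerboard corners detected in the camera image into projector
    pixel coordinates via a homography estimated from matched reference
    points of the projected pattern (camera vs. projector coordinates).
    """
    H, _ = cv2.findHomography(np.float32(cam_ref_pts), np.float32(proj_ref_pts))
    pts = np.float32(cam_corners).reshape(-1, 1, 2)
    return cv2.perspectiveTransform(pts, H).reshape(-1, 2)
```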
Traditional corner extraction methods focus on improving the corner extraction algorithm while ignoring the imaging process of the calibration image. In this paper, an optimal polarization angle image corner extraction algorithm based on linear polarization feedback is introduced into the camera calibration process. The method is mainly aimed at the highlight regions that are difficult to detect and eliminate in multi-position calibration images captured under natural light. It first uses linear feedback of the Stokes variable through a CCD camera fitted with polarizing plates to obtain corner images of the checkerboard at the optimal polarization angle for different positions in space. A subpixel-level detection algorithm and Gaussian fitting are then used to precisely locate the corner points and solve their subpixel image coordinates. Finally, the two-dimensional pixel coordinates of the corner points in each checkerboard image are extracted.
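As a hedged sketch of the Gaussian-fitting subpixel step, one common closed form interpolates a peak of a (positive) corner-response map from three samples by fitting a parabola to the log of the response; whether the paper uses exactly this form is not stated in the abstract.

```python
import numpy as np

def gaussian_subpixel_peak(r_m1, r_0, r_p1):
    """Subpixel offset of a peak from three positive response samples
    (left neighbour, peak, right neighbour), assuming a locally Gaussian
    response: fitting ln(R) with a parabola gives the closed-form offset.
    Applied once along x and once along y around each integer corner.
    """
    lm, l0, lp = np.log(r_m1), np.log(r_0), np.log(r_p1)
    return 0.5 * (lm - lp) / (lm - 2.0 * l0 + lp)
```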
KEYWORDS: Color vision, Digital cameras, LED lighting, Transform theory, Digital imaging, Calibration, Imaging systems, Visual process modeling, Machine vision
Color is often used to simplify object extraction and identification in color-based machine vision systems. However, the image colors produced by a color vision system depend strongly on the lighting geometry, the illumination color, and the spectral response function of the digital camera: even a small variation in illumination or a change of digital camera can alter the image colors dramatically. In this paper, color correction is performed for our color vision measurement system. The mapping coefficient matrix is obtained by a polynomial regression model under artificial D65 illumination and LED array illumination. The correction accuracies are compared between two commonly used device-independent color spaces (the sRGB color space and the CIE L*a*b* color space). The sRGB color space is recommended because of its higher accuracy and simpler algorithm. The corrected images illustrate the usefulness of our method for color correction.
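A minimal sketch of a polynomial-regression color correction, assuming a second-order term set and a captured color chart with known reference values; the exact polynomial basis and chart used in the paper are not specified here.

```python
import numpy as np

def poly_features(rgb):
    """Second-order polynomial expansion of camera RGB (one common choice)."""
    r, g, b = rgb[:, 0], rgb[:, 1], rgb[:, 2]
    ones = np.ones_like(r)
    return np.stack([ones, r, g, b, r*g, r*b, g*b, r*r, g*g, b*b], axis=1)

def fit_color_correction(cam_rgb, ref_rgb):
    """Least-squares coefficient matrix mapping expanded camera RGB to the
    reference (e.g. sRGB) values of a colour chart captured under the
    working illumination. Apply with: poly_features(new_rgb) @ M
    """
    A = poly_features(np.asarray(cam_rgb, dtype=np.float64))
    M, *_ = np.linalg.lstsq(A, np.asarray(ref_rgb, dtype=np.float64), rcond=None)
    return M
```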
This paper presents a technique for choosing an appropriate light source to maximize the contrast between object and background surfaces in color vision applications. Starting from the physics of color image formation, three parameters that affect the signal generated by a color digital camera are investigated. An optimal color illumination for enhancing color contrast can be found by maximizing the discrimination between the spectral reflectances of these surfaces, which is estimated using the average color difference in the CIELAB color space. A printed color patch with seven differently colored characters was used to demonstrate the approach. For each colored character, the Light Emitting Diode (LED) illumination that maximizes the discriminability was selected, which is more suitable than D65 illumination. These experiments illustrate the usefulness of properly chosen color illumination in color vision applications.
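A small sketch of the selection criterion, assuming the object and background pixels have already been converted to CIELAB for each candidate LED; the CIE76 distance and the averaging choice here are assumptions for illustration.

```python
import numpy as np

def mean_delta_e(lab_object, lab_background):
    """CIE76 colour difference between the mean object colour and the mean
    background colour (both given as Nx3 CIELAB arrays)."""
    d = np.mean(lab_object, axis=0) - np.mean(lab_background, axis=0)
    return float(np.linalg.norm(d))

def best_illumination(delta_e_per_led):
    """Pick the LED whose illumination yields the largest object/background
    colour difference; `delta_e_per_led` maps LED name -> mean delta E."""
    return max(delta_e_per_led, key=delta_e_per_led.get)
```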