A study of effectiveness assessment methods for high-orbit infrared remote sensing satellites is presented. By analyzing the difficulties in evaluating the effectiveness of a complex human-in-the-loop satellite-ground system, an assessment criterion centered on the degree of mission accomplishment is proposed. Drawing on the concept of graded equipment technology maturity and combining practical experience, an effectiveness assessment model is established around the construction of mission scenarios at different levels, and a representative assessment process and assessment method are developed as examples. A calculation method based on system reliability, maintainability, and availability is proposed for the comprehensive assessment of mission completion, which can improve on current assessment practice. A generic quantitative method for evaluating system effectiveness based on unfulfilled or unintended tasks is also investigated; it can effectively reveal the bottleneck capacity of the system and support its subsequent optimization and improvement.
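The abstract does not spell out the reliability/maintainability/availability calculation; the sketch below shows one conventional composition (inherent availability per subsystem, series combination across the satellite-ground chain) that such an assessment could build on. The function names and the numbers in the example are illustrative assumptions, not the paper's model.

```python
def steady_state_availability(mtbf_hours: float, mttr_hours: float) -> float:
    """Inherent availability A = MTBF / (MTBF + MTTR), assuming
    exponential failure and repair processes."""
    return mtbf_hours / (mtbf_hours + mttr_hours)

def series_system_availability(availabilities):
    """A serial chain (e.g. satellite payload, downlink, ground
    processing) is available only when every element is available."""
    a = 1.0
    for ai in availabilities:
        a *= ai
    return a

# Example with three hypothetical subsystems in series (MTBF, MTTR in hours).
subsystems = [steady_state_availability(mtbf, mttr)
              for mtbf, mttr in [(5000, 10), (2000, 24), (8000, 4)]]
overall = series_system_availability(subsystems)
```

The series form captures the "bottleneck" intuition of the abstract: the least available element dominates the whole chain.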
We focus on the restoration of ground-based space object adaptive optics (AO) images distorted by atmospheric turbulence. A total variation (TV) blind restoration method for AO images that takes advantage of low-order Gaussian derivative operators is presented. Unlike previous definitions of the TV regularization term, we propose to define the TV prior with Gaussian gradient operators instead of the usual finite-difference gradient operators. Specifically, in each iteration of the alternating minimization used to solve the TV blind deconvolution problem, the first-order Gaussian derivative operator (i.e., gradient of Gaussian) is used to construct the total variation norm of the object image, and the second-order Gaussian derivative operator (i.e., Laplacian of Gaussian) is used to spatially adjust the regularization parameter. Comparative simulation experiments show that this simple improvement is highly practical for ground-based space object images and provides more robust performance in both restoration accuracy and convergence.
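The two Gaussian-derivative ingredients described above can be sketched with `scipy.ndimage`; the weighting formula used here for the spatially adaptive regularization parameter is an assumption for illustration, not the paper's exact rule.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def gaussian_tv_norm(img, sigma=1.0):
    """TV norm built from first-order Gaussian derivatives (gradient of
    Gaussian) rather than finite differences."""
    gx = gaussian_filter(img, sigma, order=(0, 1))  # d/dx
    gy = gaussian_filter(img, sigma, order=(1, 0))  # d/dy
    return float(np.sum(np.sqrt(gx**2 + gy**2)))

def log_weight_map(img, sigma=1.0, eps=1e-3):
    """Second-order Gaussian derivative (Laplacian of Gaussian) used to
    spatially modulate the regularization: weaker smoothing near strong
    LoG responses (edges). The 1/(1+|LoG|) form is illustrative."""
    log = (gaussian_filter(img, sigma, order=(2, 0)) +
           gaussian_filter(img, sigma, order=(0, 2)))
    return 1.0 / (1.0 + np.abs(log) / (np.abs(log).max() + eps))
```

In an alternating-minimization loop, `gaussian_tv_norm` would enter the object-update cost and `log_weight_map` would scale the regularization parameter pixel by pixel.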
Space optical images are inevitably degraded by atmospheric turbulence, optical-system errors, and motion. To recover the true image, a novel nonnegativity and support constraints recursive inverse filtering (NAS-RIF) algorithm is proposed to restore the degraded image. First, image noise is reduced by a Contourlet denoising algorithm. Second, a reliable estimate of the object support region is used to accelerate convergence; we introduce an optimal threshold segmentation technique to improve the support region estimate. Finally, an object constraint and a logarithmic function are added to enhance the algorithm's stability. Experimental results demonstrate that the proposed algorithm increases the PSNR and improves the quality of the restored images, and that it converges faster than the original NAS-RIF algorithm.
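A minimal sketch of the two constraint ingredients that give NAS-RIF its name (nonnegativity plus support), assuming a simple ratio threshold for the support estimate; the Contourlet denoising, the logarithmic stabilization term, and the full recursive inverse filtering update are omitted.

```python
import numpy as np

def estimate_support(img, thresh_ratio=0.1):
    """Crude threshold-based support estimate; the abstract describes
    refining this with an optimal threshold segmentation."""
    return img > thresh_ratio * img.max()

def project_nas_rif(estimate, support_mask, background=0.0):
    """NAS-RIF-style projection: clip to nonnegative values inside the
    support, force the fixed background level outside it."""
    return np.where(support_mask, np.maximum(estimate, 0.0), background)
```

A tighter support mask shrinks the feasible set, which is why a reliable support estimate accelerates convergence.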
Adaptive optics (AO) in conjunction with subsequent postprocessing techniques has markedly improved the resolution of turbulence-degraded images in ground-based astronomical observation and in the detection and identification of artificial space objects. However, important tasks in AO image postprocessing, such as frame selection, stopping iterative deconvolution, and algorithm comparison, commonly need manual intervention and cannot be performed automatically due to the lack of widely agreed-upon image quality metrics. In this work, based on the Laplacian of Gaussian (LoG) local contrast feature detection operator, we propose a LoG domain matching operation to perceive effective and universal image quality statistics. Further, we extract two no-reference quality assessment indices in the matched LoG domain that can be used for a variety of postprocessing tasks. Three typical space object images with distinct structural features are tested to verify the consistency of the proposed metric with perceptual image quality through subjective evaluation.
Atmospheric turbulence-induced wavefront deformation can only be partially corrected by adaptive optics (AO) techniques in astronomical or artificial space object imaging; an accurate estimation of the residual wavefront phase is still needed to approach diffraction-limited resolution. The discrete phase gradients measured by Shack-Hartmann wavefront sensors (SHWFS) can help with this estimation. In this study, we build a dynamic average slopes measurement model for SHWFS in short-exposure AO image postprocessing; the proposed model is based on a zonal representation of the wavefront phase using Bernstein basis polynomials instead of the traditional Zernike modal expansion. Further, the frozen-flow hypothesis of turbulence is adopted to update the initial model with multiframe SHWFS measurement data and achieve a more accurate reconstruction. Numerical experiments show that the reconstruction errors decrease significantly even in poor seeing conditions, and that our method is less sensitive to different SHWFS measurement noise levels.
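The Bernstein zonal representation can be sketched as a tensor-product expansion on the unit square; the dynamic multiframe slope model itself is not reproduced here, and the function names are illustrative.

```python
import numpy as np
from math import comb

def bernstein(i, n, t):
    """Bernstein basis polynomial B_{i,n}(t) on [0, 1]."""
    return comb(n, i) * t**i * (1.0 - t)**(n - i)

def bernstein_surface(coeffs, x, y):
    """Zonal wavefront phase phi(x, y) = sum_ij c_ij B_{i,n}(x) B_{j,m}(y).
    In a reconstruction, the c_ij would be fitted to SHWFS slope data."""
    coeffs = np.asarray(coeffs, dtype=float)
    n, m = coeffs.shape[0] - 1, coeffs.shape[1] - 1
    phi = np.zeros_like(np.asarray(x, dtype=float))
    for i in range(n + 1):
        for j in range(m + 1):
            phi = phi + coeffs[i, j] * bernstein(i, n, x) * bernstein(j, m, y)
    return phi
```

Unlike global Zernike modes, each Bernstein coefficient mainly influences a local region, which is the zonal property the abstract exploits.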
Adaptive optics together with subsequent post-processing techniques markedly improves the resolution of turbulence-degraded images in ground-based space object detection and identification. The most common method for frame selection and for stopping iteration in post-processing has long been subjective viewing of the images, due to the lack of a widely agreed-upon objective quality metric. Full-reference metrics are not applicable to field data, and existing no-reference metrics tend to show poor sensitivity for adaptive optics images. In the present work, based on the Laplacian of Gaussian (LoG) local contrast feature, a nonlinear normalization transforms the input image into a normalized LoG domain; a quantitative index is then extracted in this domain to assess perceptual image quality. Experiments show that this no-reference quality index is highly consistent with subjective evaluation of input images across different blur degrees and iteration numbers.
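One plausible form of the nonlinear normalization described above is divisive normalization of the LoG response by a local energy estimate; the formula and the scalar index below are assumptions for illustration, not the paper's definition.

```python
import numpy as np
from scipy.ndimage import gaussian_laplace, gaussian_filter

def normalized_log(img, sigma=1.5, eps=1e-6):
    """LoG response divided by a smoothed local energy estimate
    (a divisive, hence nonlinear, normalization)."""
    r = gaussian_laplace(np.asarray(img, dtype=float), sigma)
    energy = np.sqrt(gaussian_filter(r**2, 2.0 * sigma)) + eps
    return r / energy

def quality_index(img, sigma=1.5):
    """Illustrative scalar index: spread of the normalized responses."""
    return float(np.std(normalized_log(img, sigma)))
```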
Optical imaging is increasingly important in adaptive optics, and ground-based adaptive optical telescopes play a growing role in detection systems. The volume of images they produce is so large that a method is needed to select good-quality images automatically and save human effort, so image quality evaluation methods and their characteristics have attracted increasing attention. The applicability of a given evaluation method differs with the image degradation model. Researchers have mostly concentrated on improving or building new methods to evaluate degraded images; here we instead study the models and causes of image degradation and the relations between different degraded images and different evaluation methods. In this paper, we build models of uniform noise and impulse noise from their definitions and generate degraded images with these models. We then study six common image quality evaluation methods: the squared-error method, the sum of powers of grey scale, the entropy method, the Fisher function, the Sobel method, and the sum-of-gradients method, and we implement software so that these methods can easily be applied to any input image. We evaluate the image qualities with the different methods and analyze the results, obtaining several important findings: the behaviour of each method on images degraded by uniform noise, its behaviour on images degraded by impulse noise, and the best method, with its characteristics, for images affected by both kinds of noise. These results are important for automatic image selection and will help in managing the images obtained with ground-based adaptive optical telescopes.
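Three of the six metrics listed above are straightforward to sketch; the versions below assume 8-bit grey-scale input and are illustrative implementations, not the paper's code.

```python
import numpy as np
from scipy.ndimage import sobel

def entropy_metric(img, bins=256):
    """Entropy of the grey-level histogram (in bits)."""
    hist, _ = np.histogram(img, bins=bins, range=(0, 256))
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def sum_of_gradients(img):
    """Sum-of-gradients sharpness: total absolute forward differences."""
    img = np.asarray(img, dtype=float)
    return float(np.abs(np.diff(img, axis=0)).sum() +
                 np.abs(np.diff(img, axis=1)).sum())

def sobel_metric(img):
    """Sobel-based sharpness: total gradient-magnitude energy."""
    img = np.asarray(img, dtype=float)
    gx, gy = sobel(img, axis=1), sobel(img, axis=0)
    return float(np.sqrt(gx**2 + gy**2).sum())
```

Gradient-based metrics rise with both genuine detail and impulse noise, while the entropy metric responds to the histogram spread, which is one reason the methods behave differently on the two noise models studied here.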
A detailed analysis is carried out of a modal wavefront sensor (MWFS) employing multiple holographic optical elements (MHOEs). The distribution of the diffraction field on the detector plane of the MWFS is presented, and an analytical expression for the intensity distribution of the diffraction field is derived, unifying the treatment with the theory of the mode-biased wavefront sensor. For simplicity in the numerical simulation of the MWFS, we modify the original approach and propose an equivalent model in which two tilted conjugate plane waves replace the tilted convergent spherical waves as the reference waves when recording the holograms, and a converging lens is placed in close proximity to the rear surface of the holographic element. We validate the principle of the MWFS by numerical simulations employing the equivalent model; the simulation results are consistent with the theoretical ones.
We investigate the performance of the mode-biased wavefront sensor (MWFS) in detecting aberrations comprising multiple modes. Two important parameters, sensitivity and dynamic range, are chosen as the criteria for evaluating the performance of the MWFS. We describe the tested wavefront as a superposition of several aberration modes and consider three situations, in which the tested wavefront includes, relative to the biased modes: (a) only identical modes, (b) only relevant aberration modes, or (c) only irrelevant aberration modes. We show that the presence of these three types of aberration modes in the tested wavefront greatly affects the detection performance of the MWFS in terms of sensitivity and dynamic range.
We introduce a modal wavefront sensing technique using a binary computer-generated hologram (BCGH) and a coding approach. Several types of Zernike aberrations are encoded into the BCGH with this method. The light wavefront is modulated by the BCGH and by a single Zernike aberration mode respectively, and the holographic modal wavefront sensor is simulated and verified. The results show that a wavefront distorted by a particular aberration mode, after modulation by the BCGH, is transformed into beams whose relative intensities reflect the trend of the aberration coefficients in the unknown wavefront.