Open Access
Enhancing imaging contrast via weighted feedback for iterative multi-image phase retrieval
Cheng Guo, Qiang Li, Xiaoqing Zhang, JiuBin Tan, Shutian Liu, Zhengjun Liu
Abstract
Iterative phase retrieval (IPR) has developed into a feasible and simple computational method to retrieve a complex-valued sample. Under coherent laser illumination, the reconstructed image quality is degraded by speckle noise. Accordingly, partially coherent illumination has been introduced to alleviate this restriction. We apply a weighted feedback modality to multidistance and multiwavelength phase retrieval to realize high-contrast and fast imaging. In simulation, we show that IPR based on weighted feedback accelerates the convergence under partially coherent illumination and speckle illumination. In experiment, a resolution chart and a biological specimen are reconstructed in lensless and lens-based systems, which also demonstrates the performance of weighted feedback. This work provides a simple and high-contrast imaging modality for IPR. It also facilitates compact and flexible experimental implementations for label-free imaging.

1.

Introduction

As a beam of light passes through a sample, the accumulated phase delay carries much intrinsic information about the sample, such as thickness, refractive index, and composition.1,2 However, current detectors (CCD or CMOS) cannot record the phase directly, which has led to the development of phase retrieval techniques. Technically, the lost phase can be reconstructed by an interferometric system3 or by computational imaging.4,5 Requiring no reference beam, computational imaging methods have been developed with great interest and successfully applied to super resolution,6–8 three-dimensional imaging,9,10 and quantitative phase imaging.11,12 At present, these techniques fall into two types, namely the transport of intensity equation (TIE)13–15 and the iterative phase retrieval (IPR) method.16,17 TIE is a direct numerical phase solver, so it requires no phase unwrapping and is computationally efficient. But TIE merely retrieves a phase-only or amplitude-only object, which is not suitable for imaging a complex-valued sample. In contrast, the IPR method can reconstruct complex-valued images of different samples.18–20

As the origin of the field, the Gerchberg–Saxton algorithm16 reconstructed the object's phase by computationally propagating back and forth between real and reciprocal space and imposing the constraints from a pair of amplitude distributions. The hybrid input-output algorithm17 replaced the requirement of a known amplitude distribution in real space with a loose support and introduced feedback to escape stagnation. But both algorithms are sensitive to the initial guess and require a rough estimate of the object for a better reconstruction. Alternatively, multi-image phase retrieval achieves high-accuracy image reconstruction by means of measurement diversity. Without prior knowledge, the ptychographic iterative engine (PIE) algorithm21–23 retrieves a complex-valued object from a series of diffraction patterns obtained by overlapped pinhole scanning across the sample. Apart from the lateral scanning strategy, many other ways of introducing degrees of freedom into the imaging system have emerged, including multidistance,24–27 multiwavelength,28,29 multibeam illumination,30 and spatial light modulation.31 Unlike the PIE method, multidistance phase retrieval (MDPR)25 records diffraction patterns at different axial positions and iteratively computes the complex amplitude of the object; its stability and robustness have been demonstrated.26,27 Multiwavelength phase retrieval (MWPR)28 has a similar performance but utilizes multiwavelength illumination. By avoiding lateral shifts along the x- and y-directions, these two methods reduce the complexity of the experiment.

However, the imaging quality of these methods is severely hampered by speckle noise from the coherent light source. To alleviate the speckle noise, partially coherent illumination32,33 has been adopted, which effectively reduces speckle while maintaining sufficient coherence according to the van Cittert–Zernike theorem. Zheng et al.6 mounted a programmable light-emitting diode (LED) array on a conventional wide-field microscope for multiangle illumination and retrieved the quantitative complex field distribution of the sample. Similarly, Tian et al.,34 Chen et al.,35 and Lee et al.36 utilized patterned LED illumination for bright-field, dark-field, and phase-contrast imaging. Partially coherent illumination has thus become a popular and feasible strategy for high-resolution imaging with low-cost hardware. In our previous work,37 weighted feedback was proposed to accelerate the convergence of MDPR under the coherent illumination of a fiber laser. However, the imaging contrast was heavily obstructed by speckle noise, and translucent samples could not be reconstructed well in that case.

In this work, we show that IPR based on the weighted feedback acceleration modality can easily realize high-contrast and fast-converging reconstruction of different samples under partially coherent and speckle illumination, which is demonstrated by simulation and experiment in both lensless and lens-based systems. For the lensless system, a programmable LED array is used for partially coherent illumination, and the imaging quality of MDPR is demonstrated to be better than that under fiber laser illumination. By embedding weighted feedback, the imaging contrast of a biological specimen is enhanced for both the MDPR and MWPR methods. To further exhibit its performance, we apply MDPR to noninvasive imaging through a scattering layer, in which the convergence speed is significantly improved. For the lens-based system, MDPR is utilized to image the phase of a translucent sample in a conventional microscope, where the weighted feedback also takes effect.

The rest of this article is arranged as follows. The theory of IPR and its weighted feedback modality are described in Sec. 2. The corresponding simulation and experiment results are given in Secs. 3 and 4, respectively. Conclusions are presented in Sec. 5.

2.

Theory

In MDPR, a set of diffraction patterns recorded downstream of the object plane repeatedly constrains the object estimate until the full complex field of the object is obtained. Here, the amplitude-phase retrieval algorithm25 is adopted to implement MDPR, and its schematic diagram is shown in Fig. 1.

Fig. 1

MDPR based on weighted feedback: (a) the diagram and (b) the algorithmic flowchart.


As shown in Fig. 1(a), diffraction patterns $I_n$ ($n \in [1,N]$) are measured downstream of the sample. The axial distance $Z_n = Z_0 + (n-1)d$, $n \in [1,N]$, is composed of two components: the initial distance $Z_0$ and the equal interval $d$. In Fig. 1(b), the complex amplitude of the sample is initialized with a zero matrix. The algorithmic flow of MDPR is as follows: (1) the $k$th estimate of the object's complex field, $G^k$, is propagated forward to the recording planes, generating $N$ computed patterns at the different axial distances $Z_n$; (2) the amplitude of each computed pattern is replaced with the modulus of the corresponding recorded diffraction pattern while the computed phase is retained; (3) these synthesized patterns are propagated backward to the object plane, producing $N$ guesses of the object, $g_n^k$ ($n \in [1,N]$); (4) the $(k+1)$th estimate of the object, $G^{k+1}$, is obtained by averaging the $N$ guesses $g_n^k$, where the averaging is performed separately for amplitude and phase; and (5) steps (1) to (4) are run iteratively until the reconstruction accuracy meets the required threshold. The weighted feedback operation is imposed between steps (3) and (4) as

Eq. (1)

$$\tilde{g}_n^k = (1+a+b)\,g_n^k - a\,\tilde{g}_n^{k-1} - b\,\tilde{g}_n^{k-2},$$
where $\tilde{g}_n^k$ denotes the $k$th modulated object guess. The symbols $a$ and $b$ are the feedback coefficients, which are set to 0.7 and 0.5 in Ref. 37, respectively. Within this definition, the $(k+1)$th estimate $G^{k+1}$ is calculated as the average of the $N$ modulated object guesses $\tilde{g}_n^k$. For iteration number $k=1$ or 2, we let $\tilde{g}_n^k = g_n^k$, which means that the weighted feedback starts when $k>2$. Here, MDPR based on weighted feedback is termed the MDPRF algorithm in the following.
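To make the flow above concrete, the following is a minimal numpy sketch of the MDPRF loop under stated assumptions: square arrays, a simple band-limited angular spectrum propagator in the spirit of Eq. (5), and the default coefficients a = 0.7 and b = 0.5. The function and variable names are illustrative only and are not the authors' code.

```python
import numpy as np

def angular_spectrum(field, z, wavelength, dx):
    """Band-limited angular spectrum propagation over a distance z [cf. Eq. (5)]."""
    n = field.shape[0]
    fx = np.fft.fftfreq(n, d=dx)                          # spatial frequencies
    FX, FY = np.meshgrid(fx, fx)
    arg = 1.0 - (wavelength * FX) ** 2 - (wavelength * FY) ** 2
    H = np.where(arg > 0,
                 np.exp(2j * np.pi * z / wavelength * np.sqrt(np.maximum(arg, 0.0))),
                 0.0)                                     # evanescent components set to zero
    return np.fft.ifft2(np.fft.fft2(field) * H)

def mdprf(patterns, distances, wavelength, dx, a=0.7, b=0.5, iters=50):
    """Multidistance phase retrieval with weighted feedback [Eq. (1)].

    patterns  -- recorded intensity images I_n
    distances -- corresponding axial distances Z_n
    """
    G = np.zeros_like(patterns[0], dtype=complex)          # zero-matrix initialization
    prev1 = [None] * len(patterns)                         # stores g~_n^{k-1}
    prev2 = [None] * len(patterns)                         # stores g~_n^{k-2}
    for _ in range(iters):
        guesses = []
        for n, (I, z) in enumerate(zip(patterns, distances)):
            U = angular_spectrum(G, z, wavelength, dx)             # step (1): forward
            U = np.sqrt(I) * np.exp(1j * np.angle(U))              # step (2): amplitude replacement
            g = angular_spectrum(U, -z, wavelength, dx)            # step (3): backward
            if prev1[n] is not None and prev2[n] is not None:      # feedback only when k > 2
                g = (1 + a + b) * g - a * prev1[n] - b * prev2[n]  # weighted feedback, Eq. (1)
            prev2[n], prev1[n] = prev1[n], g
            guesses.append(g)
        amp = np.mean([np.abs(g) for g in guesses], axis=0)        # step (4): average amplitude
        phase = np.mean([np.angle(g) for g in guesses], axis=0)    #          and phase separately
        G = amp * np.exp(1j * phase)
    return G
```

In use, `patterns` would hold the $N$ recorded intensities and `distances` the calibrated values $Z_n$; the returned array is the complex object estimate whose amplitude and phase are examined in the following sections.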

The method in Ref. 28 is utilized as the MWPR algorithm for testing, and its weighted version is named the MWPRF algorithm for short. Here, the MWPR algorithm uses multiple patterns recorded under different wavelengths ($\lambda_1$, $\lambda_2$, $\lambda_3$) for image reconstruction, as defined in Fig. 2(a). The algorithmic details of the MWPRF algorithm, shown in Fig. 2(b), are as follows: (1) the complex field of the object is initialized with a zero matrix; (2) with $\lambda_1$, the $k$th estimate of the object, $G^k$, is propagated forward to the recording plane over a distance $Z_0$; (3) the computed amplitude at the recording plane is replaced with the modulus of the recorded diffraction pattern while the computed phase is retained; (4) this synthesized complex amplitude is propagated backward to the object plane; (5) the next wavelength is used until all wavelengths have been scanned, yielding the $(k+1)$th estimate of the object, $G^{k+1}$; and (6) steps (2) to (5) are run iteratively until the reconstruction accuracy meets the given requirement. The corresponding weighted feedback operation is embedded between steps (5) and (6) and is expressed as

Eq. (2)

$$G^{k+1} = (1+a+b)\,G^{k+1} - a\,G^{k} - b\,G^{k-1},$$
where the coefficients $a$ and $b$ are set to 0.7 and 0.5, respectively, and the $G^{k+1}$ on the right-hand side is the estimate produced by step (5), which is then replaced by the feedback-modulated value. For MWPRF, the first ($k=1$) and second ($k=2$) estimates are the same as those of MWPR; only when $k>2$ does the weighted feedback operation take effect.
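For comparison, a corresponding sketch of the MWPRF loop is given below; it reuses the `angular_spectrum` helper from the MDPRF sketch above. Treating Eq. (2) as an in-place replacement of $G^{k+1}$ after each wavelength cycle is our reading of the text, and the names are again illustrative rather than the authors' code.

```python
def mwprf(patterns, wavelengths, z0, dx, a=0.7, b=0.5, iters=100):
    """Multiwavelength phase retrieval with weighted feedback [Eq. (2)].

    patterns    -- intensity images recorded at lambda_1 ... lambda_L
    wavelengths -- the corresponding illumination wavelengths
    z0          -- the single object-to-sensor distance Z_0
    """
    G = np.zeros_like(patterns[0], dtype=complex)          # step (1): zero-matrix initialization
    G_prev1, G_prev2 = None, None                          # previous estimates G^k and G^{k-1}
    for _ in range(iters):
        for I, lam in zip(patterns, wavelengths):          # steps (2)-(5): cycle the wavelengths
            U = angular_spectrum(G, z0, lam, dx)
            U = np.sqrt(I) * np.exp(1j * np.angle(U))      # amplitude replacement
            G = angular_spectrum(U, -z0, lam, dx)
        if G_prev1 is not None and G_prev2 is not None:    # feedback only when k > 2
            G = (1 + a + b) * G - a * G_prev1 - b * G_prev2   # weighted feedback, Eq. (2)
        G_prev2, G_prev1 = G_prev1, G
    return G
```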

Fig. 2

MWPR based on weighted feedback: (a) the diagram and (b) the algorithmic flowchart.


3.

Simulation

3.1.

Partially Coherent Illumination

In this section, numerical simulations are presented to prove the capability of the proposed idea. To quantitatively evaluate the reconstruction accuracy, we utilize the normalized correlation coefficient (NCC) between the reconstructed image $f(x,y)$ and the ground truth $I_0(x,y)$ as the metric function, defined as

Eq. (3)

$$C_{f,I_0} = \frac{1}{M_0 N_0}\sum_{x=1}^{M_0}\sum_{y=1}^{N_0}\left[f(x,y)-\bar{f}(x,y)\right]\left[I_0(x,y)-\bar{I}_0(x,y)\right],$$

Eq. (4)

$$\mathrm{NCC} = \frac{C_{f,I_0}}{\sqrt{C_{I_0,I_0}\,C_{f,f}}},$$
where $C_{f,I_0}$ is the covariance of the reconstructed image and the ground truth, an indicator of how well the two images match each other. Here, $M_0 \times N_0$ denotes the total number of pixels of the object image. The value of the NCC lies in [0, 1]; as the NCC increases, the information in the two images becomes closer.
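A minimal numpy implementation of this metric, assuming the reconstructed amplitude is compared against the intensity ground truth, could read:

```python
import numpy as np

def ncc(f, I0):
    """Normalized correlation coefficient of Eqs. (3) and (4)."""
    f = np.abs(f).astype(float)                             # reconstructed amplitude
    I0 = np.asarray(I0, dtype=float)                        # ground-truth image
    c_f_i0 = np.mean((f - f.mean()) * (I0 - I0.mean()))     # Eq. (3): covariance
    c_i0_i0 = np.mean((I0 - I0.mean()) ** 2)
    c_f_f = np.mean((f - f.mean()) ** 2)
    return c_f_i0 / np.sqrt(c_i0_i0 * c_f_f)                # Eq. (4)
```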

The kernel of IPR is the propagation computation. In this paper, the diffraction computation is defined in the Fresnel regime. Thus, all propagations are computed by the angular spectrum formula as

Eq. (5)

$$H_F(\xi,\eta)=\begin{cases}\exp\!\left[\dfrac{2\pi j Z_n}{\lambda}\sqrt{1-(\lambda\xi)^2-(\lambda\eta)^2}\right], & (\lambda\xi)^2+(\lambda\eta)^2<1,\\ 0, & \text{otherwise},\end{cases}$$
where $\lambda$ is the illumination wavelength and $(\xi,\eta)$ denotes the coordinates in the frequency domain. In this case, the diffraction pattern at any axial distance $Z_n$ can be obtained by $\mathcal{F}^{-1}[\mathcal{F}(G^k)\,H_F]$, where $\mathcal{F}$ and $\mathcal{F}^{-1}$ represent the Fourier transform and its inverse, respectively. To simulate the partial coherence of the diffraction recording, the object plane is perturbed around its original position by 100 random plates to generate 100 object planes. These 100 object planes are propagated over the axial distance $Z_n$ so that 100 diffraction patterns are produced, and the desired modulus of the partially coherent diffraction pattern at $Z_n$ is calculated by averaging the moduli of the 100 computed patterns. By repeating this procedure, a set of multidistance or multiwavelength intensity images under partially coherent illumination is obtained. This partial-coherence calculation is explicitly described in Ref. 38.
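A minimal sketch of this partial-coherence recipe is given below, reusing the `angular_spectrum` helper from the Sec. 2 sketch. The strength and statistics of the random plates are assumptions chosen only for illustration; Ref. 38 describes the full procedure.

```python
def partially_coherent_pattern(obj, z, wavelength, dx, n_real=100, sigma=0.5, seed=None):
    """Average the moduli of diffraction patterns from randomly perturbed object
    planes to emulate a partially coherent recording at distance z."""
    rng = np.random.default_rng(seed)
    acc = np.zeros(obj.shape)
    for _ in range(n_real):
        plate = np.exp(1j * sigma * rng.standard_normal(obj.shape))    # random phase plate
        acc += np.abs(angular_spectrum(obj * plate, z, wavelength, dx))
    return acc / n_real                                                 # desired modulus at Z_n
```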

Here, the image “cameraman” is chosen as the ground truth, and the simulation parameters are as follows: (1) the imaging size is 3×3 mm² (256×256 pixels); (2) λ=532 nm; (3) Z0=20 mm, d=1 mm; and (4) the recording number N is 3, 8, 11, and 13. One of the 13 recorded diffraction patterns is shown in Fig. 3(a), and the corresponding convergence curves are shown in Fig. 3(b). The improvement in convergence speed by MDPRF is easy to observe. For recording numbers above 11, increasing the recording number has no predominant impact on the convergence, so in this case the convergence depends only on the number of iterations. To visually compare MDPR and MDPRF, the reconstructed images using 11 diffraction patterns after 10, 30, and 50 iterations are shown in Figs. 3(c)–3(e) for MDPR and Figs. 3(f)–3(h) for MDPRF. The degradation caused by slow convergence appears at the edges of the man at lower iteration numbers and wraps all around “the man” as a vague halo in the MDPR results; this degradation is visibly ironed out by MDPRF. Also, the MDPRF algorithm converges within 50 iterations, which is superior to the MDPR results.

Fig. 3

The retrieval of MDPR and MDPRF with Z0=20  mm and d=1  mm under partially coherent illumination: (a) recorded diffraction pattern in Z1=20  mm, (b) the convergence curve of MDPR and MDPRF with 3, 8, 11, and 13 diffraction patterns after 100 iterations, and (c)–(e) and (f)–(h) are reconstructed images by MDPR and MDPRF with 11 diffraction patterns, respectively.


Using the same partial-coherence computation, the reconstructed images of the MWPR and MWPRF algorithms are shown in Figs. 4(a)–4(d), and the convergence curves are shown in Fig. 4(e). The simulation parameters are as follows: (1) the imaging size is 3×3 mm² (256×256 pixels); (2) λ1=467 nm, λ2=532 nm, λ3=623 nm; and (3) Z0=20 mm. The convergence performance is clearly improved by weighted feedback in Fig. 4. The NCC curves in Figs. 3(b) and 4(e) prove that the problems of degradation and stagnation are effectively resolved by weighted feedback under partially coherent illumination.

Fig. 4

The retrieval of MWPR and MWPRF with Z0=20  mm under partially coherent illumination: (a)–(d) are reconstructed images by MWPR and MWPRF with the wavelength of 467, 532, and 623 nm after 10 and 100 iterations and (e) the convergence curve of MWPR and MWPRF.


3.2.

Speckle Illumination

Measuring intensity patterns in the volume speckle field can lead to a unique and accurate image reconstruction of the object.39–41 To test the feasibility of our method under speckle illumination, we place the object in the speckle field and retrieve it using multiple intensity patterns. Here, the object in Fig. 5(a) is selected as the ground truth, and the illuminating speckle pattern is shown in Fig. 5(b). The speckle pattern illuminating the object results from a phase mask located in front of the object. The simulation parameters are: (1) the image size is 1.24×1.24 mm² (400×400 pixels); (2) λ=532 nm; (3) the receiving plane is placed behind the object (Z0=20 mm, d=1 mm); (4) the recording number N is set to 5, 8, 11, and 13; and (5) the phase mask, with a phase range of 0 to 1.5π, is located 30 mm upstream of the object. A minimal sketch of this illumination geometry is given after this paragraph.
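The sketch below again reuses the `angular_spectrum` helper from Sec. 2; the uniform phase statistics of the mask and the example pixel pitch are assumptions for illustration, not the authors' simulation code.

```python
def speckle_illumination(shape, mask_to_obj, wavelength, dx, phase_range=1.5 * np.pi, seed=None):
    """Speckle field at the object plane produced by a random phase mask
    located mask_to_obj upstream of the object (Sec. 3.2 geometry)."""
    rng = np.random.default_rng(seed)
    mask = np.exp(1j * phase_range * rng.random(shape))    # phase mask, values in [0, 1.5*pi)
    return angular_spectrum(mask, mask_to_obj, wavelength, dx)

# Example with the parameters listed above (400 x 400 grid, 3.1-um pitch assumed):
# illum = speckle_illumination((400, 400), 30e-3, 532e-9, 3.1e-6)
# exit_wave = illum * object_transmittance   # field that propagates to the recording planes
```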

Fig. 5

The retrieval of MDPR and MDPRF with Z0=20  mm and d=1  mm under speckle illumination: (a) the object image, (b) the speckle pattern to illuminate object, (c) the convergence curve of MDPR and MDPRF with 5, 8, 11, and 13 diffraction patterns, and (d)–(g) and (h)–(k) are reconstructed images by MDPR and MDPRF with 11 diffraction patterns after 20, 50, 80, and 100 iterations, respectively.


The convergence curves are shown in Fig. 5(c), and the reconstructed images with 11 intensity patterns are given in Figs. 5(d)–5(g) for MDPR and Figs. 5(h)–5(k) for MDPRF. Notably, even in the case of a speckle field, weighted feedback still accelerates the convergence. Similar to partially coherent illumination, increasing the recording number beyond 11 brings no further acceleration. As shown in Figs. 5(d)–5(k), MDPRF retrieves the full object in only 50 iterations, roughly a twofold improvement in convergence speed.

4.

Experiment

4.1.

Lensless Multidistance Imaging

To prove the capability of our method, we set up experiments in both lensless and lens-based systems. For lensless imaging, a programmable LED matrix (Adafruit 607) is used to realize partially coherent illumination. Unlike the regional illumination schemes in Refs. 34–36, only one LED is switched on in the experiment so as to prevent the diffraction patterns from aliasing. A spherical wave from the LED is incident on a condenser lens to generate plane-wave illumination. This parallel light, shaped by an aperture, illuminates the sample so that the CCD camera (3.1-μm pixel size, Point Gray) records a diffraction pattern downstream of the sample. For MDPR, the CCD camera is mounted on a precision linear stage (M-403, Physik Instrumente Inc.). As the stage moves, a set of diffraction patterns is recorded at different axial diffraction distances (initial distance Z0 and equally spaced interval d). Choosing a proper axial distance, MWPR is realized by changing the wavelength of the LED. The details of the experimental setup are shown in Fig. 6. This lensless implementation can accomplish the partially coherent illumination experiments for MDPR and MWPR simultaneously.

Fig. 6

The experimental schematic of lensless imaging under partially coherent illumination.


To verify the performance of partially coherent illumination, we perform MDPR on a Negative Resolution Chart (R2L2S1N, Thorlabs) with the LED and with a fiber laser. The corresponding results are shown in Fig. 7. The experimental parameters are as follows: (1) the imaging size is 1800×1800 pixels; (2) N=11, Z0=29 mm, d=1 mm; and (3) the wavelength of the fiber laser is 532 nm and that of the LED is 623 nm. After 100 iterations for the fiber laser and 50 iterations for the LED, the reconstructed images are shown in Figs. 7(a)–7(c), which indicate that the partial coherence of the incident light markedly suppresses the effect of speckle noise. To quantify this improvement, the plotlines along the blue dashed lines in Figs. 7(a)–7(c) are drawn in Figs. 7(d) and 7(e). Note that the vertical fringe pattern is clearly resolved with high imaging contrast when weighted feedback is applied.

Fig. 7

The comparison of reconstructed images under coherent and partially coherent illumination: (a) fiber laser illumination (532 nm), (b) LED illumination (623 nm), (c) LED illumination with the weighted feedback operation, (d) the plotline along the dashed line in (a), and (e) the plotlines along the dashed lines in (b) and (c). The white bars in (a)–(c) correspond to 150 μm.


Similarly, we image an “orchid root” specimen (NSS Ltd.) with 623-nm LED illumination, and its retrieved complex amplitudes are displayed in Fig. 8. MDPR is run for 50 and 500 iterations to generate the reconstructed amplitudes [Figs. 8(a) and 8(b)] and phases [Figs. 8(d) and 8(e)]. By plugging in weighted feedback and running 50 iterations, the amplitude and phase from MDPRF are obtained in Figs. 8(c) and 8(f), respectively. Comparing Figs. 8(a) and 8(c), the vague shape around the specimen is removed by MDPRF, which accords with the simulation analysis. The profiles along the blue, red, and black arrows in Figs. 8(a)–8(c) are plotted in Fig. 8(g). The plotline of MDPRF after 50 iterations is close to that of MDPR after 500 iterations, which implies that weighted feedback indeed speeds up the convergence. For phase reconstruction, it is worth noting that the contrast of the reconstructed phase is enhanced by MDPRF, which makes the biological tissue more distinct against the background noise.

Fig. 8

The reconstruction of orchid root by MDPR and MDPRF with the LED’s wavelength of 623 nm: (a), (b) and (d), (e) are retrieved amplitudes and phases after 50 and 500 iterations for MDPR, respectively, (c) and (f) are reconstructed by MDPRF after 50 iterations, and (g) plotlines along the blue, red, and black arrows in (a)–(c), respectively. The black bars in (a)–(c) correspond to 600 μm.


4.2.

Lensless Multiwavelength Imaging

The advantage of the LED matrix is that the wavelength can be switched programmatically without any extra mechanical devices. The experimental implementation of MWPR is the same as in Fig. 6. Choosing a proper axial distance Z0, three diffraction patterns are measured by sequentially switching the red, green, and blue channels of the LED (623, 532, and 467 nm). For Z0=29 mm, the reconstructed results of the orchid root (NSS Ltd.) by MWPR and MWPRF are presented in Fig. 9.

Fig. 9

The reconstructed phases of orchid root by MWPR and MWPRF with the LED’s wavelength of 623 nm: (a) peak position distribution of cross correlation, (b) and (d) are wrapped retrieved phases for MWPR and MWPRF, respectively, and (c) and (e) are unwrapped ones. The black bars in (b) and (d) correspond to 600  μm.


Technically, the red, green, and blue LED channels are fabricated side by side in the Adafruit 607, which leads to a relative shift between the multiwavelength diffraction patterns. The relative position distribution is derived from a cross-correlation operation.27 The detailed process is as follows: (1) the peak position of the self-correlation of the diffraction pattern related to the blue LED is chosen as the start position and (2) the cross correlations between the start image and the others are computed in turn to determine the relative shifts (a minimal registration sketch is given after this paragraph). The resulting relative position distribution of the three patterns is shown in Fig. 9(a). To accomplish MWPR, we align these patterns to the start position and cut out the irrelevant parts. After this preparation, the reconstructed images of MWPR and MWPRF are shown in Figs. 9(b)–9(e). The retrieved phases in Figs. 9(b) and 9(d) suffer from phase wrapping. Here, the DCT least-squares algorithm42,43 is applied for phase unwrapping, and the corresponding unwrapped phases are presented in Figs. 9(c) and 9(e). Note that the contrast of the retrieved phase is strengthened by weighted feedback. However, the imaging quality of the multiwavelength strategy is not as good as that in Fig. 8. This discrepancy is mainly attributed to the uncertainty of the central wavelength. We propose two solutions to this problem. The direct solution is to introduce narrow-bandwidth filters behind the condenser lens to cut out the uncertain spectral components. But inserting a filter decreases the radiance reaching the CCD camera, which will heavily impair the quality of the reconstructed image; it is accordingly necessary to combine this weighted feedback modality with a noise compression method. The other solution is to replace the present LED with a high-power one, which easily removes the obstruction of a low signal-to-noise ratio. But the reduced coherence of such a light source undermines the distance-based angular spectrum propagation; it would then be workable to place the recording plane in the far-field regime for IPR. We believe that this challenge will be overcome in the future.
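A minimal numpy sketch of this registration step is given below (pixel-level only; subpixel refinement and the exact correlation normalization used in Ref. 27 are not reproduced here).

```python
import numpy as np

def relative_shift(ref, img):
    """Pixel-level shift of img relative to ref from the peak of their
    FFT-based cross correlation."""
    xcorr = np.fft.ifft2(np.fft.fft2(ref) * np.conj(np.fft.fft2(img)))
    peak = np.array(np.unravel_index(np.argmax(np.abs(xcorr)), xcorr.shape))
    size = np.array(ref.shape)
    wrap = peak > size // 2
    peak[wrap] -= size[wrap]                 # wrap large positive shifts to negative ones
    return tuple(int(p) for p in peak)

# The shifted patterns would then be aligned to the start (blue-LED) pattern,
# e.g., with np.roll, and cropped to the common region before running MWPR/MWPRF.
```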

4.3.

Lensless Imaging Through Scatter Layer

Optical imaging through a scattering medium holds great promise for biomedical engineering, since biological tissue diffuses any incident beam into a speckle pattern, limiting both resolution and penetration.41 IPR, as a useful tool, can recover a target hidden behind the scattering medium. Here, we apply our weighted feedback acceleration in this situation and compare MDPRF with MDPR. The experimental diagram is shown in Fig. 10(a).

Fig. 10

The image reconstruction through scatter layer by MDPR and MDPRF: (a) the experimental schematic, (b)–(e) and (f)–(i) are reconstructed by MDPR and MDPRF algorithm after 10, 25, 50, and 100 iterations, respectively. The white bars in (b)–(i) correspond to 600  μm.


A fiber laser with a wavelength of 532 nm provides the illumination. A ground glass diffuser (GGD, Thorlabs, 120 grit) is chosen as the scattering medium and placed between the sample and the CCD camera. The sample is the number “5” of a Negative 1951 USAF Target (R3L3S1N, Thorlabs). The experimental parameters are as follows: (1) the imaging size is 1700×1700 pixels; (2) the distance ZSG from the sample to the GGD is 40 mm; and (3) N=11, Z0=27 mm, and d=1 mm. The corresponding retrieved images are shown in Figs. 10(b)–10(e) for MDPR and Figs. 10(f)–10(i) for MDPRF after 10, 25, 50, and 100 iterations. Weighted feedback still works well in the speckle field: in only 25 iterations, MDPRF is capable of retrieving the structure of the target, whereas MDPR needs 100 iterations or more. This result is consistent with the simulation analysis in Sec. 3.

4.4.

Lens-Based Imaging

At present, the resolution of lensless imaging is limited by the finite pixel size of the imaging sensor. To observe the performance of our method on small features, we apply weighted feedback in microscopy (magnification 20×, NA=0.5) and use translucent “human cheek cells” as the sample. Here, the MDPR algorithm processes a set of defocused intensity images to reconstruct the phase of the in-focus image. The corresponding datasets are from Laura Waller’s team.44

The incident light is filtered from white light to a wavelength of 650 nm (10-nm bandwidth). The number of recorded images is 129 (one focused image and 128 defocused images, 1024×1024 pixels). The defocus range is [−256, 256 μm] with an interval of 4 μm. The focused intensity image is used as the initialization for MDPR. Choosing a set of defocused intensity images in front of and behind the focal plane, the reconstructed phases are obtained by iterative back-and-forth propagation and amplitude replacement. The corresponding results are shown in Fig. 11. With five recorded images (propagation distances: −8, −4, 0, 4, and 8 μm), the retrieved phases after 10, 100, and 1000 iterations are shown in Figs. 11(a)–11(f). The accelerated convergence brought by the weighted feedback operation is easy to discern: at the same number of iterations, the imaging quality of MDPRF is superior to that of MDPR. With 129 recorded images, the results of the two methods after 10 iterations are given in Figs. 11(g) and 11(h). The cells are successfully reconstructed by both methods, and the imaging contrast of MDPRF is higher than that of MDPR. To further exhibit this improvement, the plotlines of Figs. 11(g) and 11(h) are shown in Fig. 11(i), which indicates that the cell edges are rendered sharper. For a small number of recorded images and short intervals, weighted feedback ensures a high-accuracy reconstruction for MDPR.

Fig. 11

The phase reconstruction of translucent human cheek cells by MDPR and MDPRF under partially coherent illumination: (a)–(c) and (d)–(f) are obtained by five recorded images after 10, 100, and 1000 iterations, (g) and (h) are done by 129 recorded images after 10 iterations, and (i) plotlines along red and blue arrows in (g) and (h). The black bars in (a) and (d) correspond to 30  μm.


5.

Conclusion

We have extended the weighted feedback operation to partially coherent and speckle illumination. The MDPR and MWPR algorithms are modified into their weighted versions, the MDPRF and MWPRF algorithms. In simulation, it is proved that these modified methods speed up the convergence in partially coherent and speckle fields. In experiment, a programmable LED matrix is used to form lensless multidistance and multiwavelength imaging systems. Compared with conventional fiber laser illumination, the partial coherence of the light source indeed improves the imaging quality. Using weighted feedback to retrieve the resolution chart and the orchid root, the imaging contrast and convergence speed are greatly enhanced for both lensless imaging strategies. Furthermore, our method also functions well in optical imaging through a scattering medium. Similarly, the MDPR algorithm and its weighted version, the MDPRF algorithm, are applied in microscopy to image translucent human cheek cells, which demonstrates that weighted feedback not only enhances the convergence speed but also strengthens the phase contrast for a translucent sample.

This work provides an effective strategy for high-contrast imaging with the IPR method. Moreover, owing to its fast and accurate convergence, weighted feedback can greatly reduce the number of measurements, enabling low-cost and compact experimental setups for label-free biological imaging.

Disclosures

No conflicts of interest, financial or otherwise, are declared by the authors.

Acknowledgments

This work was supported by the National Natural Science Foundation of China (Nos. 61377016, 61575055, and 61575053), the Fundamental Research Funds for the Central Universities (No. HIT.BRETIII.201406), the Program for New Century Excellent Talents in University (No. NCET-12-0148), the China Postdoctoral Science Foundation (Nos. 2013M540278 and 2015T80340), and the Scientific Research Foundation for the Returned Overseas Chinese Scholars, State Education Ministry, China. The authors thank Mr. Cheng Shen for polishing the English.

References

1. B. Bhaduri et al., “Diffraction phase microscopy: principles and applications in materials and life sciences,” Adv. Opt. Photonics 6, 57–119 (2014). http://dx.doi.org/10.1364/AOP.6.000057
2. H. Majeed et al., “Quantitative phase imaging for medical diagnosis,” J. Biophotonics 10(2), 177–205 (2017). http://dx.doi.org/10.1002/jbio.201600113
3. W. Osten et al., “Recent advances in digital holography,” Appl. Opt. 53(27), G44–G63 (2014). http://dx.doi.org/10.1364/AO.53.000G44
4. Y. Shechtman et al., “Phase retrieval with application to optical imaging: a contemporary overview,” IEEE Signal Process. Mag. 32, 87–109 (2015). http://dx.doi.org/10.1109/MSP.2014.2352673
5. E. McLeod and A. Ozcan, “Unconventional methods of imaging: computational microscopy and compact implementations,” Rep. Prog. Phys. 79, 076001 (2016). http://dx.doi.org/10.1088/0034-4885/79/7/076001
6. G. Zheng, R. Horstmeyer, and C. Yang, “Wide-field, high-resolution Fourier ptychographic microscopy,” Nat. Photonics 7(9), 739–745 (2013). http://dx.doi.org/10.1038/nphoton.2013.187
7. S. Pacheco, G. Zheng, and R. Liang, “Reflective Fourier ptychography,” J. Biomed. Opt. 21(2), 026010 (2016). http://dx.doi.org/10.1117/1.JBO.21.2.026010
8. J. Sun et al., “Resolution-enhanced Fourier ptychographic microscopy based on high-numerical-aperture illuminations,” Sci. Rep. 7, 1187–1197 (2017). http://dx.doi.org/10.1038/s41598-017-01346-7
9. R. Horstmeyer et al., “Diffraction tomography with Fourier ptychography,” Optica 3(8), 827–835 (2016). http://dx.doi.org/10.1364/OPTICA.3.000827
10. L. Tian and L. Waller, “3D intensity and phase imaging from light field measurements in an LED array microscope,” Optica 2(2), 104–111 (2015). http://dx.doi.org/10.1364/OPTICA.2.000104
11. Y. Yao et al., “Ptychographic phase microscope based on high-speed modulation on the illumination beam,” J. Biomed. Opt. 22(3), 036010 (2017). http://dx.doi.org/10.1117/1.JBO.22.3.036010
12. A. Anand, V. Chhaniwal, and B. Javidi, “Quantitative cell imaging using single beam phase retrieval method,” J. Biomed. Opt. 16(6), 060503 (2011). http://dx.doi.org/10.1117/1.3589090
13. M. R. Teague, “Deterministic phase retrieval: a Green’s function solution,” J. Opt. Soc. Am. 73, 1434–1441 (1983). http://dx.doi.org/10.1364/JOSA.73.001434
14. L. Waller, L. Tian, and G. Barbastathis, “Transport of intensity phase-amplitude imaging with higher order intensity derivatives,” Opt. Express 18(12), 12552–12561 (2010). http://dx.doi.org/10.1364/OE.18.012552
15. C. Zuo et al., “High-resolution transport-of-intensity quantitative phase microscopy with annular illumination,” Sci. Rep. 7, 7654 (2017). http://dx.doi.org/10.1038/s41598-017-06837-1
16. R. W. Gerchberg and W. O. Saxton, “A practical algorithm for the determination of phase from image and diffraction plane pictures,” Optik 35, 237–246 (1972).
17. J. R. Fienup, “Phase retrieval algorithms: a comparison,” Appl. Opt. 21(15), 2758–2769 (1982). http://dx.doi.org/10.1364/AO.21.002758
18. A. Greenbaum et al., “Wide-field computational imaging of pathology slides using lens-free on-chip microscopy,” Sci. Transl. Med. 6, 267ra175 (2014). http://dx.doi.org/10.1126/scitranslmed.3009850
19. S. Dong et al., “High-resolution fluorescence imaging via pattern-illuminated Fourier ptychography,” Opt. Express 22(17), 20856–20870 (2014). http://dx.doi.org/10.1364/OE.22.020856
20. T. M. Godden et al., “Phase calibration target for quantitative phase imaging with ptychography,” Opt. Express 24(7), 7679–7692 (2016). http://dx.doi.org/10.1364/OE.24.007679
21. J. M. Rodenburg and H. M. L. Faulkner, “A phase retrieval algorithm for shifting illumination,” Appl. Phys. Lett. 85(20), 4795–4797 (2004). http://dx.doi.org/10.1063/1.1823034
22. A. M. Maiden and J. M. Rodenburg, “An improved ptychographical phase retrieval algorithm for diffractive imaging,” Ultramicroscopy 109(10), 1256–1262 (2009). http://dx.doi.org/10.1016/j.ultramic.2009.05.012
23. W. Yu et al., “High-quality image reconstruction method for ptychography with partially coherent illumination,” Phys. Rev. B 93(24), 241105 (2016). http://dx.doi.org/10.1103/PhysRevB.93.241105
24. G. Pedrini, W. Osten, and Y. Zhang, “Wave-front reconstruction from a sequence of interferograms recorded at different planes,” Opt. Lett. 30(8), 833–835 (2005). http://dx.doi.org/10.1364/OL.30.000833
25. Z. Liu et al., “Iterative phase amplitude retrieval from multiple images in gyrator domains,” J. Opt. 17(2), 025701 (2015). http://dx.doi.org/10.1088/2040-8978/17/2/025701
26. C. Shen et al., “Two noise-robust axial scanning multi-image phase retrieval algorithms based on Pauta criterion and smoothness constraint,” Opt. Express 25(14), 16235–16249 (2017). http://dx.doi.org/10.1364/OE.25.016235
27. C. Guo et al., “Axial multi-image phase retrieval under tilt illumination,” Sci. Rep. 7, 7562 (2017). http://dx.doi.org/10.1038/s41598-017-08045-3
28. P. Bao et al., “Phase retrieval using multiple illumination wavelengths,” Opt. Lett. 33(4), 309–311 (2008). http://dx.doi.org/10.1364/OL.33.000309
29. D. W. E. Noom, K. S. E. Eikema, and S. Witte, “Lensless phase contrast microscopy based on multiwavelength Fresnel diffraction,” Opt. Lett. 39(2), 193–196 (2014). http://dx.doi.org/10.1364/OL.39.000193
30. X. Pan, C. Liu, and J. Zhu, “Single shot ptychographical iterative engine based on multi-beam illumination,” Appl. Phys. Lett. 103(17), 171105 (2013). http://dx.doi.org/10.1063/1.4826273
31. J. A. Rodrigo et al., “Wavefield imaging via iterative retrieval based on phase modulation diversity,” Opt. Express 19(19), 18621–18635 (2011). http://dx.doi.org/10.1364/OE.19.018621
32. L. W. Whitehead et al., “Diffractive imaging using partially coherent x rays,” Phys. Rev. Lett. 103, 243902 (2009). http://dx.doi.org/10.1103/PhysRevLett.103.243902
33. Z. Jingshan et al., “Partially coherent phase imaging with simultaneous source recovery,” Biomed. Opt. Express 6(1), 257–265 (2015). http://dx.doi.org/10.1364/BOE.6.000257
34. L. Tian et al., “Multiplexed coded illumination for Fourier ptychography with an LED array microscope,” Biomed. Opt. Express 5(7), 2376–2389 (2014). http://dx.doi.org/10.1364/BOE.5.002376
35. M. Chen, L. Tian, and L. Waller, “3D differential phase contrast microscopy,” Biomed. Opt. Express 7(10), 3940–3950 (2016). http://dx.doi.org/10.1364/BOE.7.003940
36. D. Lee et al., “Color-coded LED microscopy for multi-contrast and quantitative phase-gradient imaging,” Biomed. Opt. Express 6(12), 4912–4922 (2015). http://dx.doi.org/10.1364/BOE.6.004912
37. C. Guo et al., “A fast-converging iterative method via weighted feedback for multi-distance diffractive imaging,” Sci. Rep. (2017).
38. D. Voelz, Computational Fourier Optics: A MATLAB Tutorial, SPIE Press, Bellingham, Washington (2011).
39. A. Anand et al., “Wavefront sensing with random amplitude mask and phase retrieval,” Opt. Lett. 32(11), 1584–1586 (2007). http://dx.doi.org/10.1364/OL.32.001584
40. A. K. Singh et al., “Scatter-plate microscope for lensless microscopy with diffraction limited resolution,” Sci. Rep. 7, 10687 (2017). http://dx.doi.org/10.1038/s41598-017-10767-3
41. O. Katz et al., “Non-invasive single-shot imaging through scattering layers and around corners via speckle correlations,” Nat. Photonics 8(10), 784–790 (2014). http://dx.doi.org/10.1038/nphoton.2014.189
42. M. A. Schofield and Y. Zhu, “Fast phase unwrapping algorithm for interferometric applications,” Opt. Lett. 28(14), 1194–1196 (2003). http://dx.doi.org/10.1364/OL.28.001194
43. W. Shi, Y. Zhu, and Y. Yao, “Discussion about the DCT/FFT phase-unwrapping algorithm for interferometric applications,” Optik 121, 1443–1449 (2010). http://dx.doi.org/10.1016/j.ijleo.2009.02.006
44. Z. Jingshan et al., “Transport of intensity phase imaging by intensity spectrum fitting of exponentially spaced defocus planes,” Opt. Express 22(9), 10661–10674 (2014). http://dx.doi.org/10.1364/OE.22.010661

Biography

Cheng Guo is currently a PhD student in the Department of Automatic Test and Control, Harbin Institute of Technology, under the supervision of Professor Zhengjun Liu. His research focuses on the development and application of iterative phase retrieval methods.

Qiang Li is currently a master’s student in the Department of Automatic Test and Control, Harbin Institute of Technology, under the supervision of Professor Jian Liu. His research mainly focuses on computational photography and image processing.

Xiaoqing Zhang is currently a PhD student at the School of Biological Science and Technology, Harbin Institute of Technology, under the supervision of Professor Huan Nie. His research focuses on glycomics.

Jiubin Tan is the head of the Precision Instrument Engineering School, Harbin Institute of Technology. He received his PhD from Harbin Institute of Technology in 1991. He is an academician of the Chinese Academy of Engineering. He is also a standing committee member of the International Committee on Measurements and Instrumentation, the chairman of the China Measuring Instrument Specialty Committee, the managing director of the China Instrument and Control Society, and the managing director of the Chinese Society for Measurement.

Shutian Liu is a professor in the Department of Physics, Harbin Institute of Technology. He has published more than 200 peer-reviewed journal articles in the field of optics and 1 book. His current research interests include optical information processing, optical information security, nonlinear optics, and quantum optics. He is a senior member of the Optical Society of America (OSA) and a fellow of the Chinese Physical Society.

Zhengjun Liu is a professor in the Department of Automatic Test and Control, Harbin Institute of Technology, China. He was honored by the Program for New Century Excellent Talents in University in 2012. He has published 97 peer-reviewed journal articles in the field of optics, 2 books, and 1 book chapter. He is a senior member of OSA and a member of IEEE. His current research interests include optical image processing and super-resolution imaging.

© 2018 Society of Photo-Optical Instrumentation Engineers (SPIE) 1083-3668/2018/$25.00
Cheng Guo, Qiang Li, Xiaoqing Zhang, JiuBin Tan, Shutian Liu, and Zhengjun Liu "Enhancing imaging contrast via weighted feedback for iterative multi-image phase retrieval," Journal of Biomedical Optics 23(1), 016015 (31 January 2018). https://doi.org/10.1117/1.JBO.23.1.016015
Received: 29 September 2017; Accepted: 10 January 2018; Published: 31 January 2018
KEYWORDS: Diffraction, Phase retrieval, Speckle, Reconstruction algorithms, Light emitting diodes, Imaging systems, Fiber lasers
