Open Access
Fourier ptychographic microscopy using wavelength multiplexing
14 June 2017
Abstract
Fourier ptychographic microscopy (FPM) is a recently developed technique that stitches low-resolution images together in the Fourier domain to realize wide-field high-resolution imaging. However, the time-consuming image acquisition process greatly narrows its applications in dynamic imaging. We report a wavelength multiplexing strategy to speed up the acquisition process of FPM severalfold. A proof-of-concept system is built to verify its feasibility. Distinguished from many current multiplexing methods in the Fourier domain, we explore the potential of high-speed FPM in the spectral domain. Compatible with most existing FPM methods, our strategy provides an approach to high-speed gigapixel microscopy. Several experimental results are also presented to validate the strategy.

1. Introduction

Comprehensive understanding and statistical analysis have become a major trend in various biomedical applications, such as cell division,1–3 tumor metastasis,4–6 and vesicular transport.7–9 This brings an urgent demand for high-throughput data acquisition, especially high-speed gigapixel imaging in microscopy. However, in a common microscopy system, the space–bandwidth product (SBP) is fundamentally limited to several megapixels, and an objective lens with a large SBP is hard and expensive to produce. To meet the requirement of high throughput, many methods10–14 have been proposed. However, the data acquisition speed of these methods remains a big challenge, which poses a great limitation on many in-vivo applications with rapid changes.

Among these methods,10–14 Fourier ptychographic microscopy (FPM)11 is a recently developed technique for wide-field high-resolution imaging. By simply replacing the traditional illumination with a programmable light-emitting diode (LED) array, FPM introduces multiangle coherent illumination to a conventional microscope for a larger SBP. The sample is successively illuminated by LEDs at different incident angles, and a sequence of images is collected correspondingly. As each captured image contains information of different spatial frequencies, the sequential images can be stitched together in the Fourier domain by applying a phase retrieval algorithm.11,15,16 FPM can achieve gigapixel imaging without mechanical scanning and can greatly expand the SBP of an objective lens with a low numerical aperture (NA).

The main limitation of the FPM technique is its time-consuming data acquisition, as mentioned above. Specifically, the original FPM system11 spends several minutes collecting 200 low-resolution images to recover the high-resolution complex field of the sample. This limitation makes FPM hardly able to observe dynamic samples and greatly narrows its applications in biological imaging. The methods14,17–22 proposed to solve this problem mainly fall into two categories: one reduces the redundant sampling,17–20 and the other performs information multiplexing in the Fourier domain.14,21,22 However, for the multiplexing approaches, the inverse problem of separating the aliased information within a single image is underdetermined. Therefore, the multiplexing ability is restricted, and rapid data acquisition in FPM is still hard to implement. Specifically, previous multiplexing approaches21,22 can only shorten the acquisition time from several minutes to tens of seconds. Tian et al.14 realized the observation of dynamic in-vitro samples with high-power LEDs and better control circuitry to speed up the synchronization between the LED array and the sCMOS camera, but there is still much room for improvement.

In this paper, we propose a wavelength multiplexing strategy for FPM. As different wavelengths pass independently through the optical elements, wavelength multiplexing can considerably enlarge the transmission capacity.23 Wavelength multiplexing has been used in optics for a long time, not only in wavelength-division multiplexed optical communications24,25 and spectrally encoded endoscopy systems26–28 but also in ultrafast real-time optical imaging29 and single-shot ultrafast tomographic imaging.30 This study is the first time wavelength multiplexing has been adapted to FPM to speed up its data acquisition process. We term it wavelength multiplexed Fourier ptychographic microscopy (WMFPM).31 Our strategy bypasses the limited optical SBP of the objective lens and makes the techniques of high-speed gigapixel imaging in macroscenes32 applicable to microscopy with a traditional objective lens. In the WMFPM strategy, we utilize different wavelengths to label different spatial frequencies of the sample. Both the data acquisition process and the reconstruction algorithm are redesigned to reduce the total number of illuminations to 1/N of the original, where N is the number of wavelengths. A proof-of-concept prototype system is built to validate the strategy. Several experimental results, including a resolution chart and kidney cells, are presented, along with numerical simulations for performance analysis. In addition, the multiplexing ability of the proposed WMFPM might be further improved, as it is theoretically compatible with most multiplexing techniques in the Fourier domain.14,21,22

2. Methods

2.1. Framework of Wavelength Multiplexed Fourier Ptychographic Microscopy

The comparison between the WMFPM strategy and the original FPM is shown in Fig. 1. An objective lens can be viewed as a low-pass filter in Fourier domain due to its limited NA. The original FPM method expands the NA of an objective lens using multiangle illumination. As such, the optical resolution is improved. Since an incident beam from a specific LED makes a certain lateral shift of the sample’s Fourier plane, the image sequence captured under various angular illuminations represents a temporal scanning in Fourier domain, as shown in Fig. 1(a).

Fig. 1

Extension from the original FPM to WMFPM. Specifically, (a) is the schematic of the original FPM and (b) is the schematic of WMFPM.

JBO_22_6_066006_f001.png

Distinguished from most multiplexed FPM techniques in the Fourier domain,14,21,22 we explore the potential of information multiplexing for FPM in the spectral domain. The WMFPM framework uses multicolor LEDs, each with narrowband wavelength coverage, to simultaneously illuminate different circular regions in a sample’s Fourier plane, as shown in Fig. 1(b). It introduces a superposition of several shifts of the sample’s Fourier plane in each measurement, resulting in a reduction of the acquisition time. For multiplexed FPM techniques in the Fourier domain,14,21,22 the inverse process of separating the multiplexed information aliased in a single image limits the multiplexing ability and requires more computation time for the optimization of the underdetermined inverse problem. In contrast, our WMFPM strategy labels the spatial frequency information in different circular regions with separated narrowband spectral channels (or wavelengths). As the captured data are intrinsically not aliased in these nonoverlapping spectral channels, they can be easily separated through optical or imaging devices, such as diffraction gratings and RGB cameras, without the aforementioned limitations. More comparisons between previous multiplexed FPM techniques in the Fourier domain14,21,22 and our WMFPM strategy are given in Sec. 4.3.

The WMFPM framework includes the following steps. First, the number of wavelengths in use is determined, and a sequence of illumination patterns with wavelength multiplexing is designed. An illumination pattern is a combination of several angular incident lights labeled by different wavelengths using a programmable multicolor LED array. The sequence of illumination patterns should ensure that a large area in Fourier domain of the sample can be covered completely. Second, the illumination patterns are lit up one by one, and an image sensor is used to capture corresponding multicolor data. Finally, the multicolor data are separated based on different wavelengths.
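As an illustration, the pattern-design step above can be sketched in a few lines. The 15×15 LED grid, the grouping into horizontal line units, and the R/G/B center wavelengths are assumptions borrowed from the prototype in Sec. 2.2, not a prescription:

```python
# Illustrative sketch of the WMFPM pattern-design step: each illumination
# pattern is a "line unit" of N adjacent LEDs, one per wavelength, so one
# exposure covers N circular regions of the Fourier plane at once. The
# 15x15 grid and the wavelengths (nm) are assumptions from the prototype.
def design_line_patterns(rows, cols, wavelengths):
    n = len(wavelengths)
    patterns = []
    for r in range(rows):
        for c0 in range(0, cols, n):
            # each entry pairs an LED grid position with its wavelength
            patterns.append([((r, c0 + k), wavelengths[k]) for k in range(n)])
    return patterns

patterns = design_line_patterns(15, 15, [632, 532, 472])
print(len(patterns))  # 75 exposures instead of 225
```

Any layout that tiles the desired Fourier coverage with nonoverlapping wavelength labels per exposure would serve equally well.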

An assumption of our WMFPM strategy is that the responses of many biological samples are nearly the same across different spectral channels, except for the wavelength-dependent resolution and the chromatic defocus aberration. Otherwise, the correspondence between the spectral domain and the Fourier domain would no longer be valid. Fortunately, a thin label-free phase object often satisfies this assumption. Previous studies33–36 have demonstrated phase retrieval techniques by tuning multiwavelength illumination for nondispersive objects, and even a phase imaging method under a white-light source has been proposed.37 These methods33–37 assume that the specimen maintains similar transmission properties across a large bandwidth. In our strategy, the differences of wavelength-dependent resolution vary the sizes of the circular regions in the Fourier plane. The chromatic defocus aberration, caused by the different propagation speeds of different wavelengths through the optical system, can be calibrated by introducing a phase-shift factor.

2.2. Representative Realization of Wavelength Multiplexed Fourier Ptychographic Microscopy

To validate the capability of the WMFPM, we build a proof-of-concept prototype system, called red, green, and blue (RGB) multiplexed FPM (RGB-MFPM). The schematic of the system is shown in Fig. 2(a). It is a simplified but representative realization of the aforementioned WMFPM strategy, which accomplishes a threefold reduction of the acquisition time. We utilize an LED array with three colors (RGB) to realize the “line pattern” illumination shown in Fig. 2(b). An RGB camera is used to collect the data labeled by the three colors simultaneously. Figure 2(c) is a photograph of the prototype system. The three-color (RGB) illumination enables the use of a common RGB camera instead of three monochromatic cameras with narrowband color filters. The RGB-camera system requires only relatively minor modification of the off-the-shelf FPM system11 without adding extra optical devices. In contrast, a monochromatic-camera system needs more cameras and additional light paths for beam splitting and wavelength filtering. Note that the intrinsic drawbacks introduced by the RGB camera (such as spectral cross talk, chromatic defocus aberration, and equivalent downsampling by the Bayer filter) can all be eliminated in our strategy; the solutions are discussed in Secs. 2.2.1, 2.2.4, and 4.1, respectively.

Fig. 2

The RGB-MFPM system: (a) the schematic of RGB-MFPM; (b) the 15×5 line-pattern illumination, with three color LEDs placed side-by-side in each line unit; (c) a photograph of the RGB-MFPM setup, where the close-up shows one line unit in which each LED is set to a certain color (red, green, or blue); (d) the relative luminous intensity curves of the LEDs and the quantum efficiency curves of the RGB camera (AVT Prosilica GT1290C).

JBO_22_6_066006_f002.png

Each LED unit is set to a certain color, as shown in the close-up of Fig. 2(c). According to the relative luminous intensity curves of the LEDs and the quantum efficiency curves of the RGB camera shown in Fig. 2(d), the LEDs have no spectral overlap, and the RGB camera has almost independent responses to the different color illuminations. However, the green channel has a relatively high response (0.12 quantum efficiency) to the blue LEDs, which may introduce cross talk and therefore decrease the quality of reconstruction. The problem can be solved by a preliminary calibration step, which is discussed in Sec. 2.2.1. Alternatively, using blue LEDs with shorter wavelengths (e.g., centered at 420 nm) might avoid the spectral cross talk.

The data acquisition process of RGB-MFPM works as follows: first, we light up the adjacent three-color (RGB) LEDs in a “line unit” simultaneously, as shown in the top left of Fig. 2(b). Second, the 15×5 line patterns are sequentially lit up from top left to bottom right, as shown in Fig. 2(b); the corresponding Fourier space coverage is shown in the middle right of Fig. 2(a). Third, we use an RGB camera to capture a sequence of RGB images of the sample under each line unit illumination. An improved algorithm is designed for reconstruction, and before reconstruction some preprocessing steps are conducted and wavelength-dependent parameters are introduced as follows.

2.2.1. Multiwavelength separation

As shown in Fig. 2(a), once a line unit is illuminated, we capture the multiplexed information of three different Fourier areas labeled by three different wavelengths. However, the spectral response of a color channel is not completely isolated from the others, which causes the cross talk problem as described earlier. In particular, the blue LED light will be detected by the green channel in our real setup. To mitigate the problem of color leakage or spectral cross talk, we use the color-leakage correction method.38 In the method, the signal measured in a color channel is expressed as the sum of the light of the desired color and the lights of other colors, which can be written as

Eq. (1)

$$\begin{bmatrix} \tilde{I}_R \\ \tilde{I}_G \\ \tilde{I}_B \end{bmatrix} = \begin{bmatrix} \beta_{RR} & \beta_{GR} & \beta_{BR} \\ \beta_{RG} & \beta_{GG} & \beta_{BG} \\ \beta_{RB} & \beta_{GB} & \beta_{BB} \end{bmatrix} \begin{bmatrix} I_R \\ I_G \\ I_B \end{bmatrix},$$

where $I_m$ represents the real response of the $m$ channel to the $m$-color LED light and $\tilde{I}_m$ is the signal (with color leakage) measured in the $m$ channel of the detector ($m = R$, $G$, $B$). The coefficient $\beta_{nm}$ represents the detector response of the $m$ channel to the $n$-color LED light ($m, n = R$, $G$, $B$) and can easily be measured by the $m$ channel of the detector under the $n$-color LED light without the sample. Then, the real measurement of the light intensity at each color can be calculated as

Eq. (2)

$$\begin{bmatrix} I_R \\ I_G \\ I_B \end{bmatrix} = \begin{bmatrix} \beta_{RR} & \beta_{GR} & \beta_{BR} \\ \beta_{RG} & \beta_{GG} & \beta_{BG} \\ \beta_{RB} & \beta_{GB} & \beta_{BB} \end{bmatrix}^{-1} \begin{bmatrix} \tilde{I}_R \\ \tilde{I}_G \\ \tilde{I}_B \end{bmatrix}.$$

By this method, we can separate an RGB image into three gray ones, as shown in the top of Fig. 2(a). Finally, the 75 color images are expanded to 225 gray ones: 75 images under each of the R, G, and B LED illuminations.
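A minimal numerical sketch of this correction, with an illustrative (not calibrated) response matrix that includes the 0.12 blue-to-green leakage mentioned above:

```python
import numpy as np

# Color-leakage correction in the spirit of Eqs. (1) and (2). B[m, n] is
# the response of detector channel m to the n-color LED, measured without
# a sample; the values below are illustrative assumptions.
B = np.array([[1.00, 0.02, 0.00],
              [0.00, 1.00, 0.12],   # green channel leaks from blue LEDs
              [0.00, 0.01, 1.00]])

def unmix_channels(i_measured, b=B):
    """Invert Eq. (1): recover the per-color intensities I from the
    measured (leaky) channel signals; i_measured has shape (3,) or (3, N)."""
    return np.linalg.solve(b, i_measured)
```

In practice, each β entry is measured per setup, and the inverse is applied pixelwise to every captured RGB frame.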

2.2.2. Contrast balance

A contrast gap exists in the separated R/G/B data because the illumination intensity, the transmittance of the optical devices, and the camera responses to the R/G/B LEDs are all nonuniform. Thus, contrast balance is performed by normalizing each of the three groups of measurements so that the image values in each group fall in [0,1].
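A sketch of this normalization, assuming each group of same-wavelength images is scaled jointly (so relative intensities within a group are preserved):

```python
import numpy as np

# Contrast balance: jointly rescale one group of same-wavelength images
# to [0, 1], removing the intensity gap among the separated R/G/B groups.
def contrast_balance(group):
    stack = np.asarray(group, dtype=float)
    lo, hi = stack.min(), stack.max()
    return (stack - lo) / (hi - lo)
```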

2.2.3. Parameters calculation

The coherent optical transfer function in Fourier space of the objective lens is assumed as a circular pupil. The size (radius) and the central position of each pupil are both wavelength dependent, as shown in the middle right of Fig. 2(a). The radius equals NAobj×k0, where k0=2π/λ is the wavenumber in a vacuum. The central position is related to the incident angle of each LED. Specifically, a thin sample illuminated by an oblique plane wave with a wavevector (kx,ky) is equivalent to a (kx,ky) shift of the sample’s central spectrum in Fourier domain, so the central position can be calculated as

Eq. (3)

$$(k_x^i, k_y^i) = \frac{2\pi}{\lambda}\left(\frac{x_c - x_i}{s}, \frac{y_c - y_i}{s}\right),$$

with $s = \sqrt{(x_c - x_i)^2 + (y_c - y_i)^2 + h^2}$. Here, $\lambda$ refers to the central wavelength of illumination, $(x_c, y_c)$ is the central position of the sample’s image plane, $(x_i, y_i)$ is the position of the $i$’th LED, and $h$ is the distance between the LED array and the sample.
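The wavelength-dependent pupil parameters can be computed directly from Eq. (3); the following sketch assumes all lengths share one consistent unit system:

```python
import math

# Per-LED parameters: pupil radius NA_obj * k0 and the Fourier-plane
# center shift of Eq. (3). All lengths must use the same units.
def pupil_params(wavelength, na_obj, led_pos, center, h):
    xc, yc = center          # center of the sample's image plane
    xi, yi = led_pos         # position of the i'th LED
    k0 = 2 * math.pi / wavelength
    s = math.sqrt((xc - xi) ** 2 + (yc - yi) ** 2 + h ** 2)
    radius = na_obj * k0
    shift = (k0 * (xc - xi) / s, k0 * (yc - yi) / s)
    return radius, shift
```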

2.2.4. Chromatic defocus calibration

Due to different propagation speeds of multiwavelength lights through samples or lenses, a wavelength-dependent phase shift (chromatic defocus aberration) will occur. Therefore, we cannot simultaneously capture in focus R/G/B data. In implementation, we record the chromatic defocus aberration and add a wavelength-dependent phase-shift factor into the reconstruction steps using the following equation:

Eq. (4)

$$e^{i\varphi(k_x, k_y)} = e^{i\sqrt{(2\pi/\lambda)^2 - k_x^2 - k_y^2}\,z_0},$$

where $k_x^2 + k_y^2 < (\mathrm{NA}_{\mathrm{obj}} \cdot 2\pi/\lambda)^2$. Here, $(k_x, k_y)$ refers to the aforementioned position shift of the sample’s spectrum center, which is caused by the incident angle of the illumination, $\mathrm{NA}_{\mathrm{obj}}$ is the numerical aperture of the objective lens, and $z_0$ is the estimated defocus distance.
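A sketch of this calibration step; the defocus distance z0 is estimated per color channel in practice, and the values used below are illustrative:

```python
import numpy as np

# Chromatic defocus calibration: the phase-shift factor of Eq. (4),
# valid inside the pupil where kx^2 + ky^2 < (2*pi/lambda * NA_obj)^2.
def defocus_factor(kx, ky, wavelength, z0):
    kz = np.sqrt((2 * np.pi / wavelength) ** 2 - kx ** 2 - ky ** 2)
    return np.exp(1j * kz * z0)
```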

2.2.5. Phase retrieval algorithm

Generally, the sample’s phase Δϕ is defined as 2π·n·d/λ, where n is the refractive index, d is the thickness of the sample, and λ is the wavelength. We convert the phase distribution into a thickness profile (or optical path length, OPL=n·d) to avoid the wavelength-dependent problem.35 During reconstruction, several modifications are made based on the embedded pupil function recovery algorithm.16 As shown in Fig. 3, the flowchart demonstrates the main recovery procedures of RGB-MFPM. After the multiwavelength separation and the contrast balance, the high-resolution complex field of the sample can be recovered by incorporating the wavelength-dependent parameters and phase-shift factors into the recovery algorithm. Compared with traditional FPM, we can achieve about 3 times reduction in acquisition time.
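The wavelength-independent representation used above amounts to a simple rescaling; a sketch for a nondispersive sample:

```python
from math import pi

# Convert a recovered phase map (radians) at one wavelength to an optical
# path length OPL = n*d = phase * wavelength / (2*pi); for a nondispersive
# sample, all color channels then recover the same thickness profile.
def phase_to_opl(phase, wavelength):
    return phase * wavelength / (2 * pi)

def opl_to_phase(opl, wavelength):
    return 2 * pi * opl / wavelength
```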

Fig. 3

The recovery procedures of RGB-MFPM. The flowchart illustrates the algorithm of RGB-MFPM based on line pattern illumination.

JBO_22_6_066006_f003.png

3. Results

3.1. Experimental Setup

For the prototype system, we use a three-color LED array with the line-pattern layout in Fig. 2(b) to illuminate the sample, an RGB camera (AVT Prosilica GT1290C, with a pixel size of 3.75 μm, a pixel count of 1280(H)×960(V), and a bit depth of 12 bits in color mode) to capture the raw data, and a low-NA objective (NA=0.1) for observation. The central wavelengths of the LEDs are 632, 532, and 472 nm, and their bandwidths are all about 20 nm. The defocus aberration of the R/G/B channels is estimated in each setup, and corresponding phase-shift factors are then applied in the reconstruction algorithm. To compare with the original FPM framework, we capture 75 low-resolution RGB images each time to reconstruct the high-resolution results.

3.2. Validation of the Proposed Approach

As shown in Fig. 4, a USAF-1951 resolution chart is imaged to show the resolution improvement based on the prototype system. Figure 4(a) shows the Fourier area captured in one shot of our WMFPM method (three circles) compared with the single circle of the original FPM. In other words, the original method needs to scan all the overlapped circles one by one, but our method scans three circles each time, a threefold improvement in optical throughput for each low-resolution measurement. One of the multicolor images and its separated single-channel data are shown in Fig. 4(b). Figure 4(c) is the recovered high-resolution intensity image of RGB-MFPM, compared with the results of the original FPM with monochromatic LEDs in Figs. 4(d)–4(f). Note that the traditional FPM results in this article [both the USAF-1951 chart in Figs. 4(d)–4(f) and the kidney cells in Figs. 5(b)–5(d)] were all captured using the same RGB camera for comparison. As a result, the recovered intensity with green light is the best because the green channel has the maximum occupation ratio in the Bayer filter of the RGB camera. In contrast, the recovered results with red and blue illumination have more artifacts.

Fig. 4

The experimental recovery of the USAF-1951 resolution chart. (a) shows the comparison of the one-shot sampling in the Fourier plane between our method (three circles) and the original FPM (1 circle); (b) are the raw data, including the raw color image (left), the separated gray image (middle), and its close-up (right); (c) is the recovered result using RGB-MFPM, including the recovered high-resolution intensity and the close-up; and (d), (e), and (f) are the single-color (red, green, and blue, respectively) results and their close-ups using the original FPM.

JBO_22_6_066006_f004.png

Fig. 5

The experimental recovery of kidney cells. (a) are the raw data of 75 RGB images (left), one of the RGB image and its separated gray image (middle), and the close-up of the gray image (right); (b), (c), and (d) represent the recovered high-resolution phase images (middle) and their close-ups (bottom) under single color illuminations, respectively; and (e) shows the recovered phase images using RGB-MFPM (middle) and the close-up of the phase result (bottom).

JBO_22_6_066006_f005.png

To be more specific, 225 images with high dynamic range (HDR) take about 1.332×225 s (about 300 s in total) in the original FPM setup. By comparison, RGB-MFPM needs only 75 images, which take about 1.404×75 s (about 100 s in total). The data acquisition time can thus be reduced to about 1/3 of the original method, and even further by combining our framework with multiplexing methods in the Fourier plane. The resolution improvement is over 3.15 times, from 4.92 to 1.56 μm (group 7 element 5 to group 9 element 3). As a result, our RGB-MFPM achieves not only a threefold reduction in acquisition time but also better intensity reconstruction. From the comparison, we find that our method has fewer artifacts due to the higher light efficiency of simultaneous multicolor LED illumination.

3.3. Experiments on Kidney Cells

The recovery of kidney cells and the comparison with the original FPM are shown in Fig. 5. The raw data of 75 color images of kidney cells captured by an RGB camera are shown in the left panel of Fig. 5(a); one of the low-resolution color images with its separated gray image (middle) and the close-up of the gray image (right) are also presented. Figures 5(b)–5(d) are the phase results recovered with traditional monochromatic LEDs (R, G, and B light, successively). Figure 5(e) is the recovered phase image of RGB-MFPM, together with its close-up. As a phase sample, little can be seen in the captured low-resolution intensity images except for some cell outlines. The slight differences among the phase recoveries in Fig. 5 arise mainly from the discrepancy of the wavelength-dependent resolutions. The intensity-only initialization of the object’s high-resolution Fourier spectrum may not provide a good starting point for the reconstruction algorithm, which causes a few artifacts in all phase results. More robust phase reconstruction might be achieved by using a phase solution based on differential phase contrast deconvolution as a close initial guess.14 From the experimental results, our method achieves performance comparable to the original FPM with only 1/3 of the acquisition time (75 images compared with 225).

4. Discussion

4.1. Influence of Sparse Sampling by Bayer Filter

Although the pixel size of the utilized RGB camera matches the optical resolution, the data captured by the RGB camera are actually downsampled due to the Bayer filter in front of the sensor. Fortunately, the data of FPM (and WMFPM) are highly redundant, with large overlaps in the Fourier domain. This provides enough information for high-resolution recovery even from sparse data in the spatial domain.17 The Bayer filter can be viewed as a set of different spatial masks for the R/G/B channels. The sparse spatial sampling introduced by these masks can be compensated by the algorithm thanks to the large overlap (about 60%) in the frequency domain. The experimental results in Figs. 4 and 5 verify this ability and accord with the results of a previous paper,17 which showed that the FP algorithm can recover the complex image with up to 70% empty pixels (random mask) at a root-mean-square error (RMSE) of 0.01. For RGB-MFPM, the sampling masks repeatedly switch among the red mask (75% empty pixels), green mask (50% empty pixels), and blue mask (75% empty pixels), with 66.7% empty pixels on average.
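The empty-pixel fractions quoted above follow directly from the mosaic geometry; a sketch assuming an RGGB Bayer layout (other layouts permute the same masks):

```python
import numpy as np

# Binary sampling masks of an RGGB Bayer mosaic: True where the sensor
# records that color. R and B each fill 25% of pixels (75% empty) and G
# fills 50%, giving 2/3 empty pixels on average over the three channels.
def bayer_masks(h, w):
    r, c = np.indices((h, w))
    red = (r % 2 == 0) & (c % 2 == 0)
    green = (r + c) % 2 == 1
    blue = (r % 2 == 1) & (c % 2 == 1)
    return red, green, blue
```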

Here, we use a numerical simulation to further validate that the FPM method can overcome the resolution loss caused by the mosaic of the Bayer filter, as shown in Fig. 6. Different sparse sampling masks are compared in terms of a quantitative measure of image quality. The simulation parameters are the same as those in the experiment. From the recovered phase images in Fig. 6(a) and the RMSE curve in Fig. 6(b), we find that recovery with the RGB-MFPM mask induced by the Bayer filter still achieves reasonably good results. The missing spatial pixels are well compensated by the large overlap in the Fourier domain. In particular, using our RGB-MFPM mask, we obtain the phase image with 0.015 RMSE, as shown in Fig. 6(b). Figure 6(c) shows the different masks used in the simulation.

Fig. 6

The influence of sparse sampling by Bayer filter. (a) shows the phase recovery results with different masks, which indicate that the quality decreases correspondingly to the increase of empty pixel percents; (b) is the RMSE curve of different masks, and the upright triangle demonstrates that the phase recovery using our RGB-MFPM mask has a sustainable result with 0.015 RMSE; and (c) shows the random mask, color masks, and our RGB-MFPM mask.

JBO_22_6_066006_f006.png

Moreover, in a single-camera system, such as the RGB-MFPM system, we can also use a Foveon X3 sensor39,40 rather than a traditional RGB sensor to avoid the sparse sampling problem. The Foveon X3 sensor layers R/G/B sensors vertically rather than using a mosaic of separate detectors for three (RGB) colors.

4.2. From Single-Camera System to Camera-Array-Based Wavelength Multiplexed Fourier Ptychographic Microscopy

As the multiplexing ability of WMFPM is equivalent to the number of different wavelengths in use, more multiwavelength LEDs can be used to get a greater reduction in acquisition time. To our knowledge, at least 13 kinds of LEDs with different wavelengths (or more precisely: narrowband nonoverlapping spectral channels) are commercially available. However, to avoid the cross talk between different channels, the RGB camera can only afford three channels within a single shot.

In addition, in the single camera system, such as RGB-MFPM, a trade-off exists between the number of wavelengths and the sampling resolution induced by the color filters (the Bayer filter here). Though the downsampled mask has little influence on the final performance of the reconstruction, it will cause some artifacts and reduce the light efficiency.

To make the WMFPM possible for high-speed gigapixel imaging in microscopy, a camera-array41-based WMFPM (CA-WMFPM) can be introduced to get around the aforementioned problems. As shown in Fig. 7, the system utilizes a multicolor (more than three colors) LED array to generate multiplexed illuminations with many different wavelengths. A diffraction grating is applied at the image plane in the detection path to separate the multiplexed information by different wavelengths. Then a camera array is employed to capture the separated data in real time. We use an LED array with 10 different wavelengths and a line-arranged illumination pattern as an example in Fig. 7. Both the multiplexing number and patterns are adjustable and related to the accessible narrowband LEDs in the market.

Fig. 7

The CA-WMFPM system. The schematic setup shown in the figure is the structure of the CA-WMFPM. The LED array has more than three different colors (10 in this system), and a special pattern (a line here for simplicity) is illuminated in each exposure, which represents a multiplexed scanning in the Fourier plane. A diffractive grating is used to separate the multiplexed information of different wavelengths in one direction, and a camera array together with a lens array is used to capture the raw data.

JBO_22_6_066006_f007.png

In addition to the much higher speed compared with the single-camera WMFPM, the CA-WMFPM system has several other benefits. The camera array uses gray cameras rather than an RGB one, which improves the photon efficiency and avoids the downsampling problem induced by the color filter. The contrast balance for different spectral channels can be adjusted in advance by setting different exposure times, and there is no need for the multiexposure measurements of the HDR process. Moreover, the chromatic defocus aberration can be corrected by setting the focal position of each camera independently.

The limitations of the CA-WMFPM are mainly the large size, high cost, and complex setup of the system. However, a camera-array-based system is still required for many high-throughput applications.32 To realize high-speed gigapixel imaging in microscopy, the most expensive and challenging problem is to design an optical setup (especially the objective lens) that affords an SBP of gigapixels. Our method provides a way to realize high-speed gigapixel imaging with a normal objective lens. Therefore, the CA-WMFPM system can greatly benefit applications requiring high-throughput observation, such as research on cell division and drug discovery. Moreover, image detectors are steadily becoming smaller and cheaper, which may somewhat ease the above cost problem.

Furthermore, the design of an optimal pattern for multiwavelength illumination is a meaningful topic for future work, deserving more detailed numerical analysis and experimental verification. In addition, many works have applied FPM to thick samples for three-dimensional (3-D) refocusing42 and 3-D imaging.43 Our CA-WMFPM system is also compatible with these methods. Finally, we could further relax the assumption of a similar spectral response of the sample by replacing the multiwavelength LED array with a previously proposed wavelength-scanning scheme in a narrow spectral band,44 where a fiber-coupled tunable light source realizes 12 different illumination wavelengths between 480 and 513 nm.

4.3. Comparison Among Existing Multiplexed Fourier Ptychographic Microscopy Methods and the Wavelength Multiplexed Fourier Ptychographic Microscopy Strategy

Existing multiplexing techniques mainly work in the Fourier domain. In these techniques,14,21,22 the intensity information corresponding to several different frequency areas is mixed in a single image. Several methods14,21 multiplex nonaliasing frequency information, which has no overlap in Fourier space, as shown in Fig. 8(a). Others22 multiplex aliasing frequency information, which has overlaps in Fourier space, as shown in Fig. 8(b). The information multiplexing ability of these methods is limited: an extreme example is that phase retrieval cannot be realized with all the LEDs on simultaneously, since in that case there is no diversity to provide phase contrast. The multiplexing factor commonly used is 8 in the nonoverlapping case14,21 and 2 in the overlapping case.22

Fig. 8

Illumination patterns of different multiplexed FP approaches. (a) (top) Random pattern multiplexed coded illumination,21 and four randomly chosen LEDs are turned on for each measurement; (bottom) Fourier coverage of the sample’s Fourier space for each LED pattern; (b) (top) state-multiplexed (or adjacent position multiplexed) illumination22 and (bottom) its Fourier coverage; (c) (top) line pattern illumination of the WMFPM strategy, and a line unit of adjacent three color (RGB) LEDs is turned on each time; (bottom) its Fourier coverage. The center unshaded circle (in its Fourier coverage) represents the NA of the objective lens.

JBO_22_6_066006_f008.png

Distinguished from existing methods working in this way, our WMFPM strategy labels different frequency information with separated narrowband spectral channels, as shown in Fig. 8(c). The wavelength multiplexing intrinsically bypasses the limitation of information multiplexing in the Fourier domain. Owing to the narrow bandwidth of the commercial LEDs in use, both our RGB-MFPM system and the recovery algorithm are easy to implement. The most attractive advantage of our method is its compatibility with most existing methods. Although the photosensitivity of the RGB sensor in the prototype system is lower than that of a gray camera, we can use the CA-WMFPM system or a Foveon X3 sensor-based WMFPM system to eliminate this problem in the future.

In addition, the WMFPM strategy differs from the spectral multiplexing technique.22 Although both illuminate the sample from different incident angles with a multiwavelength LED array, our method uses wavelength multiplexing to improve the acquisition speed, whereas they use a monochrome camera (rather than an RGB camera or color filters) together with an improved phase retrieval algorithm (the state-multiplexed FP algorithm) to achieve multispectral imaging.

5.

Conclusion

We accelerate the data acquisition of FPM by introducing a wavelength multiplexing strategy. The proposed WMFPM strategy is compatible with most existing FPM methods. A proof-of-concept prototype, termed the RGB-MFPM system, was built to validate the method. Experiments with a USAF-1951 resolution chart and kidney cells demonstrate a threefold improvement in acquisition speed and better reconstruction results than earlier studies. In the future, by implementing a CA-WMFPM system using multiwavelength LEDs, a diffraction grating, and a camera array, we expect to realize high-speed gigapixel microscopy with a low-cost objective lens.

Disclosures

The authors have no relevant financial interests in this article and no other potential conflicts of interest to disclose. A small part of this work was previously published in the conference proceedings of Imaging and Applied Optics 2016.31

Acknowledgments

This work was supported by the National Natural Science Foundation of China (Nos. 61327902 and 61120106003) and the Beijing Key Laboratory of Multidimension and Multiscale Computational Photography. G. Zheng was supported by the National Science Foundation Chemical, Bioengineering, Environmental, and Transport Systems (No. 1510077).

References

1. T. Das et al., "In vivo time-lapse imaging of cell divisions during neurogenesis in the developing zebrafish retina," Neuron 37(4), 597–609 (2003). http://dx.doi.org/10.1016/S0896-6273(03)00066-7

2. M. Wu et al., "Imaging hematopoietic precursor division in real time," Cell Stem Cell 1(5), 541–554 (2007). http://dx.doi.org/10.1016/j.stem.2007.08.009

3. M. R. Costa et al., "Continuous live imaging of adult neural stem cell division and lineage progression in vitro," Development 138(6), 1057–1068 (2011). http://dx.doi.org/10.1242/dev.061663

4. M. Yang et al., "Whole-body optical imaging of green fluorescent protein-expressing tumors and metastases," Proc. Natl. Acad. Sci. U. S. A. 97(3), 1206–1211 (2000). http://dx.doi.org/10.1073/pnas.97.3.1206

5. E. B. Voura et al., "Tracking metastatic tumor cell extravasation with quantum dot nanocrystals and fluorescence emission-scanning microscopy," Nat. Med. 10(9), 993–998 (2004). http://dx.doi.org/10.1038/nm1096

6. E. N. Savariar et al., "Real-time in vivo molecular detection of primary tumors and metastases with ratiometric activatable cell-penetrating peptides," Cancer Res. 73(2), 855–864 (2013). http://dx.doi.org/10.1158/0008-5472.CAN-12-2969

7. A. Bhattacharyya et al., "High-resolution imaging demonstrates dynein-based vesicular transport of activated Trk receptors," J. Neurobiol. 51(4), 302–312 (2002). http://dx.doi.org/10.1002/(ISSN)1097-4695

8. V. Westphal et al., "Video-rate far-field optical nanoscopy dissects synaptic vesicle movement," Science 320(5873), 246–249 (2008). http://dx.doi.org/10.1126/science.1154228

9. G. Schudt et al., "Live-cell imaging of Marburg virus-infected cells uncovers actin-dependent transport of nucleocapsids over long distances," Proc. Natl. Acad. Sci. U. S. A. 110(35), 14402–14407 (2013). http://dx.doi.org/10.1073/pnas.1307681110

10. G. Zheng et al., "The ePetri dish, an on-chip cell imaging platform based on subpixel perspective sweeping microscopy (SPSM)," Proc. Natl. Acad. Sci. U. S. A. 108(41), 16889–16894 (2011). http://dx.doi.org/10.1073/pnas.1110681108

11. G. Zheng, R. Horstmeyer and C. Yang, "Wide-field, high-resolution Fourier ptychographic microscopy," Nat. Photonics 7(9), 739–745 (2013). http://dx.doi.org/10.1038/nphoton.2013.187

12. A. Greenbaum et al., "Wide-field computational imaging of pathology slides using lens-free on-chip microscopy," Sci. Transl. Med. 6(267), 267ra175 (2014). http://dx.doi.org/10.1126/scitranslmed.3009850

13. W. Luo et al., "Synthetic aperture-based on-chip microscopy," Light Sci. Appl. 4(3), e261 (2015). http://dx.doi.org/10.1038/lsa.2015.34

14. L. Tian et al., "Computational illumination for high-speed in vitro Fourier ptychographic microscopy," Optica 2(10), 904–911 (2015). http://dx.doi.org/10.1364/OPTICA.2.000904

15. X. Ou et al., "Quantitative phase imaging via Fourier ptychographic microscopy," Opt. Lett. 38(22), 4845–4848 (2013). http://dx.doi.org/10.1364/OL.38.004845

16. X. Ou, G. Zheng and C. Yang, "Embedded pupil function recovery for Fourier ptychographic microscopy," Opt. Express 22(5), 4960–4972 (2014). http://dx.doi.org/10.1364/OE.22.004960

17. S. Dong et al., "Sparsely sampled Fourier ptychography," Opt. Express 22(5), 5455–5464 (2014). http://dx.doi.org/10.1364/OE.22.005455

18. K. Guo et al., "Optimization of sampling pattern and the design of Fourier ptychographic illuminator," Opt. Express 23(5), 6171–6180 (2015). http://dx.doi.org/10.1364/OE.23.006171

19. L. Bian et al., "Content adaptive illumination for Fourier ptychography," Opt. Lett. 39(23), 6648–6651 (2014). http://dx.doi.org/10.1364/OL.39.006648

20. Y. Zhang et al., "Self-learning based Fourier ptychographic microscopy," Opt. Express 23(14), 18471–18486 (2015). http://dx.doi.org/10.1364/OE.23.018471

21. L. Tian et al., "Multiplexed coded illumination for Fourier ptychography with an LED array microscope," Biomed. Opt. Express 5(7), 2376–2389 (2014). http://dx.doi.org/10.1364/BOE.5.002376

22. S. Dong et al., "Spectral multiplexing and coherent-state decomposition in Fourier ptychographic imaging," Biomed. Opt. Express 5(6), 1757–1767 (2014). http://dx.doi.org/10.1364/BOE.5.001757

23. H. O. Bartelt, "Wavelength multiplexing for information transmission," Opt. Commun. 27(3), 365–368 (1978). http://dx.doi.org/10.1016/0030-4018(78)90400-5

24. A. Banerjee et al., "Wavelength-division-multiplexed passive optical network (WDM-PON) technologies for broadband access: a review [invited]," J. Opt. Networking 4(11), 737–758 (2005). http://dx.doi.org/10.1364/JON.4.000737

25. P. J. Winzer, "Making spatial multiplexing a reality," Nat. Photonics 8(5), 345–348 (2014). http://dx.doi.org/10.1038/nphoton.2014.58

26. G. J. Tearney, M. Shishkov and B. E. Bouma, "Spectrally encoded miniature endoscopy," Opt. Lett. 27(6), 412–414 (2002). http://dx.doi.org/10.1364/OL.27.000412

27. D. Yelin et al., "Three-dimensional miniature endoscopy," Nature 443(7113), 765 (2006). http://dx.doi.org/10.1038/443765a

28. D. Yelin, B. E. Bouma and G. J. Tearney, "Volumetric sub-surface imaging using spectrally encoded endoscopy," Opt. Express 16(3), 1748–1757 (2008). http://dx.doi.org/10.1364/OE.16.001748

29. K. Goda, K. K. Tsia and B. Jalali, "Serial time-encoded amplified imaging for real-time observation of fast dynamic phenomena," Nature 458(7242), 1145–1149 (2009). http://dx.doi.org/10.1038/nature07980

30. N. H. Matlis, A. Axley and W. P. Leemans, "Single-shot ultrafast tomographic imaging by spectral multiplexing," Nat. Commun. 3, 1111 (2012). http://dx.doi.org/10.1038/ncomms2120

31. Y. Zhou et al., "Wavelength multiplexed Fourier ptychographic microscopy," in Computational Optical Sensing and Imaging (2016).

32. D. J. Brady et al., "Multiscale gigapixel photography," Nature 486(7403), 386–389 (2012). http://dx.doi.org/10.1038/nature11150

33. D. W. Noom, K. S. Eikema and S. Witte, "Lensless phase contrast microscopy based on multiwavelength Fresnel diffraction," Opt. Lett. 39(2), 193–196 (2014). http://dx.doi.org/10.1364/OL.39.000193

34. D. W. Noom et al., "High-speed multi-wavelength Fresnel diffraction imaging," Opt. Express 22(25), 30504–30511 (2014). http://dx.doi.org/10.1364/OE.22.030504

35. M. Sanz et al., "Improved quantitative phase imaging in lensless microscopy by single-shot multi-wavelength illumination using a fast convergence algorithm," Opt. Express 23(16), 21352–21365 (2015). http://dx.doi.org/10.1364/OE.23.021352

36. D. Lee et al., "Color-coded LED microscopy for multi-contrast and quantitative phase-gradient imaging," Biomed. Opt. Express 6(12), 4912–4922 (2015). http://dx.doi.org/10.1364/BOE.6.004912

37. L. Waller et al., "Phase from chromatic aberrations," Opt. Express 18(22), 22817–22825 (2010). http://dx.doi.org/10.1364/OE.18.022817

38. W. Lee et al., "Single-exposure quantitative phase imaging in color-coded LED microscopy," Opt. Express 25(7), 8398–8411 (2017). http://dx.doi.org/10.1364/OE.25.008398

39. P. M. Hubel, J. Liu and R. J. Guttosch, "Spatial frequency response of color image sensors: Bayer color filters and Foveon X3," Proc. SPIE 5301, 402 (2004). http://dx.doi.org/10.1117/12.561568

40. J. Lukas, J. Fridrich and M. Goljan, "Determining digital image origin using sensor imperfections," Proc. SPIE 5685, 249 (2005). http://dx.doi.org/10.1117/12.587105

41. X. Lin et al., "Camera array based light field microscopy," Biomed. Opt. Express 6(9), 3179–3189 (2015). http://dx.doi.org/10.1364/BOE.6.003179

42. S. Dong et al., "Aperture-scanning Fourier ptychography for 3D refocusing and super-resolution macroscopic imaging," Opt. Express 22(11), 13586–13599 (2014). http://dx.doi.org/10.1364/OE.22.013586

43. L. Tian and L. Waller, "3D intensity and phase imaging from light field measurements in an LED array microscope," Optica 2(2), 104–111 (2015). http://dx.doi.org/10.1364/OPTICA.2.000104

44. W. Luo et al., "Pixel super-resolution using wavelength scanning," Light Sci. Appl. 5(4), e16060 (2016). http://dx.doi.org/10.1038/lsa.2016.60

Biography

You Zhou is a PhD candidate at the Department of Automation, Tsinghua University, Beijing, China. His research interests include Fourier ptychographic imaging and lensless imaging.

Qionghai Dai is currently a professor at the Department of Automation, Tsinghua University, Beijing, China. He obtained his PhD in 1996 from Northeastern University, Liaoning province, China. His research interests include microscopy imaging for life science, computational photography, computer vision, and 3-D video.

Biographies for the other authors are not available.

© 2017 Society of Photo-Optical Instrumentation Engineers (SPIE) 1083-3668/2017/$25.00
You Zhou, Jiamin Wu, Zichao Bian, Jinli Suo, Guoan Zheng, and Qionghai Dai "Fourier ptychographic microscopy using wavelength multiplexing," Journal of Biomedical Optics 22(6), 066006 (14 June 2017). https://doi.org/10.1117/1.JBO.22.6.066006
Received: 19 January 2017; Accepted: 22 May 2017; Published: 14 June 2017
Keywords

Microscopy, Multiplexing, Image processing, Image acquisition