Open Access | 1 May 2011
Real-time snapshot hyperspectral imaging endoscope
Abstract
Hyperspectral imaging has tremendous potential to detect important molecular biomarkers of early cancer based on their unique spectral signatures. Several drawbacks have limited its use for in vivo screening applications: most notably the poor temporal and spatial resolution, high expense, and low optical throughput of existing hyperspectral imagers. We present the development of a new real-time hyperspectral endoscope (called the image mapping spectroscopy endoscope) based on an image mapping technique capable of addressing these challenges. The parallel, high-throughput nature of this technique enables the device to operate at frame rates of 5.2 frames per second while collecting an (x, y, λ) datacube of 350 × 350 × 48. We have successfully imaged tissue in vivo, resolving the vasculature pattern of the lower lip while simultaneously detecting oxy-hemoglobin.

1. Introduction

Early diagnosis of cancer can significantly increase the chance of survival and improve quality of life post-treatment.1 However, most cancers are not detected until later stages (III or IV). Biophotonics-based in vivo diagnostics such as optical imaging offer potential as an early cancer detection technology because measurements can be taken in real time and are noninvasive, high resolution, and inexpensive. To date, most cancer screening techniques rely on white light visual examination using the human eye or a digital color camera. These techniques capture large-scale morphologic and architectural details of the tissue but miss many of the more subtle clues pertaining to early stage cancer development, such as metabolic activity. To address this issue, researchers are now actively exploring the combination of spectroscopy with widefield imaging. Point spectroscopy studies have already shown that important endogenous early cancer biomarkers such as nicotinamide adenine dinucleotide plus hydrogen (NADH), flavin adenine dinucleotide (FAD), collagen, and oxy- and deoxy-hemoglobin have distinct fluorescence- and reflectance-based spectral signatures.2, 3, 4 The combination of these two complementary techniques may improve the ability of current screening devices to correctly identify suspicious tissue sites. Ongoing clinical studies have shown that multispectral imaging approaches, particularly autofluorescence imaging (AFI),5 narrow band imaging (NBI),6, 7 and tri-modal imaging,8 improve early cancer detection specificity and sensitivity over the existing “gold standard” white light imaging method.

In light of this work, researchers are now actively pursuing real-time hyperspectral imaging devices that can provide 10 to 100 spectral channels per image pixel.9, 10, 11 The hope is that the increase in spectral information will translate into even more accurate early diagnosis of cancer. However, specific drawbacks have limited their use as affordable real-time screening tools. The most common approaches are based on scanning in either the spectral or spatial domain. These serial acquisition systems can only collect a fraction of the full datacube at a single instant in time and therefore must trade off critical imaging parameters such as speed, image size, resolution, and/or signal-to-noise ratio. In addition, the hardware required for the precision scanning and subsequent reconstruction is often very expensive, limiting its accessibility for most clinical settings. For example, spectral scanning techniques that utilize liquid crystal or acousto-optic tunable filters have been used by various groups to acquire increased spectral bandwidth,12, 13, 14, 15, 16 but can take >23 s for complete data collection depending on the datacube size and, when combined with a sensitive camera system, can cost >$30K.

Recently, several new snapshot hyperspectral imaging approaches have been developed, which have the potential to overcome the challenges of scanning-based techniques by acquiring the entire (x, y, λ) datacube in a snapshot. These techniques include aperture splitting,17 field sampling,18, 19 computed tomography imaging spectrometry,20 and coded aperture snapshot spectral imaging.21, 22 To date, only aperture splitting and field sampling techniques have been shown to acquire and display the (x, y, λ) datacube in real-time, which is one of the main requirements for in vivo imaging. However, these techniques have other limitations including reduced optical throughput and/or resolution. Therefore, while techniques that enable simultaneous acquisition of the (x, y, λ) datacube are desirable for real-time imaging, the reduction in performance of the other imaging parameters has limited their usefulness for many in vivo applications.

Our group recently developed a new type of snapshot hyperspectral imaging technique called image mapping spectroscopy (IMS) that avoids many of these limitations.23, 24 It works by spatially redistributing (i.e., mapping) neighboring image zones to isolated regions on a CCD camera [see Figs. 1a, 1b]. The mapping is accomplished with an array of densely packed tiny mirror facets located at the field position. With this technique there is no compromise of optical throughput or spatial resolution, as the mirror facets are highly reflective and comparable in size to the incident point spread function. A prism [Fig. 1c] then spreads the spectral content from the mapped image into the surrounding void space on the CCD image sensor [see Fig. 1d]. In this way, spectral overlap is avoided, resulting in unambiguous (x, y, λ) information on the image sensor. The IMS technique provides a very efficient and direct method for mapping each voxel (volume element) of the datacube to a single 2D pixel (picture element) of the image sensor. Simple image remapping is sufficient to reconstruct the original object in real time. This paper presents, to the best of our knowledge for the first time, a real-time hyperspectral endoscope based on the snapshot IMS technology developed for microscopy. Adapting the IMS technology for real-time in vivo imaging required development in four main areas: 1. decreasing the overall system size for portability to the clinic and improved optical throughput, 2. increasing the spatial sampling of the IMS system to observe larger tissue regions, 3. increasing acquisition and display speeds for real-time (six frames per second) operation, and 4. developing a miniature optical probe for insertion into the instrument channel of a standard endoscope.

Fig. 1

Image mapping spectroscopy (IMS) concept.


2. Optical Design Concept

To achieve a more compact design with higher optical throughput and spatial sampling, a more efficient optical design is required than that of previous IMS systems, which incorporated a large beam expander between the pupils and underutilized the field-of-view of the collecting objective with a small image mapper. In previous designs, the beam expander was necessary for relaxing the design parameters of the system's components (primarily the image mapper); however, it was also responsible for reducing the throughput by over 50% and for the overall large system size of 24 in. × 24 in. × 12 in. (Ref. 25). For the endoscopy system, a new optimized design approach has been developed (shown in Fig. 2), which removes all unnecessary optical components (beam expanders) and maximizes the functionality of each component. The final design comprises seven primary components: 1. a miniature objective and coherent fiber optic bundle for insertion into the instrument channel of a standard endoscope, 2. a 33× image relay that transfers the image at the face of the fiber optic bundle onto the image mapper, 3. an image mapper that breaks apart the image into subimages containing selected mappings of the original image, 4. a collecting lens that captures the different subimages from the image mapper, creating an array of stops at the back pupil plane of the collecting lens, 5. a row of prisms for dispersing these subpupils, and 6. an array of lenses that forms dispersed subimages onto 7. an image sensor. The illumination system (not shown) is a 100 W halogen lamp with a fiber bundle.

Fig. 2

Image mapping spectrometer (IMS) endoscope optical design layout.


Discussion of the system's operation begins at the tissue (distal) side, where the illumination source provides light to an area of the tissue. Reflected, scattered, and/or fluorescent light from the tissue is collected and imaged through a miniature widefield objective onto the distal face of the coherent fiber optic bundle (Sumitomo P/N: IGN-08/30) and transferred to the proximal face. A doubly telecentric 33× image relay, composed of a 20× objective (Olympus P/N: UPLSAPO, f = 9 mm, NA = 0.75) and an achromatic doublet tube lens (Thorlabs P/N: AC508-300-A1, f = 300 mm, Dia. = 50.8 mm), magnifies and re-images the proximal fiber bundle face onto the image mapper. The image mapper is composed of an array of tiny mirror facets that reflect linear mappings of the image to different regions in the stop of the collecting objective (Olympus P/N: MVPLAPO, f = 90 mm, NA ∼ 0.189), shown by different ray colors on the left hand side of Fig. 2, creating an array of stops. A row of prisms then disperses the light in each stop in a direction orthogonal to the length of the linear mappings. A re-imaging lens array (Edmund Optics P/N: 49278, f = 20 mm, Dia. = 5 mm) behind each stop forms subimages containing dispersed linear segments of the original image. A 16 megapixel interline CCD camera (Imperx P/N: IPX-16M3-L) is used to record the final dispersed and mapped image. The camera uses the Kodak KAI-16000 detector, which has a format of 4872 × 3248 pixels, or a 36.1 × 24 mm detector area. A simple software remapping program sorts the subimages to construct the complete (x, y, λ) datacube of the object (i.e., tissue region). Since there is no scanning and only trivial image processing, the system can acquire and display spectral images in real time, limited only by the CCD camera readout speed (full frame = 3 frames per second (fps), 2× binning = 6 fps).
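
As a quick consistency check (simple arithmetic on the quoted focal lengths, not a calculation from the paper), the relay magnification and the resulting scale at the image mapper can be estimated as follows; the bundle-image numbers quoted in Sec. 5 follow directly from it.

```python
# Back-of-the-envelope check of the 33x relay (thin-lens, telecentric approximation assumed).
f_objective_mm = 9.0      # Olympus 20x objective, f = 9 mm (quoted above)
f_tube_mm = 300.0         # achromatic doublet tube lens, f = 300 mm (quoted above)

relay_mag = f_tube_mm / f_objective_mm                        # ~33.3x, matching the stated 33x relay
bundle_image_diam_mm = 24.0                                   # bundle-image diameter at the mapper (Sec. 5)
proximal_bundle_diam_mm = bundle_image_diam_mm / relay_mag    # ~0.72 mm at the proximal fiber face

fiber_spacing_at_mapper_um = 122.7                            # quoted in Sec. 5
fiber_spacing_at_face_um = fiber_spacing_at_mapper_um / relay_mag   # ~3.7 um between fiber cores

print(f"relay magnification      ~{relay_mag:.1f}x")
print(f"proximal bundle diameter ~{proximal_bundle_diam_mm:.2f} mm")
print(f"fiber spacing at face    ~{fiber_spacing_at_face_um:.1f} um")
```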

The image mapper plays a key role in the hyperspectral imaging properties of the IMS endoscope. The total number of mirror facets (M), the number of resolvable image points along the length of each facet (N), and the number of different tilt angles (L) determine the volume and dimensions of the acquired (x, y, λ) datacube. N is the total number of spatial data points in the x-dimension, given by the length of each facet divided by the width of the point-spread function at the mapper. M is the total number of spatial data points in the y-dimension, given by the number of distinct tilt angles (L) multiplied by the number of repeating blocks in the image mapper. Λ is the total number of spectral data points, which at the Nyquist sampling rate corresponds to L resolvable spectral bands (λ); for our system, Λ ≈ 2L. The resulting datacube size is then T = MNΛ. To generate the void regions that provide room for spectral dispersion, the image mapper is composed of repeated blocks of grouped facets with L tilt angles (shown in Fig. 3 with L = 4 for clarity). In the actual IMS endoscope, there are L = 24 tilt angles in each block, six x-direction and four y-direction tilts. Each facet redirects a part of the image within that block to a unique location in the pupil, labeled in Fig. 3 with the same number as its corresponding tilt angle. The block is then repeated down the length (x-axis in Fig. 3) of the image mapper.
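
The datacube bookkeeping above can be illustrated with the endoscope's own numbers; the following is a minimal sketch (arithmetic only, with the facet-to-sample mapping simplified), not part of the original design calculation.

```python
# Datacube size T = M * N * Lambda from the image-mapper parameters described above.
L = 24                        # distinct facet tilt combinations per repeating block
Lambda_ = 2 * L               # ~2L spectral data points (Nyquist sampled), i.e. 48 channels
N = 350                       # spatial samples along the mapped lines (x), designed value
M = 350                       # spatial samples across the facets (y), designed value

T = M * N * Lambda_           # total number of voxels in the (x, y, lambda) datacube
sensor_pixels = 4872 * 3248   # Kodak KAI-16000 format quoted in Sec. 2

print(f"datacube voxels  : {T:,}")                   # 5,880,000
print(f"sensor pixels    : {sensor_pixels:,}")       # ~15.8 Mpixel
print(f"pixels per voxel : {sensor_pixels / T:.1f}") # ~2.7 sensor pixels available per voxel
```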

Fig. 3

Diagram showing the role of the image mapper in the system as it relates to the final datacube size. See text for detail.


The design is also telecentric in the optical space around the image mapper, which is important for the mapping. The chief rays reflected by facets of a given tilt all share the same reflection angle. After passing through the collecting lens, light associated with these chief rays enters the pupil at a location corresponding to that specific facet tilt. A prism then disperses the light from each facet in the orthogonal direction, into the void regions between neighboring mapped lines (the prism is depicted as the blue wedge in Fig. 3). Each re-imaging lens is dedicated to only one tilt angle and images only the lines from that tilt angle. The image is thus efficiently redistributed for spectral separation without loss of light. The IMS endoscope is designed to achieve an (x, y, λ) datacube of 350 × 350 × 48 samples; however, due to undersampling by the coherent fiber bundle, the system's effective sampling is limited to 200 × 200 × 48. This corresponds to 100-μm spatial resolution over a 10-mm tissue field (closest conjugate plane). This resolution was chosen to resolve tissue vascularization, a critical feature for early cancer detection. Table 1 lists the system specifications.
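
The spatial numbers above are tied together by simple Nyquist arithmetic; the short sketch below is illustrative and not taken from the paper.

```python
# Effective spatial sampling of the endoscope (values quoted above and in Table 1).
field_of_view_mm = 10.0
effective_samples = 200            # per axis, limited by the coherent fiber bundle

sampling_um = field_of_view_mm * 1000 / effective_samples    # 50 um between samples at the tissue
nyquist_resolution_um = 2 * sampling_um                      # 100 um, as stated in the text

print(f"object-side sampling       ~{sampling_um:.0f} um")
print(f"Nyquist-limited resolution ~{nyquist_resolution_um:.0f} um")
```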

Fig. 17

Datacube acquired using the IMS endoscope from the lower lip of a normal human volunteer. (a) Color composite image, (b) spectra from vein and no-vein regions, and (c) nine out of 45 spectral channel images. Datacubes were acquired with a 143 ms integration time using the full 12-bit dynamic range.


Table 1

Design requirements for the real-time IMS endoscope system.

Description         | Requirement
Spectral range      | 450 to 650 nm
Spectral bands      | 48
Image size          | 200 × 200
Frame rate          | 8 to 10 fps
Field of view       | 10 mm
Spatial resolution  | 100 μm
Spectral resolution | 4 to 10 nm

The spectral range of the IMS endoscope covers most of the visible spectrum (450 to 650 nm), which includes important spectral features from endogenous tissue fluorophores (NADH, FAD, collagen, etc.) as well as oxy- and deoxy-hemoglobin absorption and scattering. The spectral resolution was chosen to be between 4 and 10 nm, adequate for distinguishing these tissue features. Note that due to the use of prisms, the spectral resolution varies across the spectral range, with the highest resolution in the blue region. For the initial prototype, 48 spectral bands were chosen for snapshot acquisition over a ∼96 nm spectral window, corresponding to 4 nm resolution. This measurement window can be adjusted by inserting different bandpass filters into the system, as long as it remains within the designed spectral range. This is a unique advantage over other multispectral or snapshot techniques, allowing the system to be tuned for specific contrast agents within the tissue.
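
The same kind of check applies to the spectral axis (again a sketch, not text from the paper): 48 channels across a ∼96 nm window give ∼2 nm sampling, i.e., ∼4 nm resolution at the Nyquist limit, consistent with the 4 to 10 nm requirement in Table 1.

```python
# Spectral sampling of the prototype measurement window.
window_nm = 96.0
channels = 48

sampling_nm = window_nm / channels          # ~2 nm per spectral channel
nyquist_resolution_nm = 2 * sampling_nm     # ~4 nm, the best-case (blue-end) resolution

print(f"spectral sampling   ~{sampling_nm:.1f} nm/channel")
print(f"spectral resolution ~{nyquist_resolution_nm:.1f} nm (Nyquist)")
```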

3. Prism and Lens Optical Design

The IMS endoscope uses off-the-shelf components with proprietary lens prescriptions for the objective lens, image relay, and collecting lens optics. The exceptions are the image mapper and the prism/lens array. For the custom prism/lens array, optical modeling software (ZEMAX) was used to achieve diffraction-limited performance. Before modeling the system, three different dispersing approaches were explored: a diffraction grating, a single prism, and a double Amici prism. The double Amici prism design was chosen as the best approach because it could achieve the right range of dispersion angles while removing the central deviation angle, which is important for maintaining a common perpendicular image plane for multiple optical systems that share the same image detector. The disadvantage of the double Amici design is that the prisms are much thicker than either the diffraction grating or single prism, making it more difficult to keep the system compact. This is especially important because all the subimages must fit on the image sensor, which has dimensions of 24 × 36 mm. To address this issue, the prism was broken into six smaller prisms that could be stacked on top of each other to cover the full array of stops in the collecting objective. The final design of the system is shown in Fig. 4.

Fig. 4

Prism/lens array design layouts and performance metrics.


The prism assembly is composed of 12 custom Amici prisms fabricated by Tower Optical, Inc. The prisms are placed back-to-back with a stop array between them, as shown in Fig. 4d. The re-imaging lens array is composed of 24 achromatic doublets (Edmund Optics P/N: 45-408) placed 4.376 mm behind the prisms. A cross-section profile in the y-z plane of the design is shown in Fig. 4a. The final subimage is 5.33 mm in diameter with a fiber spacing of 27.3 μm and a fiber core diameter of 16.4 μm. Each linear mapping on the CCD camera has dimensions of 16.7 μm × 5.33 mm. The linear mappings are sampled slightly above the Nyquist criterion by the 7.4 × 7.4 μm pixels of the CCD. The individual stop diameters are 2.02 mm, neglecting diffraction effects, and the stops are separated by 5.6 mm. The width of the individual prisms (along the y-axis), the diameter of the lenses, and the size of the final image were designed to be less than this stop spacing to prevent overlap and/or vignetting in the subimages. All subimages and their spacings reside within a 33.33 × 22.13 mm region, which fits within the CCD detector area. The prism and lens array are diffraction limited across the visible spectrum, as shown in the modulation transfer function [Fig. 4c] for three field positions (0, −2.8 mm, +2.8 mm) at one (the F line) of the three wavelengths (F = 486.1, d = 587.6, and C = 656.3 nm) commonly used for visible color correction. Figure 4d and Table 2 provide a more detailed layout and accompanying lens prescription for the design. The input parameters used for the design were an entrance pupil size of 2 mm and an angular field-of-view of ±7.64°, matching the expected stop array parameters created by the preceding collecting objective and 33× image relay.

Table 2

Lens prescription for prism/lens array. (* refers to tilted surfaces.)

Surf | Comment        | Radius   | Thickness | Glass    | CA   | Y tangent*
1*   | Amici Prism #1 | Infinity | 2.310     | F_SILICA | 2.84 | 0.577350
2*   |                | Infinity | 1.155     | N-SF57   | 3.60 | −0.577350
3    |                | Infinity | 0.100     |          | 3.59 |
STOP |                | Infinity | 0.100     |          | 3.57 |
5    | Amici Prism #2 | Infinity | 1.155     | N-SF57   | 3.55 |
5*   |                | Infinity | 2.310     | F_SILICA | 3.55 | 0.577350
6*   |                | Infinity | 4.374     |          | 2.80 | −0.577350
7    | Lens #1        | 75.590   | 1.030     | SF10     | 4.12 |
8    |                | 8.630    | 1.610     | BAFN10   | 4.36 |
9    |                | −14.110  | 19.712    |          | 2.30 |
IMA  |                | Infinity |           |          | 5.33 |

4. Miniature Objective

For initial experiments, a miniature GRIN lens (Grintech P/N: GT-IFRL-100-010-50-NC) was used to image samples onto the fiber bundle. The GRIN lens has a focal length f = 0.92 mm, diameter d = 1.0 mm, and image-side NA of 0.5, making it compatible with most endoscope instrument channels. A ZEMAX optical model of the GRIN lens was provided by Grintech for evaluation of its performance. Figure 5a shows the optical layout for the lens at its designed conjugate distance of 10 mm, corresponding to a field-of-view of 8.4 mm. At this field of view, the spacing between the fibers within the bundle provides 43 μm sampling of the object, corresponding to a resolution of 86 μm based on the Nyquist criterion. The simulated performance of the lens indicates around 6% barrel distortion, as shown in Fig. 5b. The errors in image mapping introduced by this barrel distortion can be corrected by the software remapping matrix described later in this paper. The predicted polychromatic modulation transfer function (MTF) plots [Fig. 5c] indicate a significant reduction in image contrast as the spatial frequency increases. For comparison, the top curve of the plot is the diffraction-limited MTF, while the system's MTF curves for the three commonly used field positions (on-axis, 0.707 field, and full field) are shown as colored lines.
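
The quoted 43 μm object sampling and 86 μm resolution follow from the GRIN-lens field of view and the number of fibers spanning the bundle. The sketch below infers that fiber count (~195) from the bundle-image numbers given in Sec. 5; it is a consistency check, not a value stated in this section.

```python
# Object-side sampling of the GRIN lens + coherent fiber bundle.
fov_mm = 8.4                                 # field of view at the 10-mm working distance
fibers_across_diameter = 24.0e3 / 122.7      # ~195, inferred from the Sec. 5 bundle-image numbers

object_sampling_um = fov_mm * 1000 / fibers_across_diameter    # ~43 um between fiber samples
nyquist_resolution_um = 2 * object_sampling_um                 # ~86 um, as quoted above

print(f"fibers across bundle  ~{fibers_across_diameter:.0f}")
print(f"object sampling       ~{object_sampling_um:.0f} um")
print(f"resolution (Nyquist)  ~{nyquist_resolution_um:.0f} um")
```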

Fig. 5

IMS endoscope miniature GRIN lens: (a) design, (b) and (c) performance metrics, (d) prototype, and (e) imaging result.


A prototype of the GRIN lens/coherent fiber bundle assembly was constructed and used to image a 1951 USAF resolution target at its optimum working distance of 10 mm, as shown in Figs. 5d, 5e. The illumination source was a tungsten halogen lamp (100 W) with a ground glass diffuser. The field of view (FOV) of the system was measured by comparing the known size of the bars in the image to the overall image size, giving ∼7 mm, which is slightly less than the expected value of 8.4 mm. Position inaccuracy of the target with respect to the GRIN lens was probably the largest source of error in this measurement. At this conjugate position, the smallest resolvable line pair was group 4, element 1 [Fig. 5e], corresponding to 16 line pairs per millimeter or 62.5 μm resolution at the object plane. This is slightly better than the designed resolution of 86 μm, most likely due to the closer conjugate position. The ability to resolve these small bars is encouraging given the large image aberrations expected from the GRIN lens. In the future, we plan to develop a custom diffraction-limited miniature objective for detection of the small, low-contrast features within tissue that would be obscured by the lower quality GRIN lens.
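
The 62.5 μm figure quoted for group 4, element 1 follows from the standard 1951 USAF target formula; the helper below applies that general formula and is not specific to this system.

```python
# Resolution of a 1951 USAF target element: lp/mm = 2 ** (group + (element - 1) / 6).
def usaf_line_pair_um(group: int, element: int) -> float:
    """Return the line-pair period (in micrometers) of a USAF-1951 target element."""
    lp_per_mm = 2 ** (group + (element - 1) / 6)
    return 1000.0 / lp_per_mm

# Group 4, element 1 -> 16 lp/mm -> 62.5 um per line pair, as quoted above.
print(f"{usaf_line_pair_um(4, 1):.1f} um")
```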

5. Image Mapper

By removing the beam expander between the pupils used in the previous IMS designs, larger tilt angles are needed on the image mapper to achieve the necessary pupil spacing. This requires a new design approach for the image mapper, since the previous monolithic design and fabrication approach could not achieve these tilts without exceeding the 300-μm depth-of-cut limitation of the diamond tools. To overcome this limitation, the image mapper was broken into three smaller segments, each 24 mm long and 8.67 mm wide. Each segment is composed of 350 mirror facets. The three segments are bolted together to create a single image mapper that is 24 mm high and 26 mm long [Fig. 6a]. To reduce the possibility of alignment errors between the individual segments, they are fabricated at the same time using a diamond raster fly cutting process [Fig. 6b]. The fabrication method is described in more detail in our previous publication.25 For this application, the individual mirror facets within each segment are 70 μm wide and 8.67 mm long. This size was chosen to provide close to Nyquist sampling of the intermediate image of the fiber bundle. The fiber bundle image has a diameter of 24 mm with a fiber spacing of 122.7 μm and a fiber core size of 73.7 μm. In addition, the point spread function on the image mapper from the fiber bundle is 67 μm, which is also less than the width of the individual mirror facets. The image mapper is designed to be positioned at a 20° angle to the incident image, providing an angular separation of 40° between the incident and reflected paths; this is sufficient displacement for the large collecting lens to avoid vignetting the beam incident on the mapper. The reflected beam tilt angle is also small enough to keep the image mapper within the depth of field of the incident image. The image mapper's axial position range due to the 20° tilt is ±4.44 mm with respect to the on-axis focal plane, while the allowable depth of field is ±5.84 mm, based on the geometric minimum blur diameter being less than or equal to the width of an individual mirror facet (70 μm). The four x-tilts of the image mapper facets are ±0.015 and ±0.045 radians, while the six y-tilts are ±0.015, ±0.045, and ±0.075 radians. There are 24 unique combinations of these tilts, which are used to create a 4 × 6 stop array in the collecting objective. As mentioned earlier, this arrangement of facet tilts is repeated down the height of the image mapper, covering the entire surface area. A picture of a finished segment of the image mapper is shown in Fig. 6c next to a U.S. quarter for size comparison.
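
A minimal sketch of how the 24 facet tilt combinations fill the stop array behind the collecting objective is given below; only the tilt values are taken from the text, while the block layout and ordering are illustrative assumptions.

```python
# Enumerate the 24 unique (x, y) facet tilt combinations described above.
from itertools import product

x_tilts_rad = (-0.045, -0.015, +0.015, +0.045)                  # the four x-tilts quoted above
y_tilts_rad = (-0.075, -0.045, -0.015, +0.015, +0.045, +0.075)  # the six y-tilts quoted above

tilt_combinations = list(product(x_tilts_rad, y_tilts_rad))     # 4 x 6 = 24 sub-pupil positions
assert len(tilt_combinations) == 24

# Each combination steers one facet's light to one sub-pupil; one repeating block of 24 facets
# therefore fills the 4 x 6 stop array in the collecting objective.
for i, (tx, ty) in enumerate(tilt_combinations):
    print(f"facet tilt #{i:2d}: x = {tx:+.3f} rad, y = {ty:+.3f} rad")
```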

Fig. 6

Image mapper (a) design, (b) fabrication process, and (c) prototype segment.


6. Prototype

A prototype of the IMS endoscope has been assembled, as shown in Fig. 7. The system resides on a portable optical breadboard measuring 12 in. × 24 in. The green line in Fig. 7a traces the path of the incoming light from the GRIN lens to the image mapper. A reference CCD camera is incorporated in the system using a 92:8 beam splitter. A filter wheel between the 33× image relay lenses has been added to the system to select different spectral windows for measurement.

Fig. 7

Assembled IMS endoscope system prototype (a) full system and (b) with cover. Close up images of (c) image mapper, (d) lens array, and (e) rows of Amici prisms.


The multicolored paths after the image mapper show the possible (y-axis) paths the light may travel after reflection, depending on the facet tilt geometry. Figure 7b shows the prototype with the cover on for blocking ambient light; the system can operate in normal lighting conditions. Figure 7c displays a close up of the three-segment image mapper comprising 1050 mirror facets. The three segments are identical to each other, creating three repeating ramp profiles across the face of the image mapper. Figures 7d, 7e display close ups of the rows of prisms and the lens array assembly.

7. Testing and Calibration

7.1. Spectral Calibration

The spectral range and resolution of the IMS endoscope were calibrated using a liquid crystal tunable filter (LCTF, Cambridge Research Instruments) placed in front of a broadband 100 W halogen lamp. Narrowband light from this setup was projected onto a ground glass diffuser to create uniform illumination in the field-of-view of the IMS system. At every ∼2 nm spectral step, a raw image was recorded on the CCD camera and the relative change in the image position (number of pixels moved) was recorded. The spectral position of the LCTF was independently verified using an Ocean Optics spectrometer (P/N: USB4000) during the experiment. Figure 8 shows the results of this experiment compared to the designed dispersion curve of the system. Note that the IMS system is capable of simultaneously recording any 48 consecutive spectral channels (i.e., pixels) within the spectral range, and the selection of channels can be easily adjusted by inserting different bandpass filters into the system.
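
The analysis implied by this calibration can be sketched as follows: record the line position at each LCTF step and fit a smooth dispersion curve that maps pixel shift to wavelength. The measurement values, polynomial model, and NumPy usage below are illustrative assumptions; the paper does not specify its fitting procedure.

```python
# Fit a dispersion curve (wavelength vs. pixel shift) from LCTF calibration data.
import numpy as np

# Hypothetical measurements: wavelengths set on the LCTF (verified with a spectrometer)
# and the corresponding shift of the mapped lines on the CCD, in pixels.
wavelengths_nm = np.array([530.0, 540.0, 550.0, 560.0, 570.0, 580.0])
pixel_shift = np.array([0.0, 5.1, 9.9, 14.5, 18.9, 23.1])

# Prism dispersion is nonlinear, so use a low-order polynomial rather than a straight line.
coeffs = np.polyfit(pixel_shift, wavelengths_nm, deg=2)
pixel_to_wavelength = np.poly1d(coeffs)

# Wavelength assigned to each of the 48 spectral pixels of a sub-image line.
channel_wavelengths = pixel_to_wavelength(np.arange(48))
print(np.round(channel_wavelengths[:5], 1))
```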

Fig. 8

Comparison of predicted and measured wavelength to pixel shift for the IMS endoscope system.


7.2. Mapping Algorithm

The same setup was used to create the mapping matrices that transform the 2D raw data into an (x, y, λ) datacube, as previously described in Ref. 24. Briefly, monochromatic images were recorded on the CCD camera [see Fig. 9a] and bright pixels were thresholded. One-pixel-wide lines were fitted to these thresholded pixels, each line representing light from a single 70-μm-wide mirror facet of the image mapper [see Fig. 9b]. Knowing the periodic pattern of the image mapper facet angles, the fitted lines in the raw data can be reshuffled into a single monochromatic image, as shown in Fig. 9c; the result of this process is a lookup table that provides raw data coordinates for every voxel in the reconstructed datacube. This procedure was repeated using monochromatic images throughout the spectral range of 450 to 720 nm.
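
A simplified sketch of the lookup-table construction described above (threshold a monochromatic frame, fit a one-pixel-wide line per facet, and record where each line lands in the raw image) is shown below. The array shapes, search-window input, and function name are illustrative assumptions, not the implementation used in the paper.

```python
# Build a lookup table that maps each reconstructed image line back to raw CCD coordinates.
import numpy as np

def build_lookup_table(mono_frame, facet_windows, line_length):
    """Locate the bright line produced by each mirror facet in a narrowband calibration frame.

    mono_frame   : 2D raw CCD image under monochromatic, spatially uniform illumination
    facet_windows: per facet, a (row_start, row_stop, col_start) search window derived from the
                   known periodic tilt pattern of the image mapper (assumed to be given)
    line_length  : number of CCD columns spanned by one mapped line (x samples)

    Returns an int array of shape (n_facets, line_length, 2) holding (row, col) raw coordinates.
    """
    bright = mono_frame > np.percentile(mono_frame, 95)           # threshold bright pixels
    lookup = np.zeros((len(facet_windows), line_length, 2), dtype=np.int32)

    for facet, (r0, r1, c0) in enumerate(facet_windows):
        window = bright[r0:r1, c0:c0 + line_length]
        best_row = r0 + int(np.argmax(window.sum(axis=1)))        # 1-pixel-wide line position
        lookup[facet, :, 0] = best_row
        lookup[facet, :, 1] = np.arange(c0, c0 + line_length)
    return lookup
```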

Fig. 9

Demonstration of how the raw data (a) is remapped one line (b) at a time to form a single monochromatic image (c) within the simultaneously acquired (x, y, λ) datacube. This process is repeated for each wavelength within the datacube.


To account for small mapping errors between the subimages in the system, a Ronchi ruling was used to vertically correct image lines from each mirror facet. A reconstructed image of the Ronchi ruling was recorded and an edge-detection program was run to determine the misalignment of each image line in terms of pixel shift. The mapping matrix was then updated to include this offset. Figure 10 demonstrates this procedure for an actual image of the Ronchi ruling. The yellow arrows in Fig. 10 indicate the direction of the pixel shifts.

Fig. 10

Depiction of the procedure for correcting small mapping errors in the IMS system. (a) Close up of six subimages in the IMS system. (b) The incorrect remapped image of a Ronchi ruling. (c) The corrected remapped image of a Ronchi ruling.


Once the correct mapping matrices were determined for each wavelength, a lookup table containing indexes into the raw data was created for the entire datacube. Simple indexing and reshaping operations allow the 2D raw data from the CCD to be transformed into a datacube in real time. Using the custom LabVIEW program, the datacube can be calibrated for irradiance (by imaging a target of known reflectance with a broadband source), background signals can be subtracted, signals can be averaged or binned, and the spectral band can be shifted, all simultaneously, with the speed limited only by the frame rate of the camera.
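
A minimal sketch of this real-time reconstruction step is given below: once the per-voxel index tables exist, building the datacube reduces to a single array-gather operation. The names, array shapes, and NumPy implementation are illustrative assumptions and not the LabVIEW program used in the paper.

```python
# Reconstruct the (x, y, lambda) datacube from a raw CCD frame using precomputed index tables.
import numpy as np

def remap_to_datacube(raw_frame, row_idx, col_idx, white_cube=None, dark_cube=None):
    """raw_frame          : 2D raw image from the CCD
    row_idx, col_idx      : int arrays of shape (ny, nx, n_bands) giving, for every voxel, the
                            raw-image pixel that holds its value (from the calibration step)
    white_cube, dark_cube : optional reference cubes for irradiance calibration / background
    """
    cube = raw_frame[row_idx, col_idx].astype(np.float32)   # one gather -> full datacube

    if dark_cube is not None:                                # background subtraction
        cube -= dark_cube
    if white_cube is not None:                               # normalize to a known-reflectance target
        cube /= np.clip(white_cube, 1e-6, None)
    return cube

# Example with dummy data: a 200 x 200 x 48 cube gathered from a 16-Mpixel frame.
rng = np.random.default_rng(0)
raw = rng.random((3248, 4872), dtype=np.float32)
rows = rng.integers(0, 3248, size=(200, 200, 48))
cols = rng.integers(0, 4872, size=(200, 200, 48))
print(remap_to_datacube(raw, rows, cols).shape)              # (200, 200, 48)
```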

The power of this technique is that the full (x, y, λ) datacube can be unambiguously recorded at a single instant in time. Figure 11 shows this principle in practice. Figures 11a, 11b show a raw image of a monochromatic, spatially uniform source with a close up of one of the 24 subimages. Figures 11c, 11d show the polychromatic version (527 to 587 nm), in which all 48 spectral images are being simultaneously acquired. This image was taken of a white sheet of lens paper; the horizontal stripes in the images are due to sampling of the fiber bundle pattern, as shown in Fig. 11e. The illumination was provided by a 100 W halogen lamp. A bandpass filter (Chroma D557/60 M) was used to provide a spectral window of about 60 nm centered at 557 nm. Notice that there is no overlap of light between the separate dispersed lines. Image artifacts apparent in the column of subimages on the far right side of Fig. 11 are due to a fabrication error in the image mapper that will be corrected in the future.

Fig. 11

Raw monochromatic image from the IMS endoscope showing (a) all 24 subimages and (b) a close up of a single subimage. Polychromatic image with (c) all 24 subimages and (d) close up of a single polychromatic subimage. (e) A close up of a single line within a monochromatic subimage shows the individual fibers within the multifiber bundle.


To verify correct spatial remapping, spatial resolution, and intensity calibration, images of a 1951 USAF resolution target were taken with the endoscope portion removed from the system. Figure 12a shows the raw data acquired from the CCD camera. Figures 12b, 12c, 12d, 12e are four representative spectral images out of the 48 simultaneously acquired and displayed in real time with the IMS endoscope. The spectral channel wavelengths for these images are 584, 567, 549, and 532 nm, respectively. The acquisition rate was 5.2 fps, which is the maximum readout rate for the camera with vertical binning. Using the LCTF and other test targets, the spectral calibration was verified by comparing the measured spectra of pixels throughout the datacube with the readout of an Ocean Optics spectrometer.

Fig. 12

(a) Raw image of a USAF resolution target acquired with the IMS endoscope. (b)–(e) Four out of 48 remapped images for the wavelengths 584, 567, 549, and 532 nm.


8. Biological Images

To evaluate the biological imaging performance of the IMS endoscope, the system was used to image the vasculature of the lower lip of a normal volunteer. The lip was illuminated by a 100 W halogen lamp through a fiber optic bundle. The irradiance level at the tissue was measured to be 41.8 mW/cm2. An intensity calibration of the source was performed prior to tissue imaging by imaging a white piece of paper. The fiber optic tip of the IMS endoscope was placed at ∼10 mm from the tissue site. Figure 13 shows the experimental setup for imaging.

Fig. 13

Experimental setup for imaging the tissue vascularization of the lower lip of a normal volunteer.


A green filter (Chroma D557/60 M) with a spectral window from 527 to 587 nm was placed into the IMS system for this experiment. The size of the datacube (x, y, λ) acquired with this filter is 350 × 350 × 29. The limitation on the spectral samples is due to the bandwidth of the filter (only 60 nm). To fully utilize the spectral range of the system at this region would require a custom filter, which was not available at the time. The IMS endoscope acquired datacubes at a speed of 5.2 fps corresponding to an integration time of 192 ms. The dynamic range of the datacubes was 12-bit at a gain setting of 6 dB. The custom LabVIEW software remapped and displayed the datacubes at a rate of 5.2 fps. Figure 14a shows one of the 29 spectral images (wavelength 546 nm) within a single acquired datacube.

Fig. 14

Lower lip vasculature imaging results from a normal volunteer using the IMS endoscope. (a) One of the 29 spectral images acquired (546 nm band). (b) Reference image taken with the color CCD camera. (c) Spectral curves from an area in the image where there is a vein (solid line) and no vein (dashed line).


As one can see, the image quality is slightly reduced by intensity artifacts (periodic lines through the image) caused by fabrication defects in the image mapper. Ongoing research is being carried out to reduce or eliminate these effects in future systems. However, important tissue architectural features, such as vasculature patterns, can still be observed in the images and have contrast similar to that of images acquired with the reference color CCD camera (Lumenera Infinity II). Figure 14c shows spectral curves from two regions within the datacube. The solid line is taken from a region in the datacube where there is a vein [left red box in Fig. 14a] and the dashed line is taken from another region where there is no vein [right red box in Fig. 14a]. The dominant features within these spectral curves correspond to the absorption peaks of oxy-hemoglobin at 542 and 576 nm.
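
Spectra like those in Fig. 14c can be pulled from the reconstructed datacube by averaging pixels inside a region of interest; the sketch below uses dummy data and illustrative ROI coordinates, since the paper's own analysis was performed in the custom LabVIEW program.

```python
# Extract mean spectra from two regions of interest in a (y, x, lambda) datacube.
import numpy as np

def roi_spectrum(cube, row_slice, col_slice):
    """Average the spectra of all pixels inside a rectangular region of interest."""
    return cube[row_slice, col_slice, :].mean(axis=(0, 1))

# Dummy 200 x 200 x 29 cube standing in for the 527-587 nm measurement.
cube = np.random.default_rng(1).random((200, 200, 29))
wavelengths_nm = np.linspace(527, 587, 29)

vein = roi_spectrum(cube, slice(80, 90), slice(60, 70))        # region over a visible vessel
no_vein = roi_spectrum(cube, slice(80, 90), slice(120, 130))   # nearby vessel-free region

# In real data the vein spectrum shows stronger absorption near the oxy-hemoglobin peaks (542, 576 nm).
for lam, v, b in zip(wavelengths_nm[::7], vein[::7], no_vein[::7]):
    print(f"{lam:5.1f} nm  vein={v:.3f}  no-vein={b:.3f}")
```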

The IMS endoscope system was then assembled in a more robust enclosure for field/clinical testing, as shown in Fig. 15. The spectral range was also extended for additional functionality. The fiber optic imaging end was inserted into the instrument channel of a standard Pentax upper GI endoscope, as shown in Fig. 15a. Illumination at the tissue site was provided by the Pentax light guides, which use a broadband mercury lamp as their light source.

Fig. 15

(a) A more robust IMS endoscope for use in a clinical setting. (b) Miniature imaging end of the IMS endoscope at the end of the Pentax endoscope. (c) Fiber optics of the IMS endoscope inserted into the instrument channel of the Pentax system.


The variation in the irradiance profile delivered by the light guides to the tissue site was measured by illuminating a Spectralon 99% reflectance standard (Labsphere) and measuring the cross-section profile from an image acquired using the internal camera of the Pentax endoscope, as shown in Fig. 16a. The length of the cross section was set to 7 mm, matching the diameter of the FOV of the IMS endoscope's GRIN lens. The irradiance variation was found to be <1%. For reflectance measurements, the datacubes acquired by the IMS endoscope are normalized against the reflectance standard by averaging 10 consecutively acquired images.
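
The normalization step described above can be sketched as follows: average the frames of the Spectralon standard and divide the tissue datacube by the result. The optional dark-frame term and all names below are illustrative assumptions rather than the paper's exact procedure.

```python
# Normalize tissue datacubes against an averaged 99% reflectance standard.
import numpy as np

def reflectance_normalize(sample_cube, white_cubes, dark_cube=None, standard_reflectance=0.99):
    """sample_cube : (y, x, lambda) datacube of the tissue
    white_cubes    : sequence of datacubes of the Spectralon standard (e.g., 10 consecutive frames)
    dark_cube      : optional datacube acquired with the illumination blocked
    """
    white = np.mean(np.stack(white_cubes), axis=0)          # average the reference frames
    if dark_cube is not None:
        sample_cube = sample_cube - dark_cube
        white = white - dark_cube
    return standard_reflectance * sample_cube / np.clip(white, 1e-6, None)

# Usage with dummy data: ten white-standard frames and one tissue frame.
rng = np.random.default_rng(2)
whites = [0.8 + 0.01 * rng.random((200, 200, 45)) for _ in range(10)]
tissue = 0.4 * rng.random((200, 200, 45))
print(reflectance_normalize(tissue, whites).shape)           # (200, 200, 45)
```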

Fig. 16

Images of the IMS endoscope taken by the Pentax endoscope's camera: (a) 99% reflectance standard and (b) lower lip of a normal human volunteer.


After normalization of the datacubes, the fiber optic imaging tip of the IMS endoscope was placed near the lower lip of a normal human volunteer as shown in Fig. 16b.

A typical datacube of the human volunteer's lower lip taken with the IMS endoscope is shown in Fig. 17. The top left image is a color composite constructed from the 45 spectral channel images. Two spectral curves are shown to the right of this image, indicating the measured spectrum from a region with a vein and one without a vein. A subset of nine of the 45 acquired spectral channel images is shown below. The datacube was acquired, remapped, and displayed at the top speed of the camera with an integration time of 143 ms and the full 12-bit dynamic range. The brightness of the light source from the Pentax endoscope was set to a medium setting.

9. Discussion and Conclusion

To the best of our knowledge, we have demonstrated, for the first time, an endoscopic imaging device that is capable of collecting spectral and spatial information in a single snapshot. These snapshots can be acquired, remapped, and displayed at 5.2 fps, providing real-time video. This device overcomes many limitations of other scanning or snapshot techniques with its high sensitivity, fast acquisition and display speed, and high spatial/spectral image contrast and resolution, making detection of low signals in dynamic environments possible. Further biological testing of the system is now required to determine its limits for distinguishing early cancer biomarkers in both reflectance and fluorescence modes. For fluorescence detection of biomarkers such as FAD and NADH, some hardware development is still required on the illumination path; in particular, a custom filter assembly for the tissue end of the endoscope. However, most of the future work will focus on software/algorithm development to process and extract as much useful information from the tissue site as possible. We are encouraged by the initial results and believe that the IMS endoscope will provide researchers and clinicians with a valuable tool for better understanding and/or identifying tissue diseases, like cancer, very early in their development. We are also exploring the combination of the IMS endoscope with a microscopic detection technique for screening and diagnosing cancer within a single procedure, although work toward this dual-mode system is still at an early stage.

Acknowledgments

The authors would like to thank NIH for their funding support of this project through Grant No. R01 CA124319 entitled “Integrated Bi-FOV endoscope for detection of Precancer” and Grant No. R21EB009186 entitled “Image Slicing Spectrometer (ISS) for high resolution sub-cellular microscopy.”

References

2. 

R. Drezek, C. Brookner, I. Pavlova, I. Boiko, A. Malpica, R. Lotan, M. Follen, and R. Richards-Kortum, “Autofluorescence microscopy of fresh cervical-tissue sections reveals alterations in tissue biochemistry with dysplasia,” Photochem. Photobiol., 73 636 –641 (2001). https://doi.org/10.1562/0031-8655(2001)0730636AMOFCT2.0.CO2 Google Scholar

3. 

Y. Wu, W. Zheng, and J. Y. Qu, “Time-resolved confocal fluorescence spectroscopy reveals the structure and metabolic state of epithelial tissue,” Proc. SPIE, 6430 643013 (2007). https://doi.org/10.1117/12.702450 Google Scholar

4. 

G. Zonios, L. T. Perelman, V. Backman, R. Manoharan, M. Fitzmaurice, J. Van Dam, and M. S. Feld, “Diffuse reflectance spectroscopy of human adenomatous colon polyps in vivo,” Appl. Opt., 38 6628 –6637 (1999). https://doi.org/10.1364/AO.38.006628 Google Scholar

5. 

P. M. Lane, T. Gilhuly, P. Whitehead, H. Zeng, C. F. Poh, S. Ng, P. M. Williams, L. Zhang, M. P. Rosin, and C. E. MacAulay, “Simple device for the direct visualization of oral-cavity tissue fluorescence,” J. Biomed. Opt., 11 024006 (2006). https://doi.org/10.1117/1.2193157 Google Scholar

6. 

P. Sharma, A. Bansal, S. Mathur, S. Wani, R. Cherian, D. McGregor, A. Higbee, S. Hall, and A. Weston, “The utility of a novel narrow band imaging endoscopy system in patients with Barrett's esophagus,” Gastrointest. Endosc., 64 167 –175 (2006). https://doi.org/10.1016/j.gie.2005.10.044 Google Scholar

7. 

M. A. Kara, F. P. Peters, P. Fockens, F. J. W. ten Kate, and J. J. G. H. M. Bergman, “Endoscopic video-autofluorescence imaging followed by narrow band imaging for detecting early neoplasia in Barrett's esophagus,” Gastrointest. Endosc., 64 176 –185 (2006). https://doi.org/10.1016/j.gie.2005.11.050 Google Scholar

8. 

W. L. Curvers, R. Singh, L. M. Wong-Kee Song, H. C. Wolfsen, K. Ragunath, K. Wang, M. B. Wallace, P. Fockens, and J. J. G. H. M. Bergman, “Endoscopic tri-modal imaging for detection of early neoplasia in Barrett's oesophagus: a multi-centre feasibility study using high resolution endoscopy, autofluorescence imaging, and narrow band imaging,” Gut, 57 167 –172 (2008). https://doi.org/10.1136/gut.2007.134213 Google Scholar

9. 

H. Akbari, K. Uto, Y. Kosugi, K. Kojima, and N. Tanaka, “Cancer detection using infrared hyperspectral imaging,” Cancer Sci., 1349 –7006 (2011). https://doi.org/10.1111/j.1349-7006.2011.01849.x Google Scholar

10. 

G. M. Palmer, A. N. Fontanella, G. Zhang, G. Hanna, C. L. Fraser, and M. W. Dewhirst, “Optical imaging of tumor hypoxia dynamics,” J. Biomed. Opt., 15 066021 (2010). https://doi.org/10.1117/1.3523363 Google Scholar

11. 

A. M. Siddiqi, H. Li, F. Faruque, W. Williams, K. Lai, M. Hughson, S. Bigler, J. Beach, and W. Johnson, “Use of hyperspectral imaging to distinguish normal, precancerous, and cancerous cells,” Cancer, 114 (1), 13 –21 (2008). https://doi.org/10.1002/cncr.23286 Google Scholar

12. 

T. Vo-Dinh, “A hyperspectral imaging system for in vivo optical diagnostics,” IEEE Eng. Med. Biol. Magn., 23 40 –49 (2004). https://doi.org/10.1109/MEMB.2004.1360407 Google Scholar

13. 

S. C. Gebhart, R. C. Thompson, and A. Mahadevan-Jansen, “Liquid-crystal tunable filter spectral imaging for brain tumor demarcation,” Appl. Opt., 46 1896 –1910 (2007). https://doi.org/10.1364/AO.46.001896 Google Scholar

14. 

M. E. Martin, M. B. Wabuyele, K. Chen, P. Kasili, M. Panjehpour, M. Phan, B. Overholt, G. Cunningham, D. Wilson, R. C. Denovo, and T. Vo-Dinh, “Development of an advanced hyperspectral imaging (HSI) system with applications for cancer detection,” Ann. Biomed. Eng., 34 1061 –1068 (2006). https://doi.org/10.1007/s10439-006-9121-9 Google Scholar

15. 

D. Roblyer, C. Kurachi, A. M. Gillenwater, and R. Richards-Kortum, “In vivo fluorescence hyperspectral imaging of oral neoplasia,” Proc. SPIE, 7169 71690J (2009). https://doi.org/10.1117/12.807226 Google Scholar

16. 

Z. Pan, G. Healey, M. Prasad, and B. Tromberg, “Hyperspectral face recognition under variable outdoor illumination,” Proc. SPIE, 5425 520 –529 (2004). https://doi.org/10.1117/12.543102 Google Scholar

17. 

S. A. Mathews, “Design and fabrication of a low-cost, multispectral imaging system,” Appl. Opt., 47 71 –76 (2008). https://doi.org/10.1364/AO.47.000F71 Google Scholar

18. 

H. Matsuoka, Y. Kosai, M. Saito, N. Takeyama, and H. Suto, “Single-cell viability assessment with a novel spectro-imaging system,” J. Biotechnol., 94 299 –308 (2002). https://doi.org/10.1016/S0168-1656(01)00431-X Google Scholar

19. 

A. Bodkin, A. I. Sheinis, A. Norton, “Hyperspectral imaging systems,” (2006). Google Scholar

20. 

B. K. Ford, C. E. Volin, S. M. Murphy, R. M. Lynch, and M. R. Descour, “Computed tomography-based spectral imaging for fluorescence microscopy,” Biophys. J., 80 (2), 986 –993 (2001). https://doi.org/10.1016/S0006-3495(01)76077-8 Google Scholar

21. 

M. E. Gehm, R. John, D. J. Brady, R. M. Willett, and T. J. Schulz, “Single-shot compressive spectral imaging with a dual-disperser architecture,” Opt. Express, 15 14013 –14027 (2007). https://doi.org/10.1364/OE.15.014013 Google Scholar

22. 

A. A. Wagadarikar, N. P. Pitsianis, X. Sun, and D. J. Brady, “Video rate spectral imaging using a coded aperture snapshot spectral imager,” Opt. Express, 17 6368 –6388 (2009). https://doi.org/10.1364/OE.17.006368 Google Scholar

23. 

L. Gao, R. T. Kester, and T. S. Tkaczyk, “Compact Image Slicing Spectrometer (ISS) for hyperspectral fluorescence microscopy,” Opt. Express, 17 12293 –12308 (2009). https://doi.org/10.1364/OE.17.012293 Google Scholar

24. 

L. Gao, R. T. Kester, N. Hagen, and T. S. Tkaczyk, “Snapshot Image Mapping Spectrometer (IMS) with high sampling density for hyperspectral microscopy,” Opt. Express, 18 14330 –14344 (2010). https://doi.org/10.1364/OE.18.014330 Google Scholar

25. 

R. T. Kester, L. Gao, and T. S. Tkaczyk, “Development of image mapping field units for hyperspectral biomedical imaging applications,” Appl. Opt., 49 1886 –1899 (2010). https://doi.org/10.1364/AO.49.001886 Google Scholar
©(2011) Society of Photo-Optical Instrumentation Engineers (SPIE)
Robert T. Kester, Noah Bedard, Liang S. Gao, and Tomasz S. Tkaczyk "Real-time snapshot hyperspectral imaging endoscope," Journal of Biomedical Optics 16(5), 056005 (1 May 2011). https://doi.org/10.1117/1.3574756
Published: 1 May 2011
Keywords: Endoscopes, Hyperspectral imaging, Prisms, Image segmentation, Tissues, Cancer, Tissue optics
