
Multispectral Quantitative Phase Imaging Captures Live Human Cells Quickly

Dr. Eric V. Chandler and David E. Fish, Pixelteq Inc., Dr. Laura Waller, University of California, Berkeley

Within biological systems, a wealth of structural, chemical and functional information can be extracted from the available spectral data. This article focuses on a specific application of multispectral imaging to quantitative phase imaging: The known chromatic aberration in sequential images obtained from a high-speed multispectral wheel camera yields a quantitative phase map, facilitating cell tracking and screening.

Silicon image sensors collect light across a broader spectral range than the visible spectrum alone; the typical sensitivity range extends from 350 nm (UV) to 1100 nm (NIR). A monochrome sensor forms an image from the total integrated intensity across all of these wavelengths. Common RGB color cameras combine a NIR-blocking filter with pixel-scale dye filters to restrict the light collected by the camera to three wide bands in the visible.

Countless applications require a greater degree of spectral selectivity than that afforded by RGB filtration. By placing a thin-film interference filter in front of a monochrome sensor, a well-defined band of the spectrum – controlled by the filter properties – reaches the sensor. Multispectral imaging is achieved by acquiring a series of images as sequential filters pass in front of the sensor. The spectral range of interest is divided into a series of discrete bands (Figure 1).


Figure 1.
Multispectral imaging parses the spectral range of interest into a finite number of discrete bands. Courtesy of Pixelteq.
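As a sketch of this acquisition scheme, the sequential per-filter frames can be stacked into a spectral data cube; the function and array names below are hypothetical, minimal numpy, not part of any vendor API:

```python
import numpy as np

def build_spectral_cube(frames, band_centers_nm):
    """Stack sequential monochrome frames - one per bandpass filter -
    into a (height, width, n_bands) multispectral cube."""
    if len(frames) != len(band_centers_nm):
        raise ValueError("need exactly one frame per spectral band")
    cube = np.stack(frames, axis=-1).astype(float)
    return cube, np.asarray(band_centers_nm, dtype=float)

# Example: eight discrete bands spanning 450-800 nm
frames = [np.random.rand(480, 640) for _ in range(8)]
cube, centers = build_spectral_cube(frames, np.linspace(450, 800, 8))
```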


The sequential acquisition of spectral images limits this type of multispectral imaging to quasi-stationary applications, where subject movement within a series is minimal. Historically, pixel shift between different spectral bands has enforced a select-and-shoot method: The selected filter is moved into position in front of the sensor and stops while an image is acquired.

Recent advances enable continuous rotation of the filter wheel in front of the sensor. With uninterrupted filter rotation, the optimum balance between frame rate and exposure time is achieved; frame rates are now limited by the read-out rate of the selected camera. Pixel shift is limited only by the properties of the lens, allowing direct comparison of images from different spectral channels without the added computational burden of edge-matching or resizing algorithms. Additionally, the full resolution of the sensor is maintained. At faster frame rates, multispectral imaging is increasingly viable in the biomedical arena, where maintaining the full sensor resolution at each wave band is critical.

Quantitative phase imaging for biology

Observing features in minimally scattering media poses a significant challenge for biomedical research. Phase imaging exploits optical-path length (OPL) variations caused by local differences in refractive index or thickness to enhance these otherwise difficult-to-detect features. Phase-contrast imaging is a popular method, as it does not rely on exogenous contrast mechanisms – such as fluorescent tagging or dye staining – that can potentially alter the function of the system or introduce toxicity. Because it requires no label, phase contrast does not cause photobleaching, nor does it perturb natural cellular function. In addition, it is not susceptible to the misinterpretation caused by nonuniform taggant bonding in fluorescence microscopy.1

Applications such as monitoring neural cell growth2 or temporal changes of red blood cell membranes,3 large-population cell tracking,4 and cell volume and dry mass calculations in clinical cancer studies5,6 are all sensitive to external contrast mechanisms. Each specific effort, along with countless others, benefits from advances in nonperturbative techniques such as phase imaging.

Traditional phase-contrast imaging on most commercial microscopes requires the addition of a phase-contrast objective. Differential interference contrast (DIC), a popular method for cellular detection, requires additional polarization optics. DIC is sensitive to OPL changes and so is excellent at resolving small variations in refractive index, especially near cell edges. However, DIC is not quantitative. “Phase wrapping” can occur, wherein an OPL change of one-half the wavelength appears identical to an OPL change of three-halves the wavelength. Further, amplitude information (absorption) cannot be separated from phase delays. In dense samples, this manifests as a series of bright and dark fringes that confuse cell-tracking software.
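The wrapping ambiguity can be illustrated numerically: in an idealized two-beam interference signal, the OPL delay enters only through a phase modulo 2π, so delays of one-half and three-halves of a wavelength are indistinguishable. This is a sketch using the textbook interference formula, not the output of any particular instrument:

```python
import numpy as np

def interference_intensity(opl, wavelength):
    """Idealized two-beam interference signal: the OPL delay enters
    only through cos(2*pi*OPL/wavelength), i.e. modulo one wavelength."""
    return 0.5 * (1.0 + np.cos(2.0 * np.pi * opl / wavelength))

wl = 550.0  # nm
i_half = interference_intensity(0.5 * wl, wl)          # OPL = lambda/2
i_three_halves = interference_intensity(1.5 * wl, wl)  # OPL = 3*lambda/2
# Both delays give the same (destructive) signal - the phase has "wrapped".
```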

In contrast, quantitative phase imaging recovers information about both absorption and phase, but separately. Morphological information can be gleaned from the quantitative phase. This is particularly useful in systems that do not exhibit substantive contrast in either bright-field or conventional phase-contrast imagery, such as differentiation between white and gray neuronal tissue.7

Many methods exist for quantitative phase imaging. A particularly convenient approach is to recover phase from a series of intensity images taken through focus (Figure 2). A sequence of images is taken as either the sample moves through the focal plane, or the camera scans through the image plane. Intensity information from the stack is translated into a phase via the transport of intensity equation (TIE).8-10 The TIE relates the flow of intensity along the optical axis to the local phase gradient.11 Pixel-by-pixel phase is calculated from the intensity difference between two or more depth-offset images.


Figure 2.
Phase is computed from a through-focus stack of monochrome images. (Left) A piezoelectric stage provides the defocused images. (Right) A phase delay map of human cheek cells is extracted using the transport of intensity equation.12 Courtesy of Dr. Laura Waller.


The TIE has a high degree of phase accuracy, limited only by constituent image noise, and can reach phase sensitivity on the order of nanometers. However, existing implementations still rely on mechanical movement of either the sample or camera along the optical axis. For many of the applications described above, such perturbations would obscure the desired results.
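As a minimal illustration of TIE phase recovery – not the authors' implementation – the simplified uniform-intensity form, -k dI/dz = I0 ∇²φ, can be inverted with an FFT-based Poisson solver. The function name, parameter choices and synthetic test object below are all assumptions:

```python
import numpy as np

def tie_phase(i_minus, i_plus, dz, wavelength, pixel_size, i0):
    """Recover phase from two defocused images via the simplified TIE,
    -k dI/dz = i0 * laplacian(phi), assuming near-uniform intensity i0.
    All quantities share one length unit (e.g. microns)."""
    k = 2.0 * np.pi / wavelength
    didz = (i_plus - i_minus) / dz                # axial intensity derivative
    ny, nx = didz.shape
    kx = 2.0 * np.pi * np.fft.fftfreq(nx, d=pixel_size)
    ky = 2.0 * np.pi * np.fft.fftfreq(ny, d=pixel_size)
    kxx, kyy = np.meshgrid(kx, ky)
    lap = -(kxx**2 + kyy**2)                      # Fourier symbol of the Laplacian
    lap[0, 0] = 1.0                               # avoid division by zero at DC
    phi_hat = np.fft.fft2(-k * didz / i0) / lap
    phi_hat[0, 0] = 0.0                           # absolute phase offset is unrecoverable
    return np.real(np.fft.ifft2(phi_hat))

# Round-trip check with a synthetic, periodic phase object:
ny = nx = 64
i0, wavelength, dz, pixel_size = 1.0, 0.5, 5.0, 1.0
y, x = np.mgrid[0:ny, 0:nx]
phi_true = 0.3 * np.cos(2 * np.pi * x / nx) * np.cos(2 * np.pi * y / ny)
# Forward model: dI/dz = -(i0/k) * laplacian(phi), evaluated spectrally
kx = 2 * np.pi * np.fft.fftfreq(nx, d=pixel_size)
ky = 2 * np.pi * np.fft.fftfreq(ny, d=pixel_size)
kxx, kyy = np.meshgrid(kx, ky)
lap_phi = np.real(np.fft.ifft2(-(kxx**2 + kyy**2) * np.fft.fft2(phi_true)))
didz = -(i0 / (2 * np.pi / wavelength)) * lap_phi
phi_rec = tie_phase(i0 - 0.5 * dz * didz, i0 + 0.5 * dz * didz,
                    dz, wavelength, pixel_size, i0)
```

The mean (DC) phase is set to zero because the TIE only determines phase up to an additive constant.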

Translating color to phase

A recently developed method extracts phase from a series of multispectral images without mechanical movement of either the sample or imager.13 This exploits a perceived hindrance in transmissive imaging systems: the chromatic aberration of the lens. Chromatic aberration is caused by the differences in glass refractive index as a function of wavelength. For lenses, this results in each wavelength experiencing different OPLs and focusing to a slightly different depth (Figure 3). Pairing different glass types – as with achromatic lenses – helps mitigate the offset over finite wavelength ranges, but cannot remove the problem entirely.



Figure 3.
Chromatic aberration – due to a wavelength-dependent refractive index of focusing optics – results in a color-dependent focus shift. Longer wavelengths shift to increased focal lengths. Photo courtesy of Pixelteq.
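The magnitude of this focal shift can be estimated from a dispersion model. The sketch below combines a Cauchy fit – with coefficients that only roughly approximate BK7 glass – and the thin-lens lensmaker's relation; the coefficients and the 100-mm singlet example are illustrative assumptions, not values from the article:

```python
def cauchy_index(wavelength_um, a=1.5046, b=0.0042):
    """Cauchy dispersion model n(lambda) = a + b / lambda^2;
    a and b here roughly approximate BK7 glass."""
    return a + b / wavelength_um**2

def focal_length(f_ref_mm, wl_ref_um, wl_um):
    """Thin-lens focal length at wl_um, given f_ref_mm at wl_ref_um.
    From the lensmaker's equation, f scales as (n_ref - 1)/(n - 1)."""
    n_ref = cauchy_index(wl_ref_um)
    n = cauchy_index(wl_um)
    return f_ref_mm * (n_ref - 1.0) / (n - 1.0)

# A 100-mm singlet referenced at 550 nm:
f_red = focal_length(100.0, 0.550, 0.650)   # longer wavelength -> longer focus
f_blue = focal_length(100.0, 0.550, 0.450)  # shorter wavelength -> shorter focus
```

With these coefficients the red focus lands under a millimeter beyond the reference focus, consistent with the small but exploitable shift described in the text.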


Since OPL is dependent upon wavelength, knowledge of the wavelength-dependent focusing properties of the lens allows the TIE to be remapped into the spectral domain. Chromatic aberration now acts as a phase-contrast mechanism. This is demonstrated in Figure 4, where a quantitative phase image generated from a through-focus image series is compared with a quantitative phase image obtained from an RGB color camera. Live human dermal microvascular endothelial cells in EGM-2MV growth medium were imaged under a Nikon Eclipse TE2000-U microscope with a 20× (0.4-NA) achromatic objective. The resultant phase is nearly identical between methods, indicating a colorless object, so a multispectral acquisition scheme can substantially simplify quantitative phase imaging.
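One way to sketch this spectral remapping: with the lens's focal offset z(λ) known for each band, the axial intensity derivative needed by the TIE can be estimated from the spectral channels by a per-pixel least-squares line fit of intensity against defocus. The function name and the synthetic check are hypothetical:

```python
import numpy as np

def axial_derivative_from_colors(images, focal_offsets):
    """Estimate dI/dz from spectral-channel images using the known
    chromatic focal offset z(lambda) of each band: fit a straight line
    to intensity vs. defocus at every pixel (closed-form least squares).

    images        : (n_bands, H, W) stack, one image per band
    focal_offsets : defocus z(lambda) for each band, length n_bands
    """
    z = np.asarray(focal_offsets, dtype=float)
    zc = z - z.mean()                              # center the defocus values
    stack = np.asarray(images, dtype=float)
    # least-squares slope at each pixel: sum(zc * I) / sum(zc^2)
    return np.tensordot(zc, stack, axes=(0, 0)) / (zc**2).sum()

# Synthetic check: intensity varying linearly with defocus
z = np.array([-1.0, 0.0, 1.0])                     # e.g. blue, green, red offsets
slope = np.full((4, 4), 2.0)
stack = np.array([1.0 + zi * slope for zi in z])
didz = axial_derivative_from_colors(stack, z)      # recovers the per-pixel slope
```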

Multispectral quantitative phase imaging

Narrowing the spectral bands with thin-film dielectric filters improves the accuracy of the calculated OPL difference over standard RGB filtration and therefore improves the phase calculations. Further, additional spectral images lead to improved phase recovery through application of the higher-order TIE algorithm.14 The SpectroCam from Pixelteq, which uses this technology, enables single-shot acquisition of eight spectral channels at diffraction-limited lateral spatial resolution and couples directly to a commercial microscope without additional hardware. Narrow, interchangeable bandpass filters allow discrete sampling of various OPLs, and surface profiles can be retrieved with ~10-nm accuracy.


Figure 4.
(Top) Three through-focus monochromatic images were used to generate the quantitative phase image at right of live human dermal microvascular endothelial cells via the transport of intensity equation. (Bottom) The same cell cluster imaged with an RGB camera, wherein separate color channels were used to calculate the phase via a modified transport of intensity equation. Scale units are in normalized optical-path length (OPL). TIE = transport of intensity equation. Courtesy of Dr. Laura Waller.


Quantitative phase images of human cheek cells taken with the RGB camera and with the SpectroCam are contrasted in Figure 5 (using the algorithm in Reference 14). The edges and contents of the cell appear with improved sharpness when using additional color channels. Additionally, fewer phase artifacts are apparent in the background – particularly on the left edge of the image, where the RGB camera artifacts begin to obfuscate the cell wall.

Future extensibility

The speed of the multispectral quantitative phase imaging technique is limited only by the camera’s frame and data-transfer rates. Biological systems that change relatively quickly could be imaged at lower resolution or with fewer colors, or both, to improve quantitative phase imaging rates.


Figure 5.
Quantitative phase images of human cheek cells taken with (left) an RGB color camera and (right) an eight-channel multispectral camera. Phase artifacts are reduced, and image sharpness improves with additional spectral channels. Courtesy of Dr. Laura Waller.


Certain potential applications impose more restrictive requirements – low power draw, light weight or passive filtering – that prohibit a wheel-based solution. Once a wave band set and spatial resolution are determined for a specific application, a passive multispectral imaging platform can enable higher-frame-rate acquisition. Custom-patterned dielectric filter mosaics – similar to the Bayer pattern – can be applied to the surface of a monochrome sensor. The repeating filter pattern turns a monochrome sensor into a multispectral sensor, though at decreased spatial resolution.
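A sketch of the read-out for such a mosaic sensor: each band occupies a fixed position within the repeating tile, so per-band subimages fall out by strided slicing. The 2×2 pattern and names here are illustrative; real mosaics may use larger tiles:

```python
import numpy as np

def split_mosaic(raw, pattern_shape=(2, 2)):
    """Split one monochrome frame captured through a repeating filter
    mosaic into per-band subimages, each at 1/pattern resolution.
    Returns {(row, col) within the tile: that band's subimage}."""
    pr, pc = pattern_shape
    if raw.shape[0] % pr or raw.shape[1] % pc:
        raise ValueError("frame size must be a multiple of the pattern")
    return {(r, c): raw[r::pr, c::pc] for r in range(pr) for c in range(pc)}

# A 2x2 pattern turns one 480x640 frame into four 240x320 band images:
bands = split_mosaic(np.zeros((480, 640)))
```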

Multispectral quantitative phase imaging holds promise for expanding characterization efforts in bioscience applications, ranging from cell classification and sorting, to monitoring the effect of new medicines on cellular growth. By avoiding the use of tags or specialized illumination sources, this technique can find widespread utility in delicate organic systems.

Meet the authors

Dr. Eric V. Chandler is an electro-optical scientist at Pixelteq Inc. in Golden, Colo.; email: [email protected]. Dr. Laura Waller is an assistant professor in the Department of Electrical Engineering and Computer Sciences at University of California, Berkeley; email: [email protected]. David E. Fish is vice president of technology at Pixelteq Inc. in Golden, Colo.; email: [email protected].

References

1. G. Popescu (2011). Quantitative Phase Imaging of Cells and Tissues. McGraw-Hill Biophotonics. McGraw-Hill.

2. Z. Wang et al (2011). Spatial light interference microscopy (SLIM). Opt Express, Vol. 19, pp. 1016-1026.

3. T. Ikeda et al (May 15, 2005). Hilbert phase microscopy for investigating fast dynamics in transparent systems. Opt Lett, Vol. 30, pp. 1165-1167.

4. E. Meijering et al (February 2012). Methods for cell and particle tracking. Methods Enzymol, Vol. 504, pp. 183-200.

5. G. Popescu et al (August 2008). Optical imaging of cell mass and growth dynamics. Am J Physiol Cell Physiol, Vol. 295, pp. C538-C544.

6. R. Barer (March 1, 1952). Interference microscopy and mass determination. Nature, Vol. 169, pp. 366-367.

7. C. Yang et al (Oct. 15, 2000). Interferometric phase-dispersion microscopy. Opt Lett, Vol. 25, pp. 1526-1528.

8. E. Barone-Nugent et al (June 2002). Quantitative phase-amplitude microscopy I: optical microscopy. J Microsc, Vol. 206, pp. 194-203.

9. M. Teague (Nov. 1, 1983). Deterministic phase retrieval – A Green’s function solution. J Opt Soc Am, Vol. 73, pp. 1434-1441.

10. N. Streibl (1984). Phase imaging by the transport equation of intensity. Opt Commun, Vol. 49, pp. 6-10.

11. L. Waller et al (Jan. 31, 2011). Phase and amplitude imaging from noisy images by Kalman filtering. Opt Express, Vol. 19, pp. 2805-2814.

12. J. Zhong et al (May 2014). Non-uniform sampling and Gaussian process regression in Transport of Intensity phase imaging. International Conference on Acoustics, Speech, and Signal Processing (ICASSP), paper 1569853295, Florence, Italy.

13. L. Waller et al (Oct. 25, 2010). Phase from chromatic aberrations. Opt Express, Vol. 18, pp. 22817-22825.

14. L. Waller et al (June 7, 2010). Transport of Intensity phase-amplitude imaging with higher-order intensity derivatives. Opt Express, Vol. 18, pp. 12552-12561.

Published: July 2014