Caren B. Les
ASHBURN, Va. – Optical distortion has been a problem for astronomers observing the cosmos through ground-based telescopes and for researchers studying the substructures of living cells through microscopes.
Astronomers have successfully used adaptive optics methods to minimize the distortion of light – such as the twinkling of stars – as it passes through the atmosphere and into their telescopes. More recently, researchers at the Janelia Farm Research Campus of the Howard Hughes Medical Institute have implemented adaptive optics to counter problems of optical aberration, or blurring, when viewing biological specimens under a microscope.
Before joining Janelia Farm, Dr. Eric Betzig, a scientist involved in the project, had spent most of his career working on “superresolution” microscopy methods, including near-field scanning optical microscopy and photoactivated localization microscopy, both of which provide image resolution beyond what is allowed by the diffraction limit. The techniques are applicable to single cells and to ultrathin sections of tissues.
“Talking to our neurobiologist colleagues, who are interested in studying the behavior of neurons in intact brain but are limited to superficial depths, we realized that the inhomogeneous optical properties of biological tissues make it impossible for conventional imaging approaches such as confocal and two-photon microscopy to reach even the diffraction limit,” said Na Ji, a Janelia Farm researcher. She added that these inhomogeneous properties result in loss of both signal and resolution, precluding the use of the new superresolution methods.
“If we can develop a general method of correcting these sample-induced optical aberrations, we could improve the power of many different imaging methods,” she said.
In explaining the basic elements of their adaptive optics technique, Ji said that the only optical element they used that is not in a conventional microscope is a spatial light modulator, which consists of nearly 2 million liquid crystal pixels, each independently adjustable in phase. The modulator allows the scientists both to measure and to correct the optical aberration. (In astronomy, by comparison, the wavefront sensor and the wavefront corrector are usually separate elements.)
“When measuring the aberration, the modulator allows us to direct light to only a small region of the objective back pupil. We take an image of the sample using only this small illuminated region,” Ji said. She explained that optical aberrations associated with the region can be inferred from the shift of the image relative to a reference image acquired with a fully illuminated back pupil. The process is then repeated with all other regions in the back pupil and, once the optical aberration has been determined across the entire back pupil, the modulator is used to introduce the appropriate corrective wavefront to the system.
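The measurement loop Ji describes (illuminate one small patch of the back pupil, image the sample, and read the local wavefront tilt off the image's shift relative to a full-pupil reference) can be sketched in simulation. The Python below is a hypothetical illustration of that pupil-segmentation idea, not the Janelia Farm code; the region labels, image sizes and shift-recovery method are assumptions.

```python
import numpy as np

def image_shift(img, ref):
    """Estimate the (dy, dx) shift of img relative to ref from the
    peak of their FFT-based cross-correlation."""
    corr = np.fft.ifft2(np.fft.fft2(img) * np.conj(np.fft.fft2(ref))).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Fold shifts past the halfway point back into negative values.
    return tuple(p if p <= n // 2 else p - n for p, n in zip(peak, corr.shape))

def measure_pupil_aberration(subregion_images, reference):
    """Pupil-segmentation measurement: each entry in subregion_images was
    (notionally) acquired with only one patch of the objective back pupil
    illuminated, so its shift relative to the full-pupil reference image
    gives the local wavefront tilt across that patch."""
    return {region: image_shift(img, reference)
            for region, img in subregion_images.items()}

# Toy demonstration with synthetic data: shifted copies of a random
# "reference image" stand in for the subregion acquisitions.
rng = np.random.default_rng(0)
ref = rng.random((64, 64))
subregions = {
    "pupil_patch_00": np.roll(ref, (3, -5), axis=(0, 1)),
    "pupil_patch_01": np.roll(ref, (0, 2), axis=(0, 1)),
}
tilts = measure_pupil_aberration(subregions, ref)
```

In a real instrument, the per-patch tilts recovered this way would then be stitched into a phase map across the whole back pupil, and the modulator would display its negative as the corrective wavefront, as the article describes.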
“The main challenge is that there is no direct way – except in specific situations such as retinal imaging – to measure the aberrated wavefront. In a telescope, a wavefront sensor is used to measure the distorted wavefront after the aberrating media (atmosphere). In microscopy, it is usually not possible to measure the distorted wavefront within the specimen,” she said.
Ji said that the researchers are applying the technique to two-photon in vivo brain imaging of mice to see whether they can extend the imaging depth. They also are extending the technique to other imaging modalities, such as wide-field microscopy, she added.
Results from fixed mouse cortical slices illustrated the researchers’ ability to improve signal and resolution to depths of 400 µm, according to their report on the investigation, which appeared in the Dec. 27, 2009, online issue of Nature Methods.