The Future Looks Photonic
Advanced data mining, 3-D video imaging techniques and image fusion are bringing to fruition superaccurate real-time diagnostics, brain mapping and futuristic driverless cars.
Developments in photonics have changed the landscape of imaging in communications, defense, security, health care, transportation and many other applications in a deep and lasting way over the past few years. Imaging depends on the ability to obtain information and display it, which at some point in the process always involves photons. What’s trending is the way researchers are gathering imaging data and how they are using it.
Imaging is making great leaps in biomedical research and health care. Publications abound with novel research that aims to improve disease detection and diagnosis using imaging techniques. Some of these health care and diagnostic imaging advances have to do with novel analysis tools and techniques.
At the Howard Hughes Medical Institute, researchers have achieved a major first in neuroscience imaging: cellular-resolution video of neurons firing in a brain. Dr. Philipp Keller and Dr. Misha Ahrens of the institute’s Janelia Farm Research Campus in Ashburn, Va., used light-sheet microscopy to map nearly all the neuron activity in the brain of a zebra fish, in vivo, in about 1 s.1 The method uses a laser to illuminate a live sample and a commercially available camera to capture the images. An important enabling technology is the scientific complementary metal oxide semiconductor (sCMOS) camera, the Orca-Flash4.0 made by Hamamatsu Corp. of Bridgewater, N.J., which delivered a 30-ms frame rate at 0.65-µm resolution.2
(a) Neurons light up in this image of a zebra fish larva brain, measuring 800 × 600 × 200 μm. Dr. Misha Ahrens and Dr. Philipp Keller at HHMI/Janelia Farm Labs recorded planar images in 5-μm steps with a 4.25 ±0.80-μm-thick light sheet. (b) They used a Hamamatsu Orca-Flash4.0 camera to capture 30 images per second, enabling video playback of firing neurons. Photo courtesy of Philipp Keller and Matt Staley, Janelia Farm.
The larval zebra fish brain measures a few hundred microns across and has approximately 100,000 neurons. While that’s a fraction of the size and complexity of a human brain, it’s an exciting development nonetheless.
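The figures reported above allow a quick back-of-the-envelope check of the scan rate. A sketch, using only numbers from the article (200-µm imaging depth, 5-µm plane spacing, 30-ms camera frame time):

```python
# Back-of-the-envelope estimate of whole-brain volume acquisition time.
# All input values are taken from the article; the arithmetic is ours.
depth_um = 200        # imaged depth of the larval zebra fish brain
step_um = 5           # spacing between planar images
frame_time_s = 0.030  # sCMOS camera frame time (30 ms per plane)

planes = depth_um // step_um            # number of planes per volume
volume_time_s = planes * frame_time_s   # time to sweep one whole volume

print(planes, volume_time_s)  # 40 planes, 1.2 s per volume
```

Forty planes at 30 ms each works out to roughly 1.2 s per whole-brain volume, consistent with the "about 1 s" figure reported for the technique.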
“When I saw these movies for the first time, it left me in awe, realizing that we had here a live view of a thinking brain,” said Nico Stuurman, research associate at the University of California, San Francisco, and Howard Hughes Medical Institute.
(Left) A video photoacoustic technique developed by Wenfeng Xia and colleagues at the University of Twente rotates in 3-D to reveal a top and side view of a phantom object made of gels and other materials that imitate breast tissue. The background mimics normal breast tissue, while several objects embedded within the material mimic blood vessels and tumors. (Right) Two reconstructed image slices of imitation tissue taken with the new device reveal the location of five objects indicated with arrows: Objects 1 and 2 mimic blood vessels, while objects 3 to 5 mimic tumors. Photo courtesy of W. Xia, University of Twente.
3-D breast mapping
A prototype imaging tool called a photoacoustic mammoscope may someday replace traditional x-rays with safer near-IR light to make a 3-D map of the breast, say researchers at the University of Twente in the Netherlands.3 Photoacoustic mammography works via optical absorption contrast imaging of vasculature – cancer tissue has more blood vessels than healthy tissue – but is typically limited by 2-D imaging and a small field of view. Professor Wiendelt Steenbergen and colleagues combined Q-switched lasers (alexandrite and Nd:YAG) to produce near-IR light and used an eight-element linear array ultrasound detector to create a 3-D image with a bigger field of view.
Photoacoustic breast tomography developed by the Biomedical Photonic Imaging group at the University of Twente involves combining a highly sensitive linear-array ultrasound detector with a near-IR imaging system. Photo courtesy of W. Xia, University of Twente.
The group tested the prototype on imitation human tissue made of gels and other materials and successfully detected cancerlike features. Although the resolution of the device doesn’t yet match that of x-ray mammography or MRI, the researchers hope to improve it and to incorporate several wavelengths at once to increase detection. If commercialized, the instrument also would have the added bonus of being cost-competitive with ultrasound; it may cost less than x-ray mammography or MRI.
The next step is to prove the tool’s efficacy in larger clinical trials next year.
Other researchers are pursuing better breast-imaging technology by optimizing how the imaging data is processed and analyzed. Computer-aided diagnosis (CADx) and quantitative image analysis (QIA) potentially may enable customized identification of specific tumor signatures, said Maryellen Giger, A.N. Pritzker Professor of Radiology/Medical Physics and director of the Imaging Research Institute at the University of Chicago. Data-driven analytics being developed in her lab provide a “second opinion” for diagnoses based on x-ray, MRI or ultrasound breast images.4
Professor Maryellen Giger’s lab at the University of Chicago has used quantitative image analysis to assess breast lesions observed in MRIs. The software automatically extracts volumetrics, morphology, texture and kinetics from the lesions to estimate the probability of malignancy. Photo courtesy of M. Giger, University of Chicago/SPIE.
But the ultimate goal is to expand the role of QIA/CADx beyond screening to diagnosis. By comparing a tumor with a database of other clinical cases, Giger expects, radiologists will be able to assess the type of tumor and give a more accurate prognosis. Researchers are mining these data to relate the attributes of various cancer types to disease states, with the aim of understanding how lesion characteristics correspond to specific diagnoses. Analysis of large quantities of data could enable customized, patient-centric treatment. It could also be extended to studying the association between a tumor’s observable characteristics and cell-level data, which could help identify genes that affect disease risk. Validation of these methods, too, will require large clinical trials.
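The article does not specify how Giger's software combines the extracted features into a probability of malignancy. As a minimal illustration only, a logistic model over normalized lesion features is one common way such a "second opinion" score could be formed; the feature names and weights below are hypothetical, not taken from her published work.

```python
import math

# Hypothetical sketch of feature-based CADx scoring: combine normalized
# (0..1) lesion features with a logistic model. Feature names and weights
# are illustrative only and are NOT from Giger's actual software.
WEIGHTS = {"volume": 0.8, "margin_irregularity": 1.5,
           "texture_heterogeneity": 1.1, "kinetic_washout": 1.3}
BIAS = -2.0

def probability_of_malignancy(features):
    """Map a dict of normalized lesion features to a 0..1 probability."""
    score = BIAS + sum(WEIGHTS[name] * value for name, value in features.items())
    return 1.0 / (1.0 + math.exp(-score))

benign = probability_of_malignancy(
    {"volume": 0.1, "margin_irregularity": 0.2,
     "texture_heterogeneity": 0.1, "kinetic_washout": 0.0})
suspicious = probability_of_malignancy(
    {"volume": 0.7, "margin_irregularity": 0.9,
     "texture_heterogeneity": 0.8, "kinetic_washout": 0.9})
```

In a real system, the weights would be learned from a large annotated case database and validated in clinical trials, which is exactly the validation step the researchers call for.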
Many recent imaging advances involve using existing technology in new ways. Mainstream media have given extensive coverage to the advent of the driverless car, such as the Google car. Although a fully automated model is not yet available to consumers, self-driving cars can now be spotted in tests on roadways in Florida, California, Nevada and the UK. Nearly every major auto manufacturer offers, or plans to offer, vehicles with “driver assist” capability that uses sensing and imaging technology to steer, brake and accelerate without driver input.
The Google driverless car uses a $70,000 lidar system consisting of 64 laser beams to generate a 3-D map of the environment surrounding the car. A fleet of about 12 autonomous cars had completed more than 500,000 accident-free driving miles by July 2013. Photo courtesy of Google.
Most automakers have also announced projects to design driverless or near-driverless cars, including companies such as GM, Lexus, Audi, Tesla and Nissan. Mercedes-Benz and BMW have premiered production-ready automated vehicles. Google expects to release its autonomous car technology in 2018, but skeptics claim commercial availability estimates may be exaggerated and that a “highly advanced driver assist system” is more realistic.
Whether or not governments will be ready within the next 10 years for legal driverless or near-driverless cars, the technology that enables them is here. Google’s driverless Prius uses a lidar system with 64 laser beams to create a 3-D map of the vehicle’s environment in real time. A fleet of a dozen cars is currently on the road, but, unfortunately, economies of scale haven’t improved the price; Google revealed the cars carry $150,000 worth of equipment.
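The core geometry behind that real-time 3-D map is simple: each laser return, recorded as an azimuth, an elevation and a range, becomes one Cartesian point in the vehicle's frame. A minimal sketch of that conversion, assuming a conventional forward/left/up axis convention (this is textbook spherical-to-Cartesian math, not Google's actual pipeline):

```python
import math

# Minimal sketch of lidar 3-D mapping geometry: convert one laser return
# (azimuth, elevation, range) into a Cartesian point in the vehicle frame.
# Axis convention assumed here: x forward, y left, z up.
def lidar_return_to_point(azimuth_deg, elevation_deg, range_m):
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    x = range_m * math.cos(el) * math.cos(az)  # forward
    y = range_m * math.cos(el) * math.sin(az)  # left
    z = range_m * math.sin(el)                 # up
    return (x, y, z)

# A spinning 64-beam unit produces one such point per beam per azimuth
# step; accumulating a full rotation yields the 3-D map of the surroundings.
point = lidar_return_to_point(0.0, 0.0, 10.0)  # object straight ahead, 10 m
```

With 64 beams and a rapidly spinning head, this single operation repeated hundreds of thousands of times per second is what builds the dense environment model the car navigates by.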
Senso Optics of Yokneam, Israel, announced in November the commercial availability of the first thermal camera on a chip for advanced driving assistance systems (ADAS). The ADAS technology warns drivers of lane departures and obstacles even in foggy conditions, and applies braking and steering assistance as needed.
The thermal ADAS imager for vehicles enables clearer detection of certain roadway details. The dye used in the dashed white line is almost invisible at visible wavelengths but appears brighter in the near-IR (red circles). Photo courtesy of Senso Optics.
The Senso Optics ADAS emphasizes reliability and a lower false-alarm rate in detecting live figures such as pedestrians and animals. The company hopes to make the system affordable enough for mainstream commercial viability.
“Senso Optics’ thermal imaging for night-vision experience and our unique set of image processing and enhancement algorithms [put] us in an excellent position to develop reasonably priced products with excellent performance in resolution and sensitivity,” said president and CEO Jacob Dagan.
The company is developing an ADAS with a video sensor that creates a single fused image from two images. The system would combine the images of far- and near-IR sensors into a single functional near-IR image. Smart algorithms would enable analysis of real-time situations to further reduce false alarms.
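Senso Optics has not published its fusion algorithms, so as a hypothetical baseline only, the simplest way to merge co-registered far-IR and near-IR frames into one functional image is a per-pixel weighted average:

```python
# Hypothetical baseline for two-band image fusion: a per-pixel weighted
# average of co-registered far-IR and near-IR grayscale frames.
# Senso Optics' actual algorithms are proprietary; this only shows the idea.
def fuse(far_ir, near_ir, alpha=0.5):
    """Fuse two equal-sized grayscale images (lists of rows of 0-255 ints).

    alpha weights the far-IR contribution; (1 - alpha) weights near-IR.
    """
    return [[round(alpha * f + (1 - alpha) * n)
             for f, n in zip(f_row, n_row)]
            for f_row, n_row in zip(far_ir, near_ir)]

far = [[200, 200], [10, 10]]   # warm pedestrian shows up bright in far-IR
near = [[50, 50], [90, 90]]    # lane markings show up bright in near-IR
fused = fuse(far, near)        # both cues survive in the single fused frame
```

Production systems would replace the fixed weight with the kind of smart, scene-adaptive weighting the company describes, so that each band contributes most where it carries the most information.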
Senso Optics also recently revealed a high-end uncooled thermal camera – a thermal vision system for enhanced vision systems designed to improve situational awareness and provide safer taxiing, takeoffs and landings for commercial aircraft pilots.
Toyota Motor North America CEO Jim Lentz thinks his competitors’ estimates of commercial availability of autonomous vehicles may be overhyped. When asked on CNBC’s Squawk Box program in September when driverless cars will arrive, he declared, “Not in my lifetime.”
Even if George Jetson cars won’t be ubiquitous by, say, 2040, we can see where the technology is headed. Whether via improved modularity, miniaturization and sustainability; new materials; or new applications for legacy technology, imaging will be a key part of the trends transforming the way we relate and live.
1. M.B. Ahrens et al (May 2013). Whole-brain functional imaging at cellular resolution using light-sheet microscopy. Nat Methods, Vol. 10, pp. 413-420.
3. W. Xia et al (Nov. 1, 2013). Design and evaluation of a laboratory prototype system for 3D photoacoustic full breast tomography. Biomed Opt Express, Vol. 4, pp. 2555-2569.
4. M. Giger (Oct. 14, 2013). Quantitative breast image analysis for personalized medicine. SPIE Newsroom.