
Smartphones Move from Social Media to Social Medicine

The power of light microscopy, in your palm

Melinda A. Rose, [email protected]

Cellphones have evolved from a luxury to a necessity in the past 20 years, and now smartphones are making those same inroads (as of September 2012, 45 percent of American adults owned a smartphone, according to research by the Pew Internet Project). With the added versatility and “brain power” supplied by these enhanced phones, as well as their superior imaging capabilities and bright, efficient light sources, their use in medicine, perhaps more than in any other application, has the power to improve our day-to-day existence.

A number of research groups throughout the world are developing, or have already developed, ways for mobile phones to be used as microscopy tools, something particularly useful in remote locations without access to such diagnostic systems.


Mobile phone microscopy layout schematic, prototype and sample images of the CellScope created at UC Berkeley. (a) Mobile phone microscopy optical layout for fluorescence imaging. The same apparatus was used for bright-field imaging, with the filters and LED removed. Components required only for fluorescence imaging are indicated. Not to scale. (b) A current prototype, with filters and LED installed, capable of fluorescence imaging. The objective is not visible because it is contained within the optical tubing, and the sample is mounted adjacent to the metallic focusing knob. (c) Bright-field image of 6-µm fluorescent beads. (d) Fluorescent images of beads shown in (c). The field of view projected onto the camera phone CMOS is outlined. Scale bars are 10 µm. Courtesy of “Mobile Phone Based Clinical Microscopy for Global Health Applications,” PLOS ONE, doi: 10.1371/journal.pone.0006320.g001.


Here are just a few of the up-and-coming applications of smartphones in diagnostics that promise to improve life:

The Camera Culture research group at MIT’s Media Lab has created NETRA (Near-Eye Tool for Refractive Assessment), a quick, simple and inexpensive way to determine your own glasses prescription without the need for a trained professional. The device measures refractive errors of the eye – something that affects as many as 2 billion people worldwide, according to the World Health Organization – such as nearsightedness, farsightedness, astigmatism and age-related vision loss. Left untreated, such errors are the world’s second-leading cause of blindness.

The small plastic NETRA device eliminates the need for a Shack-Hartmann wavefront sensor, which shines a laser into the patient’s eye and measures the reflected light. It can be produced for less than $2 and clips onto a mobile phone screen. The subject uses the phone to signal when patterns projected onto the screen overlap. After the process is repeated several times at different angles for each eye, custom software loaded onto the phone crunches the data and creates a prescription, all within a few minutes.
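The article does not detail NETRA’s math, but a prescription can in principle be recovered from such per-meridian measurements by fitting the standard sphere-cylinder model P(θ) = S + C·sin²(θ − axis) to the power measured along each tested angle. The following Python sketch is a hypothetical illustration of that fit, not NETRA’s actual software:

```python
# Hypothetical sketch: fitting sphere/cylinder/axis from per-meridian
# refraction measurements, the kind of data a NETRA-style test produces.
# Model: P(theta) = S + C * sin^2(theta - axis)   (plus-cylinder form)
import numpy as np

def fit_prescription(angles_deg, powers_diopters):
    """Least-squares fit of sphere S, cylinder C and axis (degrees) from
    measured refractive power along several meridians."""
    th = np.radians(np.asarray(angles_deg, dtype=float))
    P = np.asarray(powers_diopters, dtype=float)
    # Linearized model: P(theta) = a0 + a1*cos(2*theta) + a2*sin(2*theta)
    A = np.column_stack([np.ones_like(th), np.cos(2 * th), np.sin(2 * th)])
    a0, a1, a2 = np.linalg.lstsq(A, P, rcond=None)[0]
    C = 2.0 * np.hypot(a1, a2)                        # cylinder magnitude
    axis = 0.5 * np.degrees(np.arctan2(-a2, -a1)) % 180.0
    S = a0 - C / 2.0                                  # sphere
    return S, C, axis

# Example: simulated measurements at six meridians (diopters, illustrative only)
angles = [0, 30, 60, 90, 120, 150]
powers = [-2.0, -2.2, -2.7, -3.0, -2.8, -2.3]
print(fit_prescription(angles, powers))   # roughly S = -3, C = 1, axis = 90
```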

Bioengineers at the University of California, Berkeley, have fitted a smartphone with magnifying optics to create a real “cell” phone – a diagnostic-quality microscope that can be used in developing countries.

The researchers, in the bioengineering and biophysics lab of Daniel Fletcher, initially envisioned a device so rugged that it could be used for high-resolution imaging outside of the lab. But Dr. Eva M. Schmid, who works in the lab, decided to evaluate the device, called CellScope, in a middle-school science classroom after a chance encounter with a secondary school science teacher in San Francisco.


The CellScope, created in the lab of bioengineering professor Daniel Fletcher, turns the camera of a standard cellphone into a diagnostic-quality microscope with a magnification of 5x to 60x.


Those middle schoolers helped develop the educational side of the device, using it for a year to take macroscopic and microscopic pictures of objects in their homes, gardens and other environments and then displaying them on the screen and posting them to social media platforms to promote discussion. The devices are now being tested with other classrooms and in museums.

Schmid described CellScope’s development at the annual meeting of the American Society for Cell Biology in December in San Francisco.

The commercial potential of the CellScope attracted a $1 million investment last summer from Khosla Ventures, the venture capital firm led by Sun Microsystems Inc. co-founder Vinod Khosla.

“Health data, the key ingredient to useful analysis and diagnosis, is starting to explode exponentially – and CellScope is on the cutting edge,” Khosla said.

CellScope’s first consumer offering will be a smartphone-enabled otoscope that enables physicians to remotely diagnose ear infections in children from pictures taken by parents using the smartphone’s camera. Pediatric ear infections result in 30 million doctor visits annually in the US alone.

Future CellScope products will address throat and skin exams and nonclinical applications, including consumer skin care.

Dining out with severe food allergies can be nerve-racking, relying on a busy server or kitchen to make sure you’re not served something that could make you sick or, even worse, deathly ill. Even prepackaged foods can contain ingredients not listed on the label. Now, a team led by UCLA associate professor of electrical engineering and bioengineering Aydogan Ozcan wants to give control back to those with allergies by allowing them to test their meals on the spot using a cellphone.

The lightweight (less than 2 oz) device, called the iTube, uses the phone’s camera, in combination with an application, to test food samples with the same high level of sensitivity as a laboratory assay, Ozcan said.


Aydogan Ozcan and colleagues at UCLA have developed the iTube platform (left), which attaches to a cellphone and uses colorimetric assays and a digital reader to detect allergens in food samples. Right: A screen capture of the iTube app.


The device tests for allergens by mixing a sample of the food in question with water and an extraction solvent, combining the prepared extract with a series of reactive testing liquids, and then optically measuring the result.

The method digitally converts raw images from the cellphone camera into concentration measurements detected in the food samples. The test goes beyond just a “yes” or “no” answer to the presence of allergens by quantifying how much of an allergen is in a sample, in parts per million.
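The article does not spell out the image-processing chain, but a colorimetric readout of this kind typically compares the transmission of the sample tube against a control and maps the resulting absorbance onto a calibration curve built from known standards. A minimal, hypothetical Python sketch of that idea (not the iTube app’s actual code):

```python
# Hypothetical sketch of a colorimetric readout: convert camera intensities
# from an assay tube into an allergen concentration (ppm) via a calibration
# curve. Not the actual iTube processing chain; illustrative only.
import numpy as np

def absorbance(sample_intensity, reference_intensity):
    """Beer-Lambert-style absorbance from measured transmitted intensities."""
    return -np.log10(sample_intensity / reference_intensity)

def concentration_ppm(sample_intensity, reference_intensity,
                      cal_absorbance, cal_ppm):
    """Interpolate concentration from a monotonic calibration curve
    measured with known allergen standards."""
    a = absorbance(sample_intensity, reference_intensity)
    return float(np.interp(a, cal_absorbance, cal_ppm))

# Example calibration (made-up values): absorbance of known standards
cal_absorbance = [0.00, 0.05, 0.12, 0.25, 0.40]
cal_ppm        = [0.0,  1.0,  5.0,  25.0, 100.0]

# Mean pixel intensity of the sample tube vs. a blank reference tube
print(concentration_ppm(142.0, 180.0, cal_absorbance, cal_ppm))
```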


Left: Ankit Mohan of MIT Media Lab’s Camera Culture research group holds NETRA, the Near-Eye Tool for Refractive Assessment, a quick, simple and inexpensive way to measure refractive errors of the eye. Right: Mohan demonstrates the device.


The iTube platform can test for a variety of trigger foods, including peanuts, almonds, eggs, gluten and hazelnuts, Ozcan said.

“We envision that this cellphone-based allergen testing platform could be very valuable, especially for parents, as well as for schools, restaurants and other public settings,” Ozcan said. “Once successfully deployed in these settings, the big amount of data – as a function of both location and time – that this platform will continuously generate would indeed be priceless for consumers, food manufacturers, policy makers and researchers, among others.”

The device was introduced in 2012, and, so far, the UCLA researchers have successfully tested the iTube using commercially available cookies, analyzing the samples to determine if they have any harmful amounts of peanuts. Their research was recently published online in Lab on a Chip and will be featured in an upcoming issue of the journal.

In 2008, Ozcan’s lab introduced the imaging platform LUCAS (Lensless Ultrawide-field Cell monitoring Array platform based on Shadow imaging). Instead of using a lens to magnify objects, LUCAS generates holographic images of microparticles or cells by using an LED to illuminate the objects and a digital sensor array to capture their images. In developing countries, the technology can be used to image blood samples or other fluids – for example, to monitor the condition of HIV and malaria patients – as well as to test water quality.
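The reconstruction step is not described here, but lensfree in-line holograms of this kind are commonly brought into focus by numerical back-propagation with the angular spectrum method. The sketch below illustrates that standard technique under assumed wavelength, pixel-pitch and sample-distance values; it is not necessarily how LUCAS processes its frames:

```python
# Hypothetical sketch: angular-spectrum back-propagation of a lensfree
# in-line hologram to the object plane. Illustrative of the standard
# technique; LUCAS's actual reconstruction may differ.
import numpy as np

def angular_spectrum_propagate(field, wavelength, pixel_size, distance):
    """Propagate a complex field by `distance` (negative = back-propagate)."""
    ny, nx = field.shape
    fx = np.fft.fftfreq(nx, d=pixel_size)
    fy = np.fft.fftfreq(ny, d=pixel_size)
    FX, FY = np.meshgrid(fx, fy)
    # Free-space transfer function; evanescent components are suppressed
    arg = 1.0 - (wavelength * FX) ** 2 - (wavelength * FY) ** 2
    kz = 2 * np.pi / wavelength * np.sqrt(np.maximum(arg, 0.0))
    H = np.exp(1j * kz * distance) * (arg > 0)
    return np.fft.ifft2(np.fft.fft2(field) * H)

# Example: treat the recorded hologram intensity as the field amplitude and
# back-propagate to an assumed sample-to-sensor distance of ~1 mm.
hologram = np.random.rand(256, 256)              # stand-in for a captured frame
recon = angular_spectrum_propagate(np.sqrt(hologram),
                                   wavelength=520e-9,   # green LED, assumed
                                   pixel_size=2.2e-6,   # sensor pixel pitch, assumed
                                   distance=-1.0e-3)
amplitude_image = np.abs(recon)
```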

“This technology will not only have great impact in health care applications; it also has the potential to replace cytometers in research labs at a fraction of the cost,” Ozcan said in a 2008 university release announcing the technology.

Since then, he has built upon the LUCAS technology to develop a lensless microscope the size of a chicken egg for telemedicine applications.

KeepLoop Oy of Tampere, Finland, a spinoff of VTT Technical Research Center of Finland, last summer introduced what it said was the first digital mobile 3-D microscope. The prototype, capable of measuring surface microtopography, was demonstrated at the drupa 2012 exhibition in May in Düsseldorf, Germany.



The KeepLoop 3-D digital mobile microscope measures surface topography over an area of around 5 x 5 mm at approximately 10-µm resolution.


The portable system measures surface topography over an area of around 5 x 5 mm at about 10-µm resolution. Using the photometric stereo machine vision technique, three images captured under different illumination directions are combined, and the 3-D surface data is calculated by a dedicated algorithm.
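KeepLoop’s algorithm is proprietary, but classical photometric stereo recovers per-pixel albedo and surface normals from three images taken under known, non-coplanar light directions by solving I = ρ(L · n) at each pixel. A minimal NumPy sketch of that textbook technique, with made-up light directions and synthetic frames:

```python
# Hypothetical sketch of classical photometric stereo: recover per-pixel
# albedo and surface normals from three grayscale images captured under
# known, non-coplanar light directions. Illustrative only; not KeepLoop's
# proprietary algorithm.
import numpy as np

def photometric_stereo(images, light_dirs):
    """images: three (H, W) arrays; light_dirs: (3, 3) array of light vectors."""
    h, w = images[0].shape
    I = np.stack([im.reshape(-1) for im in images], axis=0)       # (3, H*W)
    L = np.asarray(light_dirs, dtype=float)
    L = L / np.linalg.norm(L, axis=1, keepdims=True)              # unit vectors
    G = np.linalg.solve(L, I)                                     # G = albedo * normal
    albedo = np.linalg.norm(G, axis=0)
    normals = (G / np.maximum(albedo, 1e-12)).T.reshape(h, w, 3)
    return albedo.reshape(h, w), normals

# Example with synthetic 64 x 64 frames and three tilted light directions
lights = np.array([[0.0, 0.3, 0.95],
                   [0.3, -0.2, 0.93],
                   [-0.3, -0.2, 0.93]])
frames = [np.random.rand(64, 64) for _ in range(3)]
albedo_map, normal_map = photometric_stereo(frames, lights)
```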

The system is based on a custom-made optoelectronics module that attaches to the host device and, together with the proprietary software, forms the 3-D microscope. The technology can be implemented in mobile phones or tablets.

Improvements in microchip technology and new scientific advances have enabled scientists at the University of Texas at Dallas to tap into the terahertz range, an advance that could lead to cellphones able to see through walls, wood, plastics, paper and other opaque materials, the researchers announced in the spring of 2012.

“We’ve created approaches that open a previously untapped portion of the electromagnetic spectrum for consumer use and lifesaving medical applications,” said UT Dallas electrical engineering professor Dr. Kenneth O. “The terahertz range is full of unlimited potential that could benefit us all.”

Using the new approach, images can be created with signals operating in the terahertz range without having to use several lenses inside a device, reducing overall size and cost.

To create their device, O and his team combined terahertz capabilities with the CMOS microchip technology used in smartphones, personal computers and other consumer electronic devices.

“CMOS is affordable and can be used to make lots of chips,” O said. “The combination of CMOS and terahertz means you could put this chip and a transmitter on the back of a cellphone, turning it into a device carried in your pocket that can see through objects.” (Due to privacy concerns, they are focused on uses at a distance of less than 4 in.)

Consumer applications of such technology, they say, could range from document authentication and counterfeit detection to process control in manufacturing.

The research was presented at the 2012 International Solid-State Circuits Conference in February. The team will work next to build an entire working imaging system based on the CMOS terahertz system.

A sugar-cube-sized spectrometer developed at the Fraunhofer Institute for Photonic Microsystems (IPMS) in Dresden, Germany, can be installed in smartphones and used in grocery stores to determine the ripeness of a piece of fruit or the tenderness of a slab of meat.

The application is based on a near-infrared spectrometer that measures the amount of water, sugar, starch, fat and protein present in products by shining a broad-bandwidth light onto the item in question and “looking” several centimeters below the surface. The item reflects various wavelengths of near-infrared light with different intensities, resulting in a spectrum that tells scientists what amounts of which substances are present in the foodstuff. For instance, it can detect whether the core of an apple is already rotting.
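How the Fraunhofer system turns a spectrum into constituent amounts is not specified, but a common approach is to unmix the measured spectrum against reference spectra of the pure constituents (or to use a multivariate calibration such as partial least squares). A hypothetical least-squares unmixing sketch in Python, with made-up reference spectra:

```python
# Hypothetical sketch: estimating constituent amounts (water, sugar, fat, ...)
# from a measured NIR spectrum by least-squares unmixing against known
# reference spectra of the pure constituents. Illustrative only; the actual
# Fraunhofer analysis method is not described in the article.
import numpy as np

def unmix(measured_spectrum, component_spectra):
    """Solve measured ~ component_spectra @ amounts in the least-squares sense.
    component_spectra: (n_wavelengths, n_components)."""
    amounts, *_ = np.linalg.lstsq(component_spectra, measured_spectrum, rcond=None)
    return np.clip(amounts, 0.0, None)   # physical amounts are non-negative

# Made-up reference spectra for three constituents over 16 wavelength channels
rng = np.random.default_rng(1)
components = np.abs(rng.normal(size=(16, 3)))      # columns: water, sugar, fat
true_amounts = np.array([0.7, 0.2, 0.1])
measured = components @ true_amounts + rng.normal(scale=0.01, size=16)

print(unmix(measured, components))    # approximately [0.7, 0.2, 0.1]
```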

“We expect spectrometers to develop in the same way that digital cameras did,” said Dr. Heinrich Grüger, who manages the business unit of Fraunhofer IPMS, where the system is being developed. “A camera that cost €500 ten years ago is far less capable than the ones you get virtually for free today in your cellphone.”

Conventional spectrometers are manufactured by assembling individual components: The mirrors, optical gaps, grating and detector each have to be put into place individually and properly aligned.

The Fraunhofer researchers instead manufacture the individual optical gaps and gratings directly onto silicon wafers, each large enough to hold the components of several hundred spectrometers, enabling hundreds of near-infrared systems to be created at one time. They stack the wafers containing the integrated components on top of those bearing the optical components, align and bond them, and then separate the stack into individual spectrometers. The resulting devices are more robust than their handmade counterparts.

Aside from inspecting food, the researchers are confident that the spectrometer could be used for forgery detection or to test the contents of cosmetic creams and drugs.

A prototype of the spectrometer was on display in May at Sensor + Test 2012 in Nuremberg. The scientists are working on a corresponding infrastructure for the device and say it could be ready for market launch within the next three to five years.

In November, Virginia Tech’s Arlington Innovation Center: Health Research announced that it had been awarded a $2.2 million contract from the US Army Medical Research and Materiel Command to develop the MedicPhone, a field smartphone for military medics.

The three-year research and development venture has two partners: Seoul National University of South Korea, for Android expertise, and Starix Technology of Irvine, Calif., for ultrawideband communications interfaces between the phone and field medical devices. The project is expected to attract additional partners in the near future.

“We’ll be starting with state-of-the-art off-the-shelf components because there is just so much tech in the mobile phone market now, there’s no reason to duplicate that effort,” said Kenneth Wong, the principal investigator of the MedicPhone project.

The product Wong describes will likely be slightly larger and more robust than a cellphone and will enable military medics to monitor patients without the equipment usually associated with an intensive-care unit. The idea is to create a handheld device that can collect information on a patient’s condition from a variety of sources and provide medical teams with access to this data in a single, portable display. This feature will allow medics to monitor a patient from another location – an important consideration when medics are charged with multiple patients in austere or even hazardous environments.

“Medics carry a lot of equipment already, so anything we can do to reduce weight and size is good,” Wong said. “The idea of a phone is a useful platform – there is a lot of computer power in a small space, and it’s a familiar technology, so when people pick this up, they’ll be able to use it quickly.”

The team envisions that the phone will evolve into a hands-free mobile communication hub for military medics as well as civilian emergency responders.

A new kind of polymer that is cheap and easy to make reflects many wavelengths of light when viewed from a single perspective and could form the basis of handheld multispectral imaging devices, say researchers at the University at Buffalo in New York.

Engineers at the university developed a one-step method to fabricate the polymer, which will make it feasible to develop small devices that connect with cellphones to conduct multispectral imaging, said Qiaoqiang Gan, assistant professor of electrical engineering.


A rainbow-colored grating, about 25 mm wide, under sunlight. Enlarged microscope images show the graded surface, with the black bars indicating a length of 10 µm. The polymer could form the basis of handheld multispectral imaging devices that connect to cellphones.


“Our method is pretty low-cost, and because of this and the potential cellphone applications, we feel there is a huge market for improving clinical imaging in developing countries,” Gan said.

The single-step fabrication method – shining a laser light through a curved lens – is affordable and relatively simple.

The filter is about 25 mm long, but the technique the researchers used is scalable: It’s possible to create filters of different sizes by shining the laser through lenses of different sizes.

The researchers now are working to improve the quality of the rainbow filter and are beginning to explore ideas for incorporating the technology into handheld devices.



The mainstreaming of mHealth, or mobile health:

mHealth apps are forecast to reach 142 million downloads globally by 2016, according to Juniper Research. Here are a few more statistics, compiled by Float Mobile Learning:

• There are more than 10,000 medical and health care apps available in Apple’s iTunes Store.

• 80 percent of doctors use smartphones and medical apps.

• 56 percent of doctors who use mobile devices say they expedite decision making.

• 40 percent of doctors believe that mobile health technologies can reduce the number of office visits, which totaled 1 billion in 2011 in the US alone.

• 78 percent of US consumers are interested in mobile health solutions.

• 88 percent of doctors would like their patients to monitor their health at home, particularly their weight, blood sugar and vital signs.

Published: January 2013
