Large-Area Color Sensor Array Has Vertical Structure
The future of optical sensors may be looking up as researchers investigate vertically integrated color imaging devices. A team from Forschungszentrum Jülich in Germany and Palo Alto Research Center in California has integrated amorphous silicon readout electronics with a vertically integrated sensor to make a large-area, thin-film, three-color sensor array. The technique could be used for such applications as lab-on-a-chip optical readout.
Today’s sensors capture color by placing red, green and blue filters over separate sensing elements. Three locations are needed for each full-color pixel, thereby limiting the resolution. Vertically integrated sensor structures are seen as a potential solution.
The vertically integrated color sensor was constructed on a glass substrate (top). The sensor produced a 185 × 260-pixel image without using color filters on the array (right). Courtesy of Palo Alto Research Center. ©2006, American Institute of Physics.
“As it is getting more difficult to put smaller color filter arrays on a sensor chip, people will sooner or later move to vertically integrated sensors,” said Dietmar Knipp, who was a member of the research team at Palo Alto and now is an assistant professor of electrical engineering at International University Bremen in Germany.
Santa Clara, Calif.-based Foveon Inc. produces a sensor in which a CMOS process is used to vertically integrate three diodes (see “Direct Image Sensor Tackles Color Concerns,” Photonics Spectra, November 2003, page 99). The principle of operation for that image sensor and the new one is the same. Both make use of the different penetration depths of wavelengths of incoming light to assign each photon at a pixel to red, green or blue without the need for a filter.
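The penetration-depth principle can be illustrated with the Beer-Lambert law, I(z) = I₀·exp(−αz): shorter (blue) wavelengths are absorbed nearer the surface than longer (red) ones, so stacked diodes at different depths naturally separate colors. The sketch below uses rough, assumed absorption coefficients for amorphous silicon, for illustration only; the actual device values are not given in the article.

```python
import math

# Assumed, order-of-magnitude absorption coefficients (1/cm) for
# amorphous silicon at three visible wavelengths -- illustrative only.
alpha = {"blue (450 nm)": 3e5, "green (550 nm)": 5e4, "red (650 nm)": 8e3}

def fraction_absorbed(alpha_per_cm, depth_nm):
    """Fraction of incident light absorbed within depth_nm of the surface,
    per the Beer-Lambert law I(z) = I0 * exp(-alpha * z)."""
    depth_cm = depth_nm * 1e-7  # convert nm to cm
    return 1.0 - math.exp(-alpha_per_cm * depth_cm)

for color, a in alpha.items():
    penetration_nm = 1e7 / a  # 1/alpha, expressed in nm
    print(f"{color}: penetration depth ~{penetration_nm:.0f} nm, "
          f"absorbed in top 100 nm: {fraction_absorbed(a, 100):.0%}")
```

With these assumed coefficients, most blue light is absorbed within the top ~100 nm while most red light passes far deeper, which is why the top diode in a vertical stack responds mainly to blue and the bottom mainly to red.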
PIN diode structure
Unlike the commercial process, the new research prototype was built from two back-to-back amorphous silicon PIN diodes. The amorphous silicon was deposited on 4-in. glass wafers at temperatures below 300 °C, well under typical CMOS processing temperatures. The two diodes shared a P-type layer, so the final structure was N-I-P-I-N, with different colors of photons converted to an electrical signal at different depths.
To match the color response of the human eye, the investigators tinkered with the thickness and optical bandgap in the various layers. They first optically simulated and then characterized the devices to derive an optimized device structure. They built thin-film transistors out of the amorphous silicon to construct readout electronics for a 512 × 512 array of 100-µm pixels. They then used the array for color imaging.
Knipp acknowledged that the pixels are large and that the spatial resolution, which is limited by the top diode, is low. Work is under way to improve both parameters by patterning the top P-layer. Other efforts aim to put the sensors atop CMOS readout electronics, which should be possible given the technique’s relatively low processing temperature.
“Such devices would be of particular interest for camera applications,” Knipp said.