Sensor Sees in Dim Light
Jun 2007
ROCHESTER, N.Y., June 14, 2007 -- Eastman Kodak Co. today introduced an image sensor that it said is at least twice as sensitive to light as those currently found in any digital camera.

Image sensors act as the "eye" of a digital camera by converting light into electric charge to begin the process of taking a photo. Kodak said its new sensor technology, invented by Kodak scientists John Compton and John Hamilton, provides a 2X to 4X increase in sensitivity to light (one to two photographic stops) compared to current sensor designs.
A Kodak worker in the lab. (Photos courtesy Eastman Kodak Co.)
“This represents a new generation of image sensor technology and addresses one of the great challenges facing our industry -- how to capture crisp, clear digital images in a poorly lit environment,” said Chris McNiffe, general manager of Kodak’s Image Sensor Solutions group, in a statement.

The new sensor is designed to replace industry-standard image sensors based on the "Bayer pattern," an arrangement of red, green, and blue pixels that was first developed by Kodak scientist Bryce Bayer in 1976. Today, almost all image sensors in digital cameras, camcorders, and scanners are based on Bayer's design to create a color image, with half of the sensor's pixels used to collect green light, and the remaining pixels split evenly between sensitivity to red and blue light. After exposure, software reconstructs a full color signal for each pixel in the final image.
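The half-green, quarter-red, quarter-blue split of the Bayer layout can be sketched directly. The following is a minimal illustration (not Kodak's implementation) that builds the channel labels for an RGGB Bayer tile and counts each color's share of the pixels:

```python
import numpy as np

def bayer_mask(h, w):
    """Return an (h, w) array of channel labels for an RGGB Bayer mosaic.

    Each sensor pixel sits behind one color filter: 'G' covers half
    the pixels, 'R' and 'B' a quarter each.
    """
    mask = np.empty((h, w), dtype="<U1")
    mask[0::2, 0::2] = "R"   # red on even rows, even columns
    mask[0::2, 1::2] = "G"   # green appears on every row
    mask[1::2, 0::2] = "G"
    mask[1::2, 1::2] = "B"   # blue on odd rows, odd columns
    return mask

mask = bayer_mask(4, 4)
counts = {c: int((mask == c).sum()) for c in "RGB"}
print(counts)  # {'R': 4, 'G': 8, 'B': 4} -> half green, quarter red, quarter blue
```

The "software reconstructs a full color signal" step the article mentions is demosaicing: interpolating the two missing channels at each pixel from its neighbors.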

Kodak's new technology adds panchromatic, or "white light" pixels to the red, green, and blue elements that form the image sensor array. Since these pixels are sensitive to all wavelengths of visible light, they collect a significantly higher proportion of the light striking the sensor. By matching these pixel arrangements with advanced software algorithms that are optimized for these new patterns, Kodak said, users can realize an increase in photographic speed, which improves performance when taking pictures under low light.
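The sensitivity gain follows from simple bookkeeping of how much light each filter passes. The sketch below is illustrative only: the 1/3 transmission for color filters and the specific half-panchromatic tile are assumptions for the sake of arithmetic, not Kodak's published pattern or figures.

```python
import math

# Rough, assumed transmission fractions: a color filter passes about a third
# of visible light; a panchromatic ("P") pixel has no color filter.
FILTER_PASS = {"R": 1/3, "G": 1/3, "B": 1/3, "P": 1.0}

def mean_collection(pattern):
    """Average fraction of incident visible light a pixel pattern collects."""
    return sum(FILTER_PASS[c] for c in pattern) / len(pattern)

bayer = list("RGGB")              # classic 2x2 Bayer tile
rgbw = list("PRPGPGPB")           # hypothetical tile: half panchromatic pixels

gain = mean_collection(rgbw) / mean_collection(bayer)
print(gain, math.log2(gain))  # 2x more light collected, i.e. about one stop
```

Under these toy numbers the half-panchromatic tile collects twice the light of a Bayer tile, which is the one-stop end of the 2X-to-4X range Kodak quotes.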

"One way that helps to think about this is to look at it in terms of luminance and chrominance. In the original Bayer design, the green pixels are used to recover most of the luminance information from the image. Now, we are using panchromatic pixels -- which are more sensitive than green pixels, because none of the photons get filtered out or wasted -- to act as the luminance. This gives us a more sensitive luminance channel in the final image, which raises the sensitivity of the entire sensor," Hamilton said in an entry to Kodak's blog, "A Thousand Nerds."

A comparison of digital camera image sensor capabilities in low-light situations. The photo on the left was taken with current technology (Bayer pattern); the one on the right with Kodak's new image sensor. Both photos, cropped from the original images, were taken at 1/10 sec and ISO 1000.
"In a low-light situation, these new patterns will produce a lot less color noise than a Bayer pattern sensor. You can run the shutter faster, which gets rid of a lot of motion artifacts. It will cut down on camera shake or, if you're taking a picture of a moving object there will be less blur," Hamilton said in the blog.

The technology also can be used with both CCD and CMOS image sensors, Kodak said.

"Samples of the first sensor with this technology should be available in the first quarter of 2008. Once that is available, some additional time will be needed by camera manufacturers to design, develop and manufacture a camera using this sensor. So we're hoping it's not too much longer after that," said Compton in the blog.

Kodak said it is currently developing CMOS sensors using the technology for consumer markets such as digital still cameras and camera phones, but that its use could be incorporated into all the company's image sensors, including products for applied imaging markets such as industrial and scientific imaging.

For more information, visit:
