
Resolution Considerations for Color Imaging Technique

Photonics Spectra
Jun 2004
Choosing the best color-sampling approach is application-dependent.

Philip Merlo, Diagnostic Instruments Inc.

Among the methods for color image creation and sampling, two popular techniques in common use are single-shot color mosaic sampling and three-shot color sampling. Each has its own advantages and disadvantages.

With single-shot color mosaic sampling, a red, green or blue color filter is applied directly onto each pixel. The filters are most commonly applied in a repeating, four-pixel element called a Bayer filter pattern. To create a color image, a single exposure is taken, resulting in a sampling of only one of the primary red, green or blue colors at each pixel location. The two unsampled colors are then interpolated from adjacent pixels that have values for the color being calculated. (Note that the other colors do not contribute anything to the color being calculated.)

To construct a full-color RGB image with this sampling method, two-thirds of the intensity values must be calculated rather than measured. Also note that the resolving element is the 2 × 2 Bayer filter pattern used to sample the image. This means that an image captured by a 2048 × 2048 sensor has an effective resolution of only 1024 × 1024.
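The arithmetic behind these numbers can be sketched as follows. This is an illustrative simulation, not any camera's firmware: a hypothetical RGGB mosaic keeps one measured primary per pixel, so only one-third of the values in the final RGB array come from measurement.

```python
import numpy as np

def bayer_sample(rgb):
    """Keep one channel per pixel according to a hypothetical RGGB mosaic."""
    h, w, _ = rgb.shape
    mosaic = np.zeros((h, w))
    mosaic[0::2, 0::2] = rgb[0::2, 0::2, 0]  # red sites
    mosaic[0::2, 1::2] = rgb[0::2, 1::2, 1]  # green sites
    mosaic[1::2, 0::2] = rgb[1::2, 0::2, 1]  # green sites
    mosaic[1::2, 1::2] = rgb[1::2, 1::2, 2]  # blue sites
    return mosaic

scene = np.ones((4, 4, 3))           # uniform white test scene
mosaic = bayer_sample(scene)
measured = mosaic.size               # 16 measured intensities
needed = scene.size                  # 48 values in the full RGB image
print(f"fraction interpolated: {1 - measured / needed:.2f}")  # prints 0.67
```

Whatever the sensor dimensions, the ratio is the same: each pixel measures one of the three required values, leaving two-thirds to be interpolated.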

To understand how this happens, imagine a perfect, tiny ray of red light that falls entirely onto one red pixel, with no other light reaching the sensor. That red pixel will record the light’s intensity very accurately, but the eight adjacent pixels, which received no red light, will each have a red value calculated for them by the interpolation. The resulting image will represent the one-pixel ray as a nine-pixel ray.

Now imagine that our red ray fell onto a blue or green pixel; the resulting image would show nothing. Other artifacts also result from this sampling method. Thin white lines and extremely bright transition edges in images can appear to have color stripes caused by sampling and interpolation errors (Figure 1).
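Both failure modes can be reproduced numerically. The sketch below uses a naive neighborhood interpolation over a hypothetical RGGB mosaic (real cameras use more elaborate demosaicing, but the spreading effect is the same):

```python
import numpy as np

# A single bright red ray hits one red site of a hypothetical RGGB mosaic.
mosaic = np.zeros((6, 6))
mosaic[2, 2] = 1.0               # red site: even row, even column

red_mask = np.zeros_like(mosaic)
red_mask[0::2, 0::2] = 1.0       # locations of the red samples

# Interpolate each pixel's red value as the average of the red samples
# inside its 3 x 3 neighborhood (a normalized box filter).
samples = np.pad(mosaic * red_mask, 1)
weights = np.pad(red_mask, 1)
red = np.zeros_like(mosaic)
for i in range(6):
    for j in range(6):
        red[i, j] = samples[i:i+3, j:j+3].sum() / weights[i:i+3, j:j+3].sum()

print(np.count_nonzero(red))     # prints 9: the 1-pixel ray became a 9-pixel ray

# Had the same ray landed on a green or blue site instead, the red
# plane would contain no signal anywhere and the ray would vanish.
```

The eight neighbors that saw no red light inherit nonzero red values purely from the interpolation, exactly as described above.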

Figure 1. A one-shot color mosaic image is shown at the pixel level. Thin white lines and extremely bright transition edges in images can appear to have color stripes caused by sampling and interpolation errors.

Three-shot color sampling

Another method of color sampling is to position a color-changing filter element in front of the image sensor, then sequentially capture a red, a green and a blue image. The three sets of image data are combined pixel by pixel to provide RGB-sampled color at each pixel location. Because each color is sampled at each pixel, the resolving element of the system is the pixel, making the stated resolution of the system equal to that of the image sensor. This means that an image captured by a 2048 × 2048 sensor maintains its 2048 × 2048 resolution. Also note that the saved file size does not change; it simply contains more measured data.
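The pixel-by-pixel combination amounts to a simple stack. A minimal sketch, assuming three aligned full-frame exposures (simulated here with random data, one per filter position):

```python
import numpy as np

h, w = 2048, 2048
red_frame = np.random.rand(h, w)     # exposure through the red filter
green_frame = np.random.rand(h, w)   # exposure through the green filter
blue_frame = np.random.rand(h, w)    # exposure through the blue filter

# Stack the three captures along a new last axis to form the RGB image.
rgb = np.stack([red_frame, green_frame, blue_frame], axis=-1)
print(rgb.shape)  # prints (2048, 2048, 3): every value measured, none interpolated
```

Every pixel location carries three measured intensities, so no interpolation step is needed and the full sensor resolution is preserved.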

A drawback to this method is that, if the scene is changing with time, the sequential capture will produce an image with red, green and blue ghosts of the subject as it moves across the scene. Another possible concern is exposure and capture time: this method triples it, so if time is an issue with a single shot, it will be even more of an issue with the three-shot method (Figure 2).

Figure 2. A three-shot color image is shown at the pixel level.

As is true with any situation, the appropriate solution depends on your needs. If you have moving samples or need high throughput, such as in live-cell imaging, single-shot color mosaic cameras would be most appropriate. If your sample is fixed and you have additional time, such as when using specimen slides in histology, pathology or geology, you may benefit from the additional resolution available from three-shot color cameras.

Meet the author

Philip Merlo is vice president of sales and marketing at Diagnostic Instruments Inc. in Sterling Heights, Mich.

Sensor Basics

To understand the elements of sensor resolution, it is important to note that the smallest element on an imaging sensor is the pixel. It collects electrons that have been kicked into its potential well by incident photons during the exposure time. It then provides the digitizing component of the imaging system with a packet of charge. In a noncolor imaging system, the resolving element is equal to the pixel (not considering optical system degradation).

An electron kicked into the well by a 550-nm photon looks no different to the digitizer than one kicked into the well by a 450-nm photon. In digital RGB color camera systems, the color of the light is determined by sampling the light intensity in three bands of the visible spectrum: red (~530 to 700 nm), green (~460 to 600 nm) and blue (~400 to 500 nm). The color RGB digital image then typically consists of an array of data with three values for each pixel location corresponding to the red, green and blue color intensities for that location.
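The quoted passbands can be expressed in a short sketch. Note that they overlap, so a single wavelength can register in more than one channel; the band limits below are the approximate figures given above, and the function itself is purely illustrative:

```python
# Approximate passbands quoted above (nm); exact limits vary by camera.
bands = {"red": (530, 700), "green": (460, 600), "blue": (400, 500)}

def responding_channels(wavelength_nm):
    """Return the channels whose passband includes this wavelength."""
    return [name for name, (lo, hi) in bands.items() if lo <= wavelength_nm <= hi]

print(responding_channels(550))  # prints ['red', 'green'] -- the bands overlap
print(responding_channels(430))  # prints ['blue']
```

The overlap is deliberate: it mimics the overlapping sensitivities of the human eye's cone cells, and the relative response across channels is what encodes color.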

