

Resolution Considerations for Color Imaging Technique

Philip Merlo, Diagnostic Instruments Inc.

Among the methods for creating and sampling color images, two techniques in common use are single-shot color mosaic sampling and three-shot color sampling. Each has its own advantages and disadvantages.

With single-shot color mosaic sampling, a red, green or blue color filter is applied directly onto each pixel. The filters are most commonly applied in a repeating four-pixel element called a Bayer filter pattern. To create a color image, a single exposure is taken, resulting in a sampling of only one of the primary red, green or blue colors at each pixel location. The two unsampled colors at each pixel are then interpolated from adjacent pixels that did sample the color being calculated; pixels of the other two colors contribute nothing to that value.
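To make the interpolation concrete, the following sketch in Python (using NumPy and SciPy) demosaics a raw RGGB Bayer frame by averaging, for each missing color, the adjacent pixels that actually sampled it. The function name and the simple neighbor-averaging scheme are illustrative only; real cameras use more sophisticated demosaicing algorithms.

import numpy as np
from scipy.ndimage import convolve

def demosaic_bilinear(raw):
    """Naive neighbor-averaging demosaic of an RGGB Bayer frame.
    Missing colors at each pixel are averaged from the adjacent pixels
    that actually sampled that color; measured values are kept as-is."""
    h, w = raw.shape
    # Which pixels physically sampled each color in an RGGB mosaic.
    masks = np.zeros((h, w, 3), dtype=bool)
    masks[0::2, 0::2, 0] = True      # red sites
    masks[0::2, 1::2, 1] = True      # green sites (red rows)
    masks[1::2, 0::2, 1] = True      # green sites (blue rows)
    masks[1::2, 1::2, 2] = True      # blue sites
    kernel = np.ones((3, 3))
    rgb = np.empty((h, w, 3))
    for c in range(3):
        sampled = np.where(masks[..., c], raw, 0.0)
        neighbor_sum = convolve(sampled, kernel, mode="mirror")
        neighbor_cnt = convolve(masks[..., c].astype(float), kernel, mode="mirror")
        # Keep measured values; fill the rest with the average of measured neighbors.
        rgb[..., c] = np.where(masks[..., c], raw, neighbor_sum / neighbor_cnt)
    return rgb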

To construct an RGB color image using this sampling method, two-thirds of the intensity values must be calculated. Also note that the resolving element is the 2 × 2 Bayer filter pattern that was used to sample the image. This means that an image captured by a 2048 × 2048 sensor actually has an effective resolution of only 1024 × 1024.
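The arithmetic behind these figures can be checked in a few lines:

h = w = 2048                     # Bayer sensor dimensions
measured = h * w                 # one color value actually sampled per pixel
needed = 3 * h * w               # values in the finished RGB image
print(f"interpolated fraction: {1 - measured / needed:.1%}")   # 66.7%
print(f"resolving elements: {h // 2} x {w // 2}")              # 1024 x 1024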

To understand how this happens, envision a perfect small ray of red light that falls entirely onto one red pixel, with no other light falling on the sensor. That red pixel will accurately record the light’s intensity, but the eight adjacent pixels, which received no red light, will each be assigned an interpolated red value. The resulting image represents the one-pixel ray as a nine-pixel ray.

Now imagine that our red ray fell onto a blue or green pixel instead; the resulting image would show essentially nothing, because the filter over that pixel blocks the red light and no red value is measured anywhere for the interpolation to draw on. Other artifacts also result from this sampling method. Thin white lines and extremely bright transition edges in images can appear to have color stripes caused by sampling and interpolation errors (Figure 1).
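This thought experiment can be reproduced numerically with the same neighbor-averaging interpolation used in the demosaic sketch above; the 8 × 8 RGGB sensor, pixel positions and intensity value below are arbitrary choices for illustration.

import numpy as np
from scipy.ndimage import convolve

# A hypothetical 8 x 8 RGGB sensor with a single one-pixel "ray" of red light.
raw = np.zeros((8, 8))
red_mask = np.zeros((8, 8), dtype=bool)
red_mask[0::2, 0::2] = True            # red sites in the RGGB mosaic
raw[4, 4] = 1000.0                     # the ray lands on a red site

kernel = np.ones((3, 3))
den = convolve(red_mask.astype(float), kernel, mode="mirror")
num = convolve(np.where(red_mask, raw, 0.0), kernel, mode="mirror")
red_plane = np.where(red_mask, raw, num / den)
print(np.count_nonzero(red_plane))     # 9 -- the one-pixel ray covers a 3 x 3 patch

# The same ray landing on a neighboring green site instead.
raw[4, 4], raw[4, 5] = 0.0, 1000.0
num = convolve(np.where(red_mask, raw, 0.0), kernel, mode="mirror")
red_plane = np.where(red_mask, raw, num / den)
print(np.count_nonzero(red_plane))     # 0 -- the red channel never records the ray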


Figure 1. A one-shot color mosaic image is shown at the pixel level. Thin white lines and extremely bright transition edges in images can appear to have color stripes caused by sampling and interpolation errors.

Three-shot color sampling

Another method of color sampling is to position a color-changing filter element in front of the image sensor and then sequentially capture a red, a green and a blue image. The three sets of image data are combined pixel by pixel to provide RGB-sampled color at each pixel location. Because each color is sampled at each pixel, the resolving element of the system is the pixel, making the stated resolution of the system equal to that of the image sensor. This means that an image captured by a 2048 × 2048 sensor maintains its 2048 × 2048 resolution. Also note that the saved file size does not change; it simply contains more measured data.
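A minimal sketch of the pixel-by-pixel combination, assuming the three registered exposures have already been acquired through the sequential filter positions (the function name is illustrative; camera and filter control are not shown):

import numpy as np

def combine_three_shot(red_frame, green_frame, blue_frame):
    """Merge three sequentially captured monochrome frames, one per filter
    position, into a full-resolution RGB image. The frames must be registered,
    i.e., the scene must not move between exposures."""
    assert red_frame.shape == green_frame.shape == blue_frame.shape
    # Every pixel location ends up with three measured values -- no interpolation,
    # so the resolving element stays equal to one sensor pixel.
    return np.stack([red_frame, green_frame, blue_frame], axis=-1)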

A drawback to this method is that, if the scene is changing with time, the sequential image capture will produce an image with red, green and blue ghosts of the subject as it moves across the scene. Another possible concern is exposure and capture time: this method roughly triples it, so if acquisition time is an issue with a single shot, it will be more of an issue with the three-shot method (Figure 2).


Figure 2. A three-shot color image is shown at the pixel level.

As is true with any situation, the appropriate solution depends on your needs. If you have moving samples or need high throughput, such as in live-cell imaging, single-shot color mosaic cameras would be most appropriate. If your sample is fixed and you have additional time, such as when using specimen slides in histology, pathology or geology, you may benefit from the additional resolution available from three-shot color cameras.

Meet the author

Philip Merlo is vice president of sales and marketing at Diagnostic Instruments Inc. in Sterling Heights, Mich.; e-mail: philmerlo@diaginc.com.



Sensor Basics

To understand sensor resolution, it is important to note that the smallest element on an imaging sensor is the pixel. It collects the electrons that incident photons kick into its potential well during the exposure time and then provides the digitizing component of the imaging system with a packet of charge. In a noncolor imaging system, the resolving element is the pixel itself (not considering optical system degradation).
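A toy numerical model of that chain is sketched below; the quantum efficiency, full-well depth, gain and bit depth are made-up example values, not the figures for any particular sensor.

import numpy as np

rng = np.random.default_rng(0)

def expose_pixels(mean_photons, qe=0.6, full_well=20_000, gain_e_per_adu=2.0, bits=12):
    """Toy model of one exposure for an array of monochrome pixels.
    Note that the model never sees the photon wavelength: an electron in the
    well looks the same to the digitizer regardless of the photon that made it."""
    electrons = rng.poisson(mean_photons * qe)        # photoelectrons collected in each well
    electrons = np.minimum(electrons, full_well)      # the potential well saturates
    adu = electrons / gain_e_per_adu                  # charge packet handed to the digitizer
    return np.clip(adu, 0, 2**bits - 1).astype(np.uint16)

counts = expose_pixels(np.full((4, 4), 5000.0))       # a uniformly illuminated patch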

An electron kicked into the well by a 550-nm photon looks no different to the digitizer than one kicked into the well by a 450-nm photon. In digital RGB color camera systems, the color of the light is determined by sampling the light intensity in three bands of the visible spectrum: red (~530 to 700 nm), green (~460 to 600 nm) and blue (~400 to 500 nm). The color RGB digital image then typically consists of an array of data with three values for each pixel location corresponding to the red, green and blue color intensities for that location.
