[Photograph removed – 21 October 2012]
The individual pixels in the sensors inside digital cameras cannot collect information from across the entire visible spectrum. To get a full colour image, it is therefore necessary to combine information from several different pixels, each of which sits behind a coloured filter that determines which part of the visible range it records.
Virtually all digital sensors rely upon the Bayer pattern, invented by Kodak. This allocates half of all pixels to the green portion of the spectrum, and a quarter each to red and blue. The process of combining the data mathematically, known as interpolation (or demosaicing), is fairly resource-intensive. It can be done either with a generic off-the-shelf processor, which is cheaper per unit but not very fast or energy-efficient, or with a custom chip, such as the DIGIC chips in Canon digital cameras.
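To make the idea concrete, here is a minimal sketch of an RGGB Bayer mosaic and a deliberately crude interpolation step. The function names and the nearest-neighbour fill are my own illustration; real cameras use far more sophisticated edge-aware algorithms.

```python
import numpy as np

def bayer_mosaic(rgb):
    """Sample a full-colour image through an RGGB Bayer filter:
    each pixel keeps only the one channel its filter passes."""
    h, w, _ = rgb.shape
    mosaic = np.zeros((h, w))
    mosaic[0::2, 0::2] = rgb[0::2, 0::2, 0]  # red
    mosaic[0::2, 1::2] = rgb[0::2, 1::2, 1]  # green
    mosaic[1::2, 0::2] = rgb[1::2, 0::2, 1]  # green (half of all pixels)
    mosaic[1::2, 1::2] = rgb[1::2, 1::2, 2]  # blue
    return mosaic

def demosaic_crude(mosaic):
    """Crude interpolation: fill each 2x2 cell from its own samples,
    averaging the two green sites. Purely illustrative."""
    h, w = mosaic.shape
    r = mosaic[0::2, 0::2]
    g = (mosaic[0::2, 1::2] + mosaic[1::2, 0::2]) / 2
    b = mosaic[1::2, 1::2]
    out = np.zeros((h, w, 3))
    for c, plane in enumerate((r, g, b)):
        out[:, :, c] = np.repeat(np.repeat(plane, 2, axis=0), 2, axis=1)
    return out
```

Even this toy version shows where the cost comes from: every output pixel needs arithmetic over several of its neighbours, repeated for millions of pixels per frame.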
Today, Kodak announced a new pattern for use in CCD and CMOS sensors. The new system uses both filtered and unfiltered pixel elements, with the unfiltered ones recording brightness data from across the entire spectrum. The new interpolation algorithms then use this panchromatic data to create a luminance channel, to which colour is added using data from the filtered elements. Doing so may require much more processing power, which suggests that new custom chips will need to be designed.
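Kodak has not published the actual algorithm, but the luminance-plus-colour idea can be sketched in a toy form: take fine brightness detail from the unfiltered (panchromatic) channel and re-light the coarse chroma implied by the filtered pixels. Everything below is my own illustrative construction, not Kodak's method.

```python
import numpy as np

def combine_pan_and_colour(pan, rgb):
    """Toy combination of a sharp panchromatic luminance channel
    with (typically lower-detail) colour data from filtered pixels."""
    # Luminance implied by the colour data alone
    luma = rgb.mean(axis=2, keepdims=True)
    # Chroma: colour expressed relative to its own luminance
    chroma = rgb / np.maximum(luma, 1e-6)
    # Re-light the chroma using the sharp panchromatic brightness
    return chroma * pan[:, :, None]
```

If the panchromatic channel happens to agree with the luminance of the colour data, the colours pass through unchanged; where it is sharper or more sensitive, that detail carries into the final image.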
The benefit of the new pattern is that it will supposedly double the sensitivity of sensors, allowing for better performance in low light. Given how small and inexpensive the lenses on cheap cameras and camera phones are, this is a very important design parameter. Of course, all this constant development in digital photography makes one a bit wary of investing $1000 or more in what is available this year. Chances are, next year's offering will be rather better. For this particular technology, it will probably be necessary to wait until the first quarter of 2008.
This is being discussed on Slashdot.
Also, on the photo.net forums.
Here is an interesting demonstration of the low resolution nature of human colour vision, especially when it comes to blue light.
An alternative technology you might have heard of is a stacked photodiode developed by a company called Foveon and described here. They do away with filter patterns entirely, though there seems to be some trade-off in terms of extra noise, and they haven’t been very commercially successful so far.
Personally I think the biggest issue with modern digital cameras is dynamic range. I'd love to see someone develop a pixel with a logarithmic response, so that the camera would naturally capture an HDR shot. There's some work on this in the Engineering Department here, but apparently it's a very difficult problem.
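The appeal of a logarithmic response is easy to show numerically: a linear pixel simply clips once its well is full, while a log-response pixel (sketched here with an arbitrary compression scale of my own choosing) keeps distinguishing brightness levels across several orders of magnitude.

```python
import numpy as np

def linear_sensor(irradiance, full_well=1.0):
    """Conventional linear pixel: saturates (clips) at full well capacity,
    losing all detail in the highlights."""
    return np.clip(irradiance, 0.0, full_well)

def log_sensor(irradiance, scale=0.1):
    """Hypothetical logarithmic pixel: response compresses rather than
    clipping, so bright regions still produce distinct readings."""
    return scale * np.log1p(irradiance / scale)

# A scene spanning five orders of magnitude of brightness
scene = np.array([1e-3, 1e-2, 1e-1, 1.0, 10.0, 100.0])
lin = linear_sensor(scene)
log = log_sensor(scene)
```

The two brightest patches read identically on the linear sensor (both clipped), while the log sensor returns a strictly increasing value for every patch.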