Tuesday, December 15, 2009

This year's Nobel Prize: CCD

Getting a digital camera for Christmas? Before you fire it up to capture Uncle Wally's fateful fifth trip to the punch bowl, take a moment to picture this: You've got a genuine scientific marvel in your mitts. In fact, it took nothing less than two Nobel prizes and a revolution in physics in order for you to point and shoot.

Why? Because to take a filmless picture, your camera or camcorder relies on, um, quantum mechanics. In particular, it exploits the fact -- revealed by Albert Einstein himself -- that a beam of light, which behaves like a wave in some circumstances, acts like a bunch of separate particles in other circumstances. (If that seems infuriatingly contradictory, suck it up. It's just how we do things in this cosmos. Or go complain to the management.)

The individual particles, called photons, come in a wide range of energies. Visible-light photons carry enough energy that when they slam into something, such as a sheet of specially fabricated semiconductor material in a digital camera, they knock electrons loose, producing an electrical charge at the crash site. Explaining this phenomenon, known as the photoelectric effect, got Einstein his Nobel.
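Want to put rough numbers on that? Here's a back-of-the-envelope sketch in Python. The wavelengths and silicon's band-gap figure are standard textbook values, not anything from your camera's spec sheet; the point is just that every color of visible light packs more than enough punch:

```python
# Back-of-the-envelope: does a visible photon carry enough energy
# to free an electron in silicon?  E = h * c / wavelength.
PLANCK = 6.626e-34       # Planck's constant, joule-seconds
LIGHT_SPEED = 3.0e8      # speed of light, meters per second
EV = 1.602e-19           # joules per electron-volt
SILICON_BAND_GAP = 1.12  # energy to free an electron in silicon, eV

for name, wavelength_nm in [("blue", 450), ("green", 550), ("red", 650)]:
    energy_ev = PLANCK * LIGHT_SPEED / (wavelength_nm * 1e-9) / EV
    verdict = "frees an electron" if energy_ev > SILICON_BAND_GAP else "too weak"
    print(f"{name}: {energy_ev:.2f} eV -> {verdict}")
```

Even red light, the weakest of the bunch at about 1.9 electron-volts, clears silicon's bar with room to spare.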

In most consumer cameras, the photoelectric action takes place back behind the lens, when the light reflected from Uncle Wally hits a "charge-coupled device," or CCD. A typical CCD contains a light-sensitive semiconductor rectangle, usually smaller than a fingernail, crisscrossed by a grid of tiny channels that divide it into several million separate picture elements, or pixels.

Each pixel emits a different number of electrons, depending on how many photons struck it, and it stores those electrons in a gizmo called a capacitor, which functions like a bucket. After the exposure is over, the CCD circuitry empties the millions of pixel buckets one by one, records the amount of charge in each, and transfers the resulting mosaic to a processor that converts it into digital form -- all in a fraction of a second. Not surprisingly, Willard Boyle and George Smith, the Bell Labs researchers who invented the CCD in 1969, shared the 2009 Nobel Prize in Physics.
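That bucket brigade is simple enough to fake in a few lines. The toy model below uses a made-up 3-by-3 grid of charge counts, but the choreography -- shift a row into a serial register, then empty the register one bucket at a time -- is the real CCD trick in miniature:

```python
# Toy model of CCD readout: shift each row of collected charge into a
# serial register, then read the register out one "bucket" at a time.
charges = [            # electrons collected per pixel (made-up values)
    [120, 340, 560],
    [ 80, 900, 410],
    [230,  50, 775],
]

readout = []
while charges:
    serial_register = charges.pop(0)       # one row shifts off the array
    while serial_register:
        readout.append(serial_register.pop(0))  # empty it bucket by bucket

print(readout)  # the mosaic, now a stream of numbers for the processor
```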

Of course, if that were all that happened, you'd only have a black-and-white picture. But to photograph your gift from Aunt Myrna, who somehow found a sweater so lurid that it can be seen from space, you want color. There are a few ways to get hues you can use, and they all rely on the convenient truth that all the shades we recognize can be represented by various proportions of red, green and blue, the "RGB" of computer monitor fame.
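If you've never thought of colors as numbers, here's the idea in miniature. The names and values below are just the usual 0-to-255 conventions, picked for illustration:

```python
# Any displayable shade is a mix of red, green, and blue intensities.
colors = {
    "pure red":      (255,   0,   0),
    "yellow":        (255, 255,   0),   # red + green, no blue
    "lurid sweater": (255,  64, 200),   # heavy red and blue, a dash of green
    "middle gray":   (128, 128, 128),   # equal parts of all three
}
for name, (r, g, b) in colors.items():
    print(f"{name:14s} R={r:3d} G={g:3d} B={b:3d}")
```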

Unless you've got a high-end camcorder, your gear probably has a single CCD whose grid is covered by an exactly matching grid of color filters arranged in a repeating pattern called a Bayer filter. For every two-by-two set of four pixels, one is covered by a blue filter, one by a red filter and two (at opposite corners) by green filters. Doubling up on green is needed because the human eye evolved to be disproportionately sensitive to that color, which sits right in the middle of the sun's visible spectrum.
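Two lines of modular arithmetic are enough to generate that layout. This sketch prints which filter sits over each pixel; exactly which corners the colors land on varies from sensor to sensor, so take this arrangement as one representative example:

```python
# A Bayer filter layout: greens on one diagonal of every 2x2 block,
# red and blue on the other.
def bayer_filter(row, col):
    if row % 2 == 0:
        return "G" if col % 2 == 0 else "R"   # even rows: G R G R ...
    return "B" if col % 2 == 0 else "G"       # odd rows:  B G B G ...

for row in range(4):
    print(" ".join(bayer_filter(row, col) for col in range(8)))
```

Run it and you'll see the two-by-two motif the paragraph above describes: two greens facing off across each block, with one red and one blue filling the other corners.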

The CCD records the electron count on each set of four pixels, and then the camera's on-board computer compares the value of each pixel in the foursome to that of its three neighbors to calculate the "true" color of each one, a process called demosaicing. Considering that these are software-generated approximations and not actual measured colors, the accuracy is astonishing. And the range is equally impressive: customarily at least 256 levels each of R, G and B in every pixel, for a total of 16.7 million different colors.
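Real demosaicing algorithms interpolate carefully across many neighboring pixels; the sketch below is a deliberately crude stand-in that rebuilds one full-color value from a single two-by-two foursome, using invented electron counts, and then checks the 16.7-million-color arithmetic:

```python
# Crudest-possible demosaic: reconstruct one (R, G, B) value from the
# four filtered pixels of a single 2x2 Bayer block (G R / B G layout).
def demosaic_block(g1, r, b, g2):
    """Return an approximate (R, G, B) for the whole 2x2 block."""
    return (r, (g1 + g2) // 2, b)   # measured R, averaged G, measured B

# Made-up electron counts behind the four filters of one block:
print(demosaic_block(g1=180, r=200, b=90, g2=170))  # -> (200, 175, 90)

# The color-depth arithmetic from the text: 256 levels per channel.
print(256 ** 3)  # 16,777,216 -- the "16.7 million" different colors
```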

If you've got a still camera that cost more than a case of cat food, it probably has 6 million to 25 million pixels, or six to 25 megapixels in photo argot. How does that stack up to film? The finest 35-mm film in the best cameras using incomparable lenses produces images that can "resolve" (that is, show the difference between) somewhere around 90 million separate spots. A lot of that detail, however, would never be noticed by the human eye unless the photo was blown up to drive-in movie dimensions. A reasonable benchmark is that a good film picture is equivalent to about 20 megapixels.
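The megapixel arithmetic itself is nothing fancier than width times height. The frame sizes below are representative examples, not measurements -- the "film" rectangle is simply one that works out to roughly 90 million spots:

```python
# Megapixels are just width x height. A few representative frames:
frames = {
    "pocket camera":    (3072, 2048),
    "enthusiast DSLR":  (6048, 4032),
    "fine 35-mm film":  (11600, 7750),   # ~90 million resolvable spots
}
for name, (w, h) in frames.items():
    print(f"{name}: {w * h / 1e6:.1f} million spots")
```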

But the megapixel mania used to market digital cameras can be awfully misleading, especially in the case of the pocket-size models. For one thing, if you don't have a good enough lens or a CCD sophisticated enough to capture fine differences in contrast and tone, it doesn't matter how many megapixels you've supposedly got. You'll just get a more expensive blur.

For another, most people don't enlarge their photos to the point at which the difference between six and 10, or 12 and 16, megapixels is important. And if you pass your pictures around on the Internet, they probably won't display at much over 100 dots per inch anyway -- about one-third the resolution of an ordinary print. For most folks, gross pixel count is more about self-image than photographic image.
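To see why, run the arithmetic yourself; the frame size here is a hypothetical 10-megapixel example:

```python
# How big does a photo render at screen vs. print resolution?
width, height = 3872, 2592   # a ~10-megapixel frame (hypothetical)
for medium, dpi in [("web display", 100), ("ordinary print", 300)]:
    print(f"{medium}: {width / dpi:.0f} x {height / dpi:.0f} inches")
```

Even a modest 10 megapixels overflows any ordinary screen at web resolution, which is why the extra pixels mostly go unseen. But who needs an ego boost when you've mastered quantum mechanics?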
