Editor’s note: Neal Sumerlin is a retired chemist and astronomer, neurotypical ally, and guest contributor who has been invited to share his blog at The Aspergian since so many autistics have a special interest in the cosmos. There is an article about his journey with site founder Terra Vance featured here. You can read more of Neal’s “Starstruck” blogs here.
One of my very favorite websites is NASA’s Astronomy Picture of the Day, showcasing gorgeous images ranging from the Earth at night to a “baby picture” of the universe at 380,000 years of age. Each week, viewers are given the opportunity to vote for the APOW (Astronomy Picture of the Week), and if the candidates include an emission nebula, it very often wins in a landslide. Emission nebulae are glowing clouds of gas powered by the hot, energetic stars in their midst, and the brightest and most familiar of them is the Great Orion Nebula.
After the Big Dipper, Orion is probably the most familiar grouping of stars in the sky to Northern Hemisphere observers. At this time of year, it is not fully above the eastern horizon until after midnight, but it rises a little earlier each night and is a familiar sight shortly after dark during the winter months. In a dark sky, one can easily spot the soft glow of the nebula in Orion’s sword, hanging from the three stars of his belt.
Telescopes of course give us a more detailed and close-up view, and digital cameras allow those of us without access to world-class instrumentation to share the full glory of the object.
But if you do a Google Image Search for “Orion Nebula”, you get not only a variety of orientations and angles of view (the equivalent of zooming in or out on the same object); you also get a bewildering variety of colors. Are any of these “real”? Forgive me if I remind you of a former president, but it depends on what the meaning of the word “real” is.
I am sometimes asked whether the images returned by the Hubble are what we would see if we were actually “there”: hanging in space at the appropriate location, observing with our own two human eyes. For several reasons, the answer is no.
The Limitations of the Human Eye
First of all, our eyes just aren’t that sensitive. We require more light than is generally available in deep interstellar space, and while a very bright nebula such as Orion would certainly be visible (it is, after all, visible from over 1,300 light-years away), many of the objects familiar from Hubble images would not be. Our eyes are very good at detecting motion (good for avoiding predators), which means they are not so good at building up an image of a single dim, stationary object over time, collecting enough photons to reveal the otherwise invisible.
But the main difference between that Hubble image and what your eyes would see is in the color.
Here is a quick reminder of something you may not have visited since middle school. Our eyes are sensitive only to a narrow band of the entire electromagnetic spectrum. What we call visible light ranges from violet at one end of our range of sensitivity to red at the other.
And it’s even more complicated than that. The color detectors in our retinas are of three types, sensitive to wavelengths centered roughly on blue, green and red. They do not respond with equal sensitivity to all wavelengths of light, and their areas of sensitivity are not equally spaced from each other. All of this means that while our eyes are remarkably versatile detection devices, they have some limitations as scientific instruments!
Enter the digital camera, or more precisely, the CCD (charge-coupled device) camera. A CCD is simply a light-sensitive silicon chip that can turn light into electricity, and create an image in the form of digital information. It is at the heart of everything from your point-and-shoot digital camera to your cell phone camera to the webcam you use to visit with your distant friends over the internet. The CCD actually has a grid of tiny picture elements (pixels), and the more of these there are, the finer the detail in the final image.
How CCD Cameras Take Color Images
A CCD is not color-sensitive. Although it responds to some wavelengths more strongly than others, it is essentially similar to black-and-white film, simply recording the intensity of the visible light falling on it. Your digital camera creates color by putting a tiny color filter over each pixel so that it preferentially detects one of three colors. The camera then adds the resulting three images together to create color. The disadvantage is that your resolution (the fineness of detail, which depends on the number of pixels) is only one-third of what it would be otherwise. But, given the ever-falling cost of CCD chips, this is much cheaper than the alternatives.
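To make the per-pixel filter idea concrete, here is a tiny sketch with made-up numbers. (The repeating filter pattern below is a simplification of my own; real cameras use a Bayer mosaic, which has twice as many green sites as red or blue.)

```python
import numpy as np

# A tiny 3x3 sensor. The chip itself just records intensity at each pixel;
# the color filter sitting over each pixel decides which color that reading
# represents. This diagonal R/G/B pattern is purely illustrative.
sensor = np.arange(9, dtype=float).reshape(3, 3)  # raw recorded intensities

filters = np.array([['R', 'G', 'B'],
                    ['G', 'B', 'R'],
                    ['B', 'R', 'G']])

# Each color channel only has readings where its filter happens to sit,
# so each channel samples just one-third of the pixels.
red_samples = sensor[filters == 'R']
print(red_samples.size, 'of', sensor.size, 'pixels recorded red light')  # 3 of 9
```

This is why the one-shot color approach trades away resolution: every pixel is spent on only one of the three colors.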
How Astronomical CCD Cameras Take Color Images
The objects imaged by telescopes are not changing! Or at least they don’t appear to change from our perspective in any short period of time. So we can take an image now, and then another image a few minutes later, and be confident that the object will not have changed in between.
This lets us create color images in a way that takes full advantage of the pixel count of our detector. If we wished to create something close to what our eyes would see if they were sensitive enough, we would take one image each through red, green and blue filters, then combine these to create a color image. Four-color magazine printing (which adds black) works in much the same way.
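As a rough sketch of that combining step (the pixel values here are invented, not real nebula data), stacking three full-resolution monochrome exposures into a color image looks like this:

```python
import numpy as np

# Three hypothetical monochrome exposures of the same unchanging object,
# taken minutes apart through red, green, and blue filters.
# Each is a 2-D array of pixel intensities; here the data is synthetic.
height, width = 4, 4
red_exposure = np.full((height, width), 0.8)    # e.g. a strongly red-glowing nebula
green_exposure = np.full((height, width), 0.3)
blue_exposure = np.full((height, width), 0.1)

# Stack the three frames as color channels of one RGB image.
# Every pixel keeps its full resolution, unlike a one-shot color camera.
rgb = np.dstack([red_exposure, green_exposure, blue_exposure])

print(rgb.shape)  # (4, 4, 3): height x width x three color channels
```

Because the object does not change between exposures, nothing is lost by taking the three frames at different times.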
So Do the Hubble Images Show “Real” Color?
Let me rephrase the question: Do the Hubble images show visual color (what we would see with our own eyes)? No. Do they show real color? Well, they aren’t making up any information that isn’t there to start with! It’s how the information is processed that makes the difference.
The filters that the Hubble uses are not just colored pieces of glass. Rather than passing a wide range of wavelengths centered around red, for example, they pass only a very narrow band centered on very specific wavelengths. This allows us to reject a great deal of stray light from a variety of objects (sunlight reflecting off the moon or the Earth) and create much crisper images.
Most of what is in the universe is hydrogen, and so most of what glows is glowing hydrogen. Hydrogen emits light of several different and very specific wavelengths, but what is called hydrogen-alpha emission dominates. This is a deep red color, and many images of nebulae are taken solely through this filter, then rendered as red images. Here, for example, is an image of the Eagle Nebula taken through only a clear filter and a hydrogen-alpha filter.
And here is an image of the same Eagle Nebula taken with what is called the “Hubble Palette”. This image uses three filters. A hydrogen-alpha filter passes the red light emitted by hydrogen atoms, but that light is rendered green here. An SII filter passes light emitted by ionized sulfur atoms, light that is an even deeper red than hydrogen-alpha; here it forms the red part of the image. Finally, an OIII filter passes light emitted by doubly ionized oxygen atoms. This light is green, but here it is rendered as blue.
It makes for a beautiful image, doesn’t it? But the reason for using these specific filters is both practical and scientific. Emission from different atoms gives us information about different parts of the nebula, and rendering them as very distinct colors emphasizes those differences. And these emission lines are fairly bright. Hydrogen is by far the most abundant element present in these clouds, but even elements present in far lower abundances can shine relatively brightly.
Are there alternative palettes? Sure there are; you could make up your own if you wished! One of the more popular is the so-called CFHT palette (named for the Canada-France-Hawaii Telescope), because it more closely resembles older images created on photographic film. CFHT uses hydrogen-alpha for red, OIII for green, and SII for blue. Here is the Eagle Nebula rendered via the CFHT palette.
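To see how little separates the two palettes, here is a small sketch (again with made-up pixel values): the very same three narrowband exposures simply get assigned to different color channels.

```python
import numpy as np

# Made-up narrowband exposures of the same nebula (2-D intensity arrays).
h_alpha = np.full((2, 2), 0.9)  # hydrogen-alpha (deep red light)
s_ii = np.full((2, 2), 0.2)     # ionized sulfur (an even deeper red)
o_iii = np.full((2, 2), 0.5)    # doubly ionized oxygen (green light)

# Hubble palette: SII -> red, hydrogen-alpha -> green, OIII -> blue.
hubble = np.dstack([s_ii, h_alpha, o_iii])

# CFHT palette: hydrogen-alpha -> red, OIII -> green, SII -> blue.
cfht = np.dstack([h_alpha, o_iii, s_ii])

# Same data, same object; only the channel assignment differs.
print(hubble[0, 0])  # [0.2 0.9 0.5]
print(cfht[0, 0])    # [0.9 0.5 0.2]
```

The underlying information is identical in both; the choice of palette only changes how that information is presented to the eye.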
All of these are simply different ways of depicting the same object. All of them are “real”. All of them convey information, albeit in different forms. And to my mind at least, they are all beautiful!