SJAA Ephemeris July 2009


FITS and Starts

Paul Kohlmiller


“The camera’s temperature? Color is cheaper? A bias is a good thing? Can any of this be true?”


This image is a 90-second exposure of M51 taken May 16, 2009. The telescope is a Meade LX200 Classic and the imager is an SBIG ST-7XE. The left half of the picture shows the image before using the “dark” exposure to subtract the noise. The camera was cooled to -5 Celsius. While this picture isn’t going to put the Hubble Space Telescope out of business, a few things are worth noting. First, M51’s companion, NGC 5195, has a bar structure that might be near vertical in this orientation. M51 has two arms, but just above the galactic core one of them is partially bifurcated. To the left and right of the core, between the two arms, are structures sometimes called “feathers,” which are associated with star-forming regions. The foreground stars are not crisp, point-like objects; this is caused by inaccuracies in the telescope’s tracking. Stacking multiple images might improve the quality of the final image. Photo courtesy of the author.


Astrophotography often requires more work than simply taking an image and printing it. Unless you are photographing one of the brighter objects in our solar system, the target is relatively dim, so the exposure time is measured in tens of seconds, minutes, or more. This creates two problems. First, the camera, probably using a CCD imaging chip, has to be very sensitive, and increasing sensitivity often increases noise. If you are old enough to remember vinyl records, recall what happened when you turned up the volume - the sound was louder, but so were the scratches and bumps. Second, the longer exposure creates more opportunities for things like cosmic rays to leave artifacts in the image.

Most solutions to these problems involve taking even more exposures. These extra images are called bias, flat, or dark images. A bias image is a readout of the CCD after a shutter-closed exposure of zero seconds; it is sometimes called a zero image. A flat field is an exposure of a plain, evenly illuminated white surface, sometimes the inside roof of a dome. No dome? Then drape a towel (one without a pattern) over the scope. The flat gives a method for removing sensitivity differences that might occur between one pixel and another. Since a filter can introduce differences, flats must be taken for each filter used. A dark is taken with the shutter closed (like a bias image) but for the same length of time as the real image, so it records both the bias signal and the thermal noise that builds up during the exposure. (To see the impact of using a dark exposure, see the accompanying picture.) Bias, flat, and dark images are all used to process the science images - a procedure that is often called reduction. In one recent data reduction, a number of bias, flat, and science images totaling 100 MB of data were used to create a 100 KB JPEG file, so the word “reduction” is apt.
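The reduction arithmetic itself is simple. Here is a minimal sketch in Python with NumPy - the tiny 2x2 arrays and their values are invented for illustration, and real reduction software combines many bias, dark, and flat frames rather than using one of each:

```python
import numpy as np

# Toy 2x2 "exposures" standing in for real CCD frames (all values invented).
bias = np.array([[100.0, 100.0], [101.0, 100.0]])      # zero-second, shutter-closed readout
dark = np.array([[110.0, 112.0], [111.0, 110.0]])      # shutter closed, full exposure time
flat = np.array([[2000.0, 1900.0], [2100.0, 2000.0]])  # evenly illuminated surface
science = np.array([[500.0, 480.0], [530.0, 505.0]])   # the actual image

# The dark already contains the bias signal, so subtracting it from the
# science frame removes both. The flat is bias-subtracted, normalized to
# its mean, and divided out to even up pixel-to-pixel sensitivity.
flat_corr = flat - bias
flat_norm = flat_corr / flat_corr.mean()
calibrated = (science - dark) / flat_norm
print(calibrated)
```

Pipelines differ on exactly how these frames are averaged and combined, but the subtract-then-divide shape of the calculation is the same.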

Astronomical images often use a file format called FITS - Flexible Image Transport System. A FITS file is usually very large because the image data is stored uncompressed. In one experiment, the same image saved as a bit-mapped (.bmp) file was half the size of the FITS file, and a JPEG (.jpg) version with 20% compression was one eighth the size. In addition to the image data itself, the FITS file includes a lot of other information such as the length of the exposure, the filter used, and even the temperature of the camera. So far I’ve found about 60 attributes saved in the FITS file.
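Those attributes live in the FITS header, which is plain ASCII: a sequence of 80-character “cards” with the keyword in the first eight columns and the value after an equals sign. A minimal sketch of reading such cards with only the Python standard library - the three cards below are invented examples, and in practice a FITS library does this parsing for you:

```python
# Three example 80-character FITS header cards (contents invented for illustration).
cards = [
    "EXPTIME =                 90.0 / exposure time in seconds".ljust(80),
    "FILTER  = 'R       '           / filter used".ljust(80),
    "CCD-TEMP=                 -5.0 / camera temperature in Celsius".ljust(80),
]

def parse_card(card):
    """Split one header card into (keyword, value), ignoring the / comment."""
    keyword = card[:8].strip()           # keyword occupies columns 1-8
    value = card[10:].split("/")[0]      # value follows "= ", comment follows "/"
    value = value.strip().strip("'").strip()
    return keyword, value

header = dict(parse_card(c) for c in cards)
print(header["EXPTIME"], header["CCD-TEMP"])
```

This is only a sketch: the real FITS standard has more card types (COMMENT, HISTORY, END, continued strings) that a hand-rolled parser would also need to handle.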

Wait a minute - the temperature of the camera? Another method for reducing the amount of noise that is introduced by the electronics of the CCD imager is to cool the camera - lower temperature means less noise.

Now that you have dark, bias, and flat exposures, you just need the one actual image file, sometimes called the science image. Well, not so fast. Most astronomical CCD cameras are monochrome; a CCD that produces color images directly might actually be cheaper. Really, color is cheaper? Well, there is a price to pay for a color imager. If you have an old CRT-style TV, take a magnifying glass (a jeweler’s loupe works) and look at the dots on the screen while the TV is off (or else you might go blind). You should be able to see dots of different colors: red, green, and blue. These are the primary colors of light, also called the additive primary colors. You might be more used to the primary colors of pigment in paint or crayons, also called the subtractive primary colors: yellow, red (actually a shade of magenta), and blue (more specifically, cyan). If you have an LCD TV or an LCD monitor on your computer, use the magnifying glass on a portion of the screen that appears white; you will again see the red, green, and blue dots. A color imager has to use individual pixels (“picture elements”) that are sensitive to red, green, and blue light. This means that, all other things being equal, a color imager uses three pixels to create one color pixel, so a color astronomical CCD has, in effect, only one third the number of pixels of a monochrome CCD (again, assuming everything else is the same - it rarely is). So the astronomer usually takes three images to make a color picture: one with a red filter, one with a green filter, and one with a blue filter. Since all three of these filtered images are likely to be dimmer than an unfiltered image, an unfiltered image is also taken to get the correct brightness, or luminance. These images - often many of them, aligned and stacked - are combined to create the final color image.
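Once the filtered frames are aligned, combining them is conceptually just stacking three monochrome arrays into one color array and using the unfiltered frame to set the brightness. A toy sketch in Python with NumPy - the 2x2 values are invented, and this ignores the alignment, scaling, and color-balance work real software does:

```python
import numpy as np

# Toy 2x2 monochrome frames taken through red, green and blue filters
# (all values invented for illustration).
r = np.array([[0.8, 0.2], [0.1, 0.5]])
g = np.array([[0.3, 0.7], [0.2, 0.5]])
b = np.array([[0.1, 0.4], [0.9, 0.5]])

# Stack the three aligned frames into one height x width x 3 color image.
rgb = np.dstack([r, g, b])

# Use an unfiltered "luminance" frame to set the overall brightness while
# keeping the color ratios from the filtered frames (LRGB-style combine).
lum = np.array([[0.9, 0.8], [0.7, 0.6]])
lrgb = rgb * (lum / (r + g + b))[..., np.newaxis]
print(rgb.shape)  # (2, 2, 3)
```

After this combine, each pixel’s three channels sum to the luminance value, which is the sense in which the unfiltered exposure “gets the correct brightness.”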

Of course, if a monochrome image is desired, multiple color images are not needed. Even more interesting, the filters don’t have to be red, green, and blue. Narrowband filters that preferentially pass the light emitted by nebulae can create interesting images and block most city-based light pollution as well.

There are still other things that might need to be done before the picture is complete. CCDs often have column defects, where an entire column is always hot (bright) or cold (dark). These can be corrected using a process called dithering: the telescope pointing is shifted slightly between exposures so the object moves within the frame, and when the images are aligned and stacked the defect falls on different parts of the scene and is effectively masked.
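A median combine across the dithered, re-aligned frames is what actually hides the defect: the hot column lands on a different sky pixel in each frame, and the median rejects the outlier. A toy sketch with NumPy (the frames and values are invented, and 1-D strips stand in for full 2-D images):

```python
import numpy as np

# Three aligned 1-D "frames" of the same strip of sky (values invented).
# Because the exposures were dithered and then re-aligned, the hot column
# contaminates a different frame's copy of each sky pixel, not all of them.
frames = np.array([
    [10.0, 12.0, 999.0, 11.0],  # hot pixel at index 2 in this frame only
    [10.0, 12.0,  13.0, 11.0],
    [11.0, 12.0,  13.0, 10.0],
])

# The pixel-by-pixel median ignores the single outlier in each column.
stacked = np.median(frames, axis=0)
print(stacked)  # [10. 12. 13. 11.]
```

A plain average would smear the 999 into the result; the median discards it, which is why median (or sigma-clipped) combines are the usual choice for stacking dithered exposures.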

