Monday, March 15, 2010

High Resolution means a Higher Price

There is a theoretical limit to how much detail the light falling on an image chip can carry: the resolution that can be provided by the optics. Advanced imagers will often quote specifications that are perhaps a little difficult for beginners like me to comprehend, such as resolution in arcseconds. They will talk about how much an image chip could be oversampling. In other words, in the digital realm there are only so many dots of resolution being painted onto the image chip, and if the chip is sampling those dots more densely than the optics can actually deliver, it's "oversampling". You can reduce the number of dots, or light buckets, you are getting by grouping them with a technique called "binning". This reduces the resolution of the image chip but increases the sensitivity to light, because it's summing perhaps 4 sensor pixels and grouping their output, creating a larger virtual pixel or light bucket. Is that good or bad? Compared to adjusting the gain on a camera like a common ordinary Canon EOS, it would seem to be a little of both. Binning is good because it's increasing the signal without adding the noise that comes with turning up the gain. It's bad because it's reducing the resolution of the sample. But is that bad?
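Here's a rough sketch in Python (with NumPy) of what a 2 by 2 bin does, at least in software terms. The random "sensor" data is just a stand-in I made up; a real CCD bins in hardware, on the chip, before readout:

import numpy as np

def bin2x2(image):
    """Sum each 2x2 block of pixels into one larger virtual pixel."""
    h, w = image.shape
    # Trim any odd row/column so the image divides evenly into 2x2 blocks.
    image = image[:h - h % 2, :w - w % 2]
    # Reshape into 2x2 blocks and sum them: roughly 4x the signal per
    # output pixel, at half the resolution in each direction.
    return image.reshape(h // 2, 2, w // 2, 2).sum(axis=(1, 3))

sensor = np.random.poisson(lam=10, size=(480, 640)).astype(np.float64)
binned = bin2x2(sensor)
print(sensor.shape, "->", binned.shape)    # (480, 640) -> (240, 320)
print(sensor.mean(), "->", binned.mean())  # mean signal roughly quadruples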

HIGH RES IS BETTER
One of the more experienced imagers in the group at the last meeting brought up the fact that if you zoom in on an image to make a print, it's usually better to have more detail, and to get more detail you're better off starting with a higher resolution sample. But it can become a case of experimenting and determining whether the higher resolution is actually useful and helps your images, displays, prints, etc. And of course you need the digital resolution in the camera to begin with: a more expensive chip with more pixels. If you don't have the resolution to play with, you're not going to get a higher resolution sample (unless you put in more work and overlap photos, but we aren't talking about that).

It becomes more of an experimental thing, something to play with and learn from, while also taking into account what others have learned.

There are things about how humans perceive an image that allow us to fill in detail where it's missing. This happens more with moving images than with stills, but it works with stills too, so we can sometimes exploit this characteristic and reduce detail, processing and cost without a noticeable drop in quality.

In some cases things like color can be cheated on, with another part of the image, the luminance (the brightness, or black and white, component), providing the fine details. We can see this with older television systems. With the advent of color TV, color had to be added to an existing black and white system. It required a "color component" to be added to the signal, but broadcasters didn't have the broadcast space to put in all the color details. It was discovered that all the details didn't need to be added: they could "cheat" a bit, sending less color, and because of the way the display systems and our eyes worked, we'd fill in the details. This saved on broadcast space and resolution, allowing color to be carried on a subcarrier keyed by a little reference signal called the "color burst".

The color information was actually about 4 times less detailed than the black and white picture, but the average viewer never noticed. It supplied enough color detail to give our eyes an idea of what colors were in the image, and since the image was moving there was even more room for loss of detail without viewers noticing. In the same way, we might be able to reduce the red, green or blue components of an astrophotograph 4-fold and not notice the difference: the composite photograph we end up with might look the same to our eyes. Using binning we can increase the brightness of a channel and decrease its resolution, and the human eye might not notice the difference, especially on a computer display.
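As a little experiment, here's a hedged sketch in Python of that TV-style cheat applied to a photo: keep the brightness at full resolution but store the color at a quarter of the resolution, then blow it back up. It uses the Pillow imaging library, and the file name is just a hypothetical placeholder:

from PIL import Image

img = Image.open("m42.jpg").convert("YCbCr")  # split brightness from color
y, cb, cr = img.split()

# Shrink only the two color channels to 1/4 size, then stretch them back.
quarter = (img.width // 4, img.height // 4)
cb = cb.resize(quarter).resize(img.size)
cr = cr.resize(quarter).resize(img.size)

cheated = Image.merge("YCbCr", (y, cb, cr)).convert("RGB")
cheated.save("m42_quarter_color.jpg")

Viewed side by side with the original, most eyes will struggle to spot the difference, because the full resolution brightness channel is carrying all the fine detail.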

INTERLACING IN EARLIER NTSC TELEVISION
NTSC actually sent two fields of reduced resolution, called interlaced fields: a slice and dice of the image, with slight movement between each 1/60th-second field. The television display would mush these signals together, and the 60 interlaced fields per second would merge into a 30 frames per second image on the glowing phosphors of the older television tube. (I'm using round numbers of 60 and 30.) This blurring would not be noticed unless you somehow grabbed a digital still and examined it carefully. This can still happen with still captures from interlaced systems: unless the capture samples only one field of the two-field image, effectively removing half the vertical resolution, a still from an interlaced signal will likely have motion artifacts.
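For what it's worth, here's a tiny Python sketch of the fix mentioned above: grabbing only one field from an interlaced frame to avoid the motion "combing", at the cost of half the vertical resolution. The frame array here is a made-up stand-in:

import numpy as np

def single_field(frame):
    """Keep only the even scanlines of an interlaced frame."""
    return frame[::2, ...]

frame = np.zeros((480, 640, 3), dtype=np.uint8)  # stand-in interlaced frame
still = single_field(frame)
print(frame.shape, "->", still.shape)  # (480, 640, 3) -> (240, 640, 3)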

But motion is more of an issue with moving images, and interlacing will likely not cause a problem with a still image such as an astrophotograph. Besides, we don't use interlaced systems much in astrophotography. Modern systems use more of a progressive frame capture technique for grabbing video (unless you're using an NTSC video camera). A webcam would likely be giving you progressive images.

A DIGRESSION WITH DISCUSSION OF MOTION VIDEO vs FILM
In some countries they had a different system called PAL, which ran at a slower rate of 25 interlaced frames (50 fields) per second. That rate is close to traditional film movie houses, which would display 24 frames, each a full photograph projected onto the screen, separated by a moment of black in between. In movie houses that black space between the photos gave a different "feel" to the film being viewed; more like a flip book for some viewers, a slight artistic difference in perception. Of course, in big theatres, the large resolution of the film stock helped a lot as well. American TV had a different, more fluid feel for motion because it was sending more snapshots of movement in the image. At approximately 60 fields per second you'd capture a lot more fluid motion than at 24 fps in film, so a sports movement, like the throw of a football, would look smoother on a television broadcast than on a 16mm filmstrip at 24 frames per second. But this is about motion perception, and I'm getting off the subject of astronomy stills, so let's return to still photos.

REDUCING COLOR RESOLUTION, INCREASING BRIGHTNESS
In the same way, reduced color (or any single channel of color) can be captured by today's astrophotography cameras using the technique called BINNING. If you bin the RGB channels you will reduce the color resolution in your photograph and get brighter color with less time spent exposing the image. Less resolution may be in the color portions of the file, but the end product is a stacked photo with R, G, B, and a luminance channel (captured without binning). If the black and white resolution is higher, the eye might not notice the loss of resolution in the overall photograph. Or alternately you could bin the black and white (luminance) channel and use higher resolution ("unbinned") RGB. Reducing parts of the image may not affect what we finally perceive on the monitor screen, especially for web or digital displays. Once the channels are stacked together, the eye may take the detail from one of the channels and use it to fill in the missing detail from the others. Should you capture higher detail in the color channels or the luminance channel? I can't answer that question, but with some experimentation perhaps someone with the equipment could provide some samples, analyse them and come up with a conclusion. The answer will likely depend on the type of image you're shooting.
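To make the idea concrete, here's a rough sketch (my own guess at a minimal workflow, with synthetic stand-in arrays rather than real calibrated frames) of combining a full resolution luminance channel with 2x2-binned color channels in Python:

import numpy as np

def upscale2x(channel):
    """Nearest-neighbour blow-up of a 2x2-binned channel to full size."""
    return np.repeat(np.repeat(channel, 2, axis=0), 2, axis=1)

size = (1024, 1024)
lum = np.random.rand(*size)  # unbinned luminance: the fine detail
# Binned color channels come off the camera at half the resolution.
r = np.random.rand(size[0] // 2, size[1] // 2)
g = np.random.rand(size[0] // 2, size[1] // 2)
b = np.random.rand(size[0] // 2, size[1] // 2)

rgb = np.dstack([upscale2x(c) for c in (r, g, b)])
# Weight the soft color by the sharp luminance so the detail shows through.
lrgb = rgb * lum[..., np.newaxis]
print(lrgb.shape)  # (1024, 1024, 3)

Real LRGB combining in image processing programs is more sophisticated than a simple multiply, but the principle is the same: the sharp channel carries the detail and the binned channels carry the color.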

If you're shooting a star cluster or something with very little color in it, you may want more resolution in the black and white components of the image. If you're shooting dust clouds in a nebula, you may want more color resolution.

Some imagers do a digital binning of the luminance channel as well, the black and white component, to brighten it up while leaving the R, G and B channels at their original resolutions. This will brighten up the black and white portion of an image but reduce its resolution four-fold if it's a 2 by 2 bin, grouping 4 pixels into one. Depending on the sample you start with, you may not even notice a loss of detail, but you gain a lot of extra light, and faint dust and dim glows can appear in the photo. Perhaps your camera captured so much detail to begin with that a 2 by 2 bin of the black and white channel in effect delivers a brighter exposure after the fact, and more visible detail compared to almost none. Higher brightness via binning can always be added after the fact in image processing programs. Higher resolution (getting rid of binning) cannot be recovered in post.

TRADITIONAL SCANNING
In traditional imaging systems, if you're producing print, a higher sample resolution will often provide a higher quality end result. If you can sample an image with a scanner at 4x the saved image resolution, even if it's saved at a lower resolution, you will often end up with a better picture. This is a trick of the trade that publishers have learned, and it likely works in astrophotography as well. Higher resolution may be better, but it may not be noticeable unless you're doing large prints of your images. Binning may save you time in image gathering and still provide enough resolution for your online web images without anyone even noticing the loss in detail.
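If you wanted to try the publishers' trick yourself, a minimal Python sketch with the Pillow library might look like this (the file names and the 4x factor are just placeholders):

from PIL import Image

# Scan (or capture) at 4x the final size, then downsample. The
# high-quality LANCZOS filter averages many source pixels into each
# final pixel, smoothing out scanner noise.
scan = Image.open("scan_2400dpi.tif")
final = scan.resize((scan.width // 4, scan.height // 4),
                    resample=Image.LANCZOS)
final.save("print_600dpi.tif")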

Higher resolution of course means more money for the camera, and possibly less binning, which could translate into longer exposures. Longer exposures could be required to get the same brightness because you aren't binning, and that means a more expensive mount, or longer tracking times, to get the same results. This all translates into spending more money. Higher resolution means a higher price; there is no free lunch, and in the case of astrophotography you will probably be factoring that extra expense into added costs for the mechanical mount and optics of the telescope, a darker sky site location, etc.
