Regular digital cameras use color filter arrays that cover the pixel sensors so that each pixel detects only one particular color. There is therefore a trade-off: without the filter array, you would get a significantly higher-resolution black-and-white image.
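To make the trade-off concrete, here is a minimal sketch (using NumPy on a synthetic toy scene, not any real camera data) of what an RGGB Bayer-style filter array does: each sensor pixel keeps only one color channel, whereas a filter-free sensor would record the full-resolution monochrome signal.

```python
import numpy as np

# Toy "true" scene: 4x4 pixels, channels = R, G, B (illustrative values only).
rng = np.random.default_rng(0)
scene = rng.random((4, 4, 3))

# What a Bayer-filtered sensor actually records: one channel per pixel.
bayer = np.zeros(scene.shape[:2])
bayer[0::2, 0::2] = scene[0::2, 0::2, 0]  # red at even rows, even columns
bayer[0::2, 1::2] = scene[0::2, 1::2, 1]  # green
bayer[1::2, 0::2] = scene[1::2, 0::2, 1]  # green
bayer[1::2, 1::2] = scene[1::2, 1::2, 2]  # blue at odd rows, odd columns

# Without the filter array, every pixel records total brightness at full resolution.
mono = scene.mean(axis=2)
```

Recovering color from the Bayer mosaic requires interpolating the two missing channels at every pixel, which is where the resolution loss comes from.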
However, most scientific digital cameras work differently. The usual technique is instead to put a single filter over the entire image, and swap filters when a different color is required. The differently-filtered images taken in sequence can then be combined into one color image. But there is still a trade-off: adding filters significantly decreases the dynamic range of the camera. Additionally, this approach does not restrict one to the usual red, green, blue scheme; an IR filter, for example, lets one emphasize different features. A sketch of how such sequential exposures are combined follows below.
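Here is a minimal sketch of that combination step, assuming three monochrome exposures already registered to each other; the function name and the simple per-frame normalization are illustrative choices, not any mission's actual pipeline.

```python
import numpy as np

def combine_filtered_frames(red_frame, green_frame, blue_frame):
    """Stack three monochrome exposures, each taken through a different
    full-aperture filter, into a single RGB composite for display."""
    rgb = np.stack([red_frame, green_frame, blue_frame], axis=-1).astype(float)
    # Simple normalization so the composite fits the 0..1 display range.
    rgb /= rgb.max()
    return rgb

# Example usage with synthetic frames; a false-color composite could be made
# the same way by mapping, say, an IR-filtered frame to the red channel.
rng = np.random.default_rng(1)
r, g, b = (rng.random((8, 8)) for _ in range(3))
composite = combine_filtered_frames(r, g, b)
```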
It is very likely that they simply didn't bother to put color filters in the cameras. Why would they? The colors on the comet are probably rather drab anyway, and filters would unnecessarily increase the complexity of the cameras (which already form a quite complicated panoramic array).