Hi Adam
In the M83 introduction, you speak about how acquisition of LRGB data has changed over time with the introduction of CMOS chips (specifically, not binning the color data). This opens the door to another question that I would be interested in your take on; I hope it is not out of bounds because it is an acquisition rather than a processing question.

That is: with modern OSC sensors, and with PI's ability to split out color channels from those images so that they can be processed individually, what is the residual deficiency of OSC imaging versus LRGB imaging? I am assuming that there is one, since the best imagers (and companies like Telescope Live) acquire LRGB images. Is it diminished sensitivity due to the Bayer filter (if so, that could be mitigated by acquiring more data)? Or does the Bayer filter cause a decrease in resolution that cannot be recovered? I know this question is sometimes debated in various forums, but as someone who has (relatively happily) done OSC imaging, I am interested in your opinion.
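To put rough numbers on the sensitivity side of my question, here is a back-of-envelope sketch in Python. The one-third band fraction and the RGGB fill factors are idealized assumptions (equal QE, no noise terms), not measured filter curves:

```python
# Rough photon-budget comparison of an OSC channel against a mono camera
# shooting a broadband luminance frame. All numbers are idealized
# assumptions, not measured data.

# Fraction of sensor pixels devoted to each color in an RGGB Bayer matrix.
bayer_fill = {"R": 0.25, "G": 0.50, "B": 0.25}

# Assume each color band passes roughly one third of the luminance signal.
band_fraction = 1.0 / 3.0

for color, fill in bayer_fill.items():
    # OSC signal per unit exposure time, relative to a mono luminance
    # frame of the same length (which collects the full band on every pixel).
    osc_signal = fill * band_fraction
    print(f"{color}: {osc_signal:.3f}x the signal of mono L, "
          f"so ~{1.0 / osc_signal:.0f}x the integration time to match")
```

Under this same idealized model, the gap between an OSC channel and a mono camera behind the matching R, G, or B filter is only the fill factor (2-4x), so the large penalty is specifically against the broadband luminance frame.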
Thanks
Ed
Comments
Sorry, I didn’t mean to say that I shoot luminance during more optimal seeing conditions for better sampling; I do it that way to improve (in theory...) detail resolution. In real numbers, my best seeing gives me around 2.5” FWHM, and when it’s average, 3.25-3.5”. So if I shoot luminance when seeing is at its best, and R, G, and B the rest of the time, I may worsen the detail resolution of the RGB a bit, but I’ll have a luminance with better detail resolution than an RGB shot on all nights. Best of a bad situation, sort of thing.
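To put numbers on that trade, here’s a quick Python sketch of the difference between my best and average nights, treating the seeing profile as roughly Gaussian (an approximation):

```python
# Quick arithmetic behind shooting luminance on the best nights,
# assuming a roughly Gaussian seeing profile.

best_fwhm = 2.5      # arcsec FWHM, luminance on the best nights
average_fwhm = 3.25  # arcsec FWHM, RGB on average nights

# The PSF footprint scales with FWHM squared, so on an average night the
# same starlight is spread over this many times more area:
print(f"PSF area ratio: {(average_fwhm / best_fwhm) ** 2:.2f}x")  # ~1.69x

# The finest resolvable detail scales roughly with FWHM, so the luminance
# resolves features about this much finer than the RGB:
print(f"Resolution gain: {average_fwhm / best_fwhm:.2f}x")        # ~1.30x
```

Since the eye takes most of its detail from the luminance in an LRGB combine, that ~1.3x gain should carry through to the final image even with the softer RGB underneath.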
Cheers,
Scott