LRGB question

Hi Adam

In the M83 introduction, you speak about how the acquisition of LRGB data has changed over time with the introduction of CMOS chips (specifically, not binning the color data).  This opens the door to another question that I would be interested in your take on; I hope it is not out of bounds, since it is an acquisition rather than a processing question.  That is: with modern OSC sensors, and with PI's ability to split out the color channels from those images so that they can be processed individually, what is the residual deficiency of OSC imaging versus LRGB imaging?  I am assuming that there is one, since the best imagers (and companies like Telescope Live) acquire LRGB images.  Is it diminished sensitivity due to the Bayer filter (if so, that could be mitigated by acquiring more data)?  Or does the Bayer filter cause a decrease in resolution that cannot be recovered?  I know this question is sometimes debated in various forums, but as someone who has (relatively happily) done OSC imaging, I am interested in your opinion.

Thanks

Ed

Comments

  • It isn't just a bandpass issue with the Bayer filters... they generally have too great an overlap in transmission, which leads to less color contrast. So that is one. Two, as you mention, is the transmission efficiency of the filters... that hurts in a number of different ways, because people tend to take short exposures, which correlates with noise issues. The fact that the filtered elements are not adjacent to each other has other issues in addition to resolution. When you are undersampled, I think you can see how this might play a role in weird color results for stars. But take cosmic rays and hot pixels: since the result depends on which pixel the transient event lands on, it leads to all kinds of color weirdness (artifacts). VNG interpolation wreaks havoc with transient events. This means images are not as clean. With regards to interpolation, the right way to take care of the issue is to do CFA drizzle. The SPCC documentation shows how NOT doing this can directly lead to miscalibration of colors. This is a large additional overhead for the processing of images.
    I tire of writing more... bottom line: OSC comes with an extra need for attention to detail and overhead in processing (and perhaps more time in acquisition). There are some benefits to OSC compared to mono-filtered images, say for a fast-moving comet. Otherwise... I have never been convinced of the extolled virtues of OSC, because it requires MORE know-how, not less, to get good results.

    None of the above is about L-RGB imaging. Today the "L" is not really necessary. It used to be a time-saving trick by virtue of two facts: 1. There is a benefit to on-chip binning with respect to read noise. 2. You can take advantage of the fact that we perceive detail more sensitively than color, or if you do not like that statement, many astronomical objects vary in detail much more than in color. Take the Lagoon Nebula... the thing is RED, and the variations in RED are quite large compared to the variations in detail within.

    Today, #1 no longer applies: you cannot do on-chip binning with CMOS and get this benefit, so the L is obviated. #2 is still utilized in some processing sense... but not as an acquisition technique.

    -the Blockhead

  • Thank you.  Not sure I completely follow every one of your points, but the passion of the answer speaks for itself.  Could I just ask you to clarify one thing, for someone thinking about switching away from OSC?  You say that today "the 'L' is not really necessary".  Are you saying that with modern CMOS sensors, without binning, there is no advantage to acquiring L images?  Just spend all the acquisition time on R, G and B?

    Ed
  • Correct.
    There is no on-chip binning with CMOS sensors, so the time-savings benefit of acquiring L and binned RGB no longer applies. This obviates the need for L, since the RGB will be acquired at the same binning. If you acquire L data that is brighter than your RGB... you are not using your time wisely. So just acquire straight-up RGB. For an additional boost, you can combine (integrate) the R, G, and B master lights to generate an L. I show this in some videos. However, given enough frames, the improvement will be small (though you will never hurt yourself doing this).

    Think carefully about my points above and ask specific questions about what does not make sense. I am happy to address them one by one.

    -the Blockhead
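The synthetic-L idea mentioned above (integrating the R, G, and B master lights into an L) can be sketched as follows. This is a minimal illustration, assuming three registered, background-matched masters as numpy arrays; the equal weights are a simplification (integration tools typically weight channels by their measured noise).

```python
import numpy as np

def synthetic_luminance(r, g, b, weights=(1.0, 1.0, 1.0)):
    """Combine registered R, G, B master lights into a synthetic L.

    Weighted mean of the three channels; equal weights here are
    illustrative -- lower-noise channels could be weighted higher.
    """
    w = np.asarray(weights, dtype=np.float64)
    stack = np.stack([r, g, b]).astype(np.float64)
    return np.tensordot(w / w.sum(), stack, axes=1)

# Toy demonstration: averaging three noisy copies of the same flat
# signal reduces the noise standard deviation by roughly sqrt(3).
rng = np.random.default_rng(0)
signal = np.ones((100, 100))
r, g, b = (signal + rng.normal(0.0, 0.1, signal.shape) for _ in range(3))
L = synthetic_luminance(r, g, b)
print(L.std() < r.std())  # prints True: the synthetic L is less noisy than any one channel
```

As noted above, with enough frames the gain over the individual channels is modest, but it never hurts.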
  • You are a patient teacher :)
    Really the only part that confused me was the end: You say that most objects vary in detail more than in color, but then when you gave the example of the Lagoon Nebula, you said the variations in red are larger than the variations in detail.  Seems contradictory.  But either way, how does that impact the decision not to acquire luminance images?

    Ed
  • That was a bad error. I got confused by the idea of large swaths of red (meaning little change in the degree of red across large areas).

    So it used to be that by acquiring color data binned, you saved time. You could acquire slightly fuzzier color data and apply it to L images (like the Lagoon), and it would be indistinguishable from a high-resolution capture of straight-up RGB. But today there is no binning and no benefit to L in this sense. That is how it impacts the decision not to acquire: there isn't a benefit.

    The idea of applying fuzzy color data to a detailed luminance is an eye-brain effect you can look up in the literature. 

    -the Blockhead 
  • Forgive me if I have this all wrong, but I thought the main reason for binning was to achieve the correct sampling for the focal length of your telescope. I have a 12.5" RCOS RC which has a focal length of (approx) 2850 mm, and I use a QHY268M which I bin at 3x3. Am I wrong to do this?

    Ron Havelock 
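For concreteness, the sampling of a setup like Ron's can be checked with the standard plate-scale formula. A quick sketch (the 3.76 µm pixel size assumed here is the commonly quoted figure for the QHY268M's sensor; substitute your own values):

```python
def plate_scale(pixel_um: float, focal_mm: float, binning: int = 1) -> float:
    """Image scale in arcsec/pixel: 206.265 * pixel size (um) * binning / focal length (mm)."""
    return 206.265 * pixel_um * binning / focal_mm

# Assumed 3.76 um pixels on a ~2850 mm focal length RC:
print(round(plate_scale(3.76, 2850), 3))     # unbinned: 0.272 arcsec/px
print(round(plate_scale(3.76, 2850, 3), 3))  # binned 3x3: 0.816 arcsec/px
```

At roughly 0.27"/px unbinned, such a system is heavily oversampled for typical 2-3" seeing, which is the situation being discussed in this thread.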
  • FWIW, I have a similar setup, 2350FL and 268M plus poor seeing, so oversampled by a factor of 2 or more, but recently I compared processing of binned and non-binned data, and decided non-binned was better.

    Also, because my seeing is poor and variable, my strategy is to shoot Lum's when seeing is better than average, and R,G, & B when not. Theory being, the LRGB will have better resolution than an RGB shot on all nights would. The trade-off is less color data, but I'm still getting 4+ hrs per color channel, so seems like a fair deal.

    Cheers,
    Scott
  • Thanks Scott, I will give it a try later in the year when I get longer nights (53 N). People have different opinions on this, I was hoping Adam might comment but maybe not. 
  • edited July 2024
    Like FWHM, I find that people get overly concerned about this. Being oversampled will incur a small S/N hit in terms of sampling. However, hasn't everyone claimed just how low-noise CMOS sensors are? So why is there a question now? I do not understand.

    The word "correct" is particularly context-laden. What does this mean? It used to mean there was an inefficiency in spreading the light out because of the noise. Again, everyone under the moon says the noise has vanished (OK, a little hyperbole... but consider all of that talk about not even calibrating data anymore). By not spreading the light out, you certainly will reduce the file sizes, and this can be helpful. There certainly isn't an improvement in resolution by being a little oversampled or a lot. So there isn't anything wrong with matching your expected seeing with your platescale to arrive at an appropriate sampling.

    However, I find that if your stars are less than 6 pixels FWHM... it is OK, so I just do not acquire binned data at the telescope. I can always bin the data in post-processing if I want to. I like to be *optimistic*... what happens if you are taking binned data on some nights of better-than-average or better-than-expected seeing? I would prefer to be in a state of preparedness at all times. Many people would choose to set and forget. While there is much talk about correctness... I am of the mind to capture the "bestness" if possible.

    Regarding Scott's point: this "theory" is fine if there is truly a downside to oversampling in terms of noise. But I don't think this is as much of an issue any more. In addition... people are always complaining about their stars "saturating" too quickly... guess what oversampling gets you? Nicer stars! The argument today in general is that "L" really isn't necessary either. If you collect RGB... and then combine the best RGB (or all of them, weighted) to make a synthetic L... this does the same job as acquiring L+RGB at the telescope.

    Those are my thoughts.

    -the Blockhead
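The "bin in post-processing" option mentioned above can be as simple as block-averaging the image array. A minimal sketch in numpy (averaging n x n blocks; real tools may sum instead, which only changes the scaling):

```python
import numpy as np

def software_bin(img: np.ndarray, n: int = 2) -> np.ndarray:
    """Software-bin an image n x n by averaging blocks.

    Trims the edges so the dimensions divide evenly, then reshapes
    so each n x n block can be averaged in one vectorized step.
    """
    h, w = (img.shape[0] // n) * n, (img.shape[1] // n) * n
    return img[:h, :w].reshape(h // n, n, w // n, n).mean(axis=(1, 3))

img = np.arange(16, dtype=float).reshape(4, 4)
print(software_bin(img))  # 2x2 result; each value is the mean of a 2x2 block
```

Unlike on-chip binning, this gives no read-noise advantage, which is exactly why deferring the decision to post-processing costs nothing with CMOS data.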
  • Thank you for that. It does all seem very logical, and I suppose the main reason binning at the telescope is still done could well be that old habits die hard. As a relative newcomer (about 4 years) to astrophotography, I have of course taken a lot of advice from more experienced and well-regarded people in the field, but I am always eager to learn and will take this on board. Once again, thank you, and thank you for the Fundamentals; I am learning so, so much and gaining a lot of enjoyment from the results.

    Ron 
  • Sorry, I didn’t mean to say that I shoot Lum’s during more optimal seeing conditions for better sampling; I do it that way to improve (in theory...) detail resolution. In real numbers, my best seeing gives me around 2.5” FWHMs and, when average, 3.25-3.5”. So, if I shoot Lum’s when it’s best, and R, G, & B the rest of the time, I may worsen the detail resolution of the RGB a bit, but I’ll have a luminance with better detail resolution than an RGB shot on all nights. Best of a bad situation sort of thing.....

    Cheers,

    Scott
