Subframe Selector Weighting Part 4

Hi Adam,

Much like the prior person who posted, I have also avoided BPP (and the newer weighted BPP) and don't mind going through the individual steps manually (besides the feeling of control, it also serves as a teaching tool by reinforcing for me what is actually happening to the images one step at a time).  I have been using SFS with a formula like the one you showed for WBPP, but of course using SNRWeight rather than SNR, since that is what is available in SFS:

(a*(1-(FWHM-FWHMMin)/(FWHMMax-FWHMMin))
+ b*(1-(Eccentricity-EccentricityMin)/(EccentricityMax-EccentricityMin)) + c*(SNRWeight-SNRWeightMin)/(SNRWeightMax-SNRWeightMin))+P

If some of one's images have the issue you highlighted in Part 4 (i.e., the SNR weight being erroneous because imaging lower in the sky or thin clouds increases the brightness of the image), is it possible to incorporate your formula for an "improved" SNRWeight based on Stars (i.e., 1/(Math.abs(Math.min(StarsSigma+1,0))+1) * SNRWeight) into the above formula (replacing the SNRWeight portion) for a better result?  Could I just swap the "fix" formula in for the SNRWeight term, or would that mess up the syntax?  If it is possible, could you provide the syntax of the new formula?  Hope that makes sense.

On another note, an idea I have read elsewhere is to vary the constants a, b, c, and P in the SFS formula based on the type of DSO; e.g., 5, 10, 20, and 65 respectively for nebulae; 20, 15, 25, 40 for galaxies; and 35, 35, 20, 10 for star clusters (the nebula case is written out below as an illustration).  Do you have any opinion on the merits of such an approach?
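
Just to make that concrete, the nebula set would simply be the formula above with a=5, b=10, c=20 and P=65 plugged in:

(5*(1-(FWHM-FWHMMin)/(FWHMMax-FWHMMin))
+ 10*(1-(Eccentricity-EccentricityMin)/(EccentricityMax-EccentricityMin)) + 20*(SNRWeight-SNRWeightMin)/(SNRWeightMax-SNRWeightMin))+65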

Thanks.  Your videos are endlessly informative.

Ed

Comments

  • Hi Edward,

    As I said in the video... I really haven't been deep diving into using SFS consistently. That being said... everyone has opinions and I am no exception! First, regarding the formula: yes, if you wanted to incorporate my scheme to address the bad SNRWeight behavior that I noted (it might not be true for you.. you should really look and see!).... then indeed you need to multiply my modifier by the normalized SNRWeight term. What a long expression this is going to be ...so just the SNRWeight part would be:

    c*(1/(Math.abs(Math.min(StarsSigma+1,0))+1))*(SNRWeight-SNRWeightMin)/(SNRWeightMax-SNRWeightMin))+P
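
    Spliced back into your full formula (keeping your a, b, c and P exactly as you wrote them), the whole thing would read:

    (a*(1-(FWHM-FWHMMin)/(FWHMMax-FWHMMin)) + b*(1-(Eccentricity-EccentricityMin)/(EccentricityMax-EccentricityMin)) + c*(1/(Math.abs(Math.min(StarsSigma+1,0))+1))*(SNRWeight-SNRWeightMin)/(SNRWeightMax-SNRWeightMin))+P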

    My modifier is just a fraction... so it should be OK just to multiply the normalized SNRWeight term by my silly thing. Again... I cannot guarantee it is going to work..but it would be an interesting experiment! 

    As for the relative strengths of FWHM, Eccentricity and SNR(Weight)... First off... I am not a fan of Eccentricity... I think this is virtually a hogwash term. If all the stars are oblong... this term doesn't do anything for you... if a small fraction of your data has oblong stars... REJECTION will take care of things. So this leaves the FWHM and SNR terms. I can see the logic of favoring FWHM for very bright objects comprised mostly of stars or very, very bright details. So a Globular Cluster... and the Orion Nebula. I think these objects are more the exception.. so I would favor the SNR term in general. I don't have a feel for what coefficient to use... because the "P" determines the overall strength. But just between the two terms... I would go with 80% SNR-like terms to 20% FWHM. But this is me. I say... if you can't see the object you can't process it... and the resolution will not matter. A sharp, grainy/noisy image is useless.
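
    Just to make that 80/20 split concrete (a sketch only... not something I have tested), dropping the Eccentricity term it could look something like:

    80*(SNRWeight-SNRWeightMin)/(SNRWeightMax-SNRWeightMin) + 20*(1-(FWHM-FWHMMin)/(FWHMMax-FWHMMin))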

    -the Blockhead
  • I watched the SubframeSelector videos last night and this morning and really enjoyed them.

    I've always thought that the "conventional wisdom" about using FWHM, Eccentricity and SNRWeight was a bit different from what I wanted.  I really like the idea of using noise as the primary factor, and I am appreciative of the time that you spent working with various expressions.

    I have a couple of thoughts that I figured that I'd point out.

    You are using the star count as a way to determine the validity of the SNRWeight.  I'm not sure if you noticed, or if my data is not typical, but I've found the Median value to track the noise.  When I switch between the Noise and Median graphs, they are always very similar.  One of my approval steps has been to show a graph of the Median and delete any subs where the value is significantly higher than the MedianMedian.  I've not been using SNRWeight at all.
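
    As a rough approval-expression version of that manual Median check (the 1.2 factor is just a placeholder for whatever counts as "significantly higher" on your data), something like this should do the same job:

    Median <= 1.2*MedianMedian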

    Also, after watching part 4, I wondered how to create a weight expression where I could weight all of the subs where Noise is at or below the NoiseMedian as 1, and lower the weight of subs with higher than NoiseMedian, with the minimum weight being 0.5.

    To do this, I wanted to use an "if" statement to treat the result differently, based on whether the Noise value is above or below NoiseMedian.  I was not able to pass generic JavaScript through to use an iif() function, but I was able to find a workaround.

    You can use a different "if" syntax and it will work.  Specifically, you can use the form of "condition ? do-this-if-true : do-this-if-false".  Using that syntax, I was able to get the weighting that I wanted with the following expression:

    Noise <= NoiseMedian ? 1 : 1-(Noise-NoiseMin)/(NoiseMax-NoiseMin)*.5

    I would also be interested in a lesson on the importance of weighting.  I've long pursued a strategy of simply deleting any subs that don't meet my quality standards, but I end up throwing away more data than I like.  I don't think that I've ever seen a quantitative discussion on the effects of weighting on the integrated result.

    Thanks for a great presentation.
    -Wade
  • Geez...the entire reason I made that stupid formula was because I could not find a conditional statement in the Math methods. I don't know Javascript... so apparently you found the answer. That would have simplified my expression..but also would have made it too Javascripty I think. 

    Concerning the noise... median and noise will likely track with sky brightness in the sense that sky shot noise will rise with a rise in Median. Right? But unfortunately... under truly dark skies... I don't think the noise behaves as a good property to track... there a rise in Noise can go hand in hand with good signal. Something like that.

    To be honest... I usually combine just about everything I have, minus the frames I threw out with a pass of blinking. I wish noise evaluation was better... at the end of the day... if I have enough frames, the average (with rejection) wins out over any significant benefits of optimizing the weighting.

    -the Blockhead
  • Hi, I have read your comments and have learned a lot. Thanks!

    I like to incorporate the Star count.
    I think the Star count can show whether the SNR is OK. If I have a strong drop in the Star count, but the SNR is as high as before, then something is wrong (thin cloud?) with the measurement of the SNR and I have to check the images visually again.

    I am now testing this approach:

    Step 1
    Approval with:
    FWHM < 5.5 && Eccentricity < 0.7 && Stars > 350
    (the limits are 2 sigma; 5.5, 0.7 and 350 are variables and a function of the quality of the frames)

    Step 2
    (50*(SNRWeight-SNRWeightMin)/(SNRWeightMax-SNRWeightMin) + 10*(1-(FWHM-FWHMMin)/(FWHMMax-FWHMMin)) + 20*(Stars-StarsMin)/(StarsMax-StarsMin))+20

    What do you think about this approach, and is there any news or further experience with this kind of weighting?
    Thanks Ed
