I have observed that also. I have not investigated where the pedestal comes from; I usually just readjust the black level to bring things back if I need to adjust an image in PS. I will need to look at this in detail later.
But again, I have only observed a pedestal... not a whole-image multiplicative scaling... so I haven't gone crazy about the issue.
I don't recall if this was discussed, but I would love to see a video or two of you talking about how to get good data in the first place. Secondly, what are some things we should be looking for in those data sets in order to determine what is good or bad about them?
I realize everyone is going to have different equipment (CCD, CMOS, DSLR), but surely there are some fundamental things that set "good data" apart from "bad data", and ways to recognize that. (This could go into some of the PI scripts and processes in more depth, beyond the typical Blink and SubframeSelector videos that you've already done, and maybe cover some of the more advanced noise evaluation tools and what they are telling us about our data.)
I wish some lessons were available about narrowband processing. Perhaps starting with the basics of what narrowband is best used for, the differences between processing LRGB and narrowband, when to use the different palettes (SHO, HOO, etc.) on which objects, using PixelMath with narrowband data, and examples of processed narrowband images. I'm sure you have many other ideas on the subject, and I'm looking forward to your future videos. Thank you.
1. Replacing the narrowband stars across the entire field with RGB stars (some remove all the stars and replace them, some use a mask and replace, and some mask and replace but only in the a* and b* channels of L*a*b*)... very confused here!
2. More noise reduction, particularly in the linear state.
3. I would love to hear how you approach setting the exposure length for your targets... I realize this depends on many factors that vary from person to person, but I still struggle with it, even though I search AstroBin each time for a target to get an idea of what others have used.
1. NB stuff will probably have to wait. It isn't really in my wheelhouse at the moment. However, I do spell out a technique a number of times in my tutorials that is good for star replacement... and *no one* does it. Perhaps there is a drawback to it...
2. OK... so this is something that is perhaps driven by stylistic/philosophical considerations. I prefer *not* to do much noise reduction on images in the linear state. There is too much non-linear processing still to follow, and there isn't a way to know at the beginning how much noise reduction is too much or too little; either choice will have consequences later on. Instead I really fell in love with MURE denoise. It is not a smoothing operation... it is a despeckle process that works from a noise model. *This* I am really comfortable using. Unfortunately it doesn't help OSC work...
So I am in the group that does much of the noise reduction at the end with MLT/MMT and TGVDenoise.
That said... I heard your request, and if there is a really good example I can think of... I will include it.
3. There are two parts to the acquisition considerations. One is the exposure length. The other is the statistical one. So let me start with the second, easier one. Regardless of the exposure time, it is important to make as many measurements as you can (within reason). A good rule of thumb is at least 15 images. This will allow you to take advantage of all of PI's rejection and noise evaluation/weighting schemes.
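[Editor's aside] Here is a minimal toy sketch in Python, with made-up numbers and only a crude sigma clip standing in for PI's much more sophisticated rejection and weighting schemes, of why the frame count matters: the noise of the stack drops roughly as the square root of the number of subs, and outlier rejection needs enough samples per pixel to identify and discard artifacts such as a cosmic-ray hit.

```python
import numpy as np

rng = np.random.default_rng(0)

def stack_one_pixel(n_frames, signal=100.0, noise=10.0, artifact=5000.0):
    """Simulate n_frames measurements of one pixel; frame 0 carries an artifact."""
    frames = rng.normal(signal, noise, n_frames)
    frames[0] += artifact                        # e.g. a cosmic-ray hit

    naive_mean = frames.mean()                   # a plain average keeps the artifact

    # Toy 2.5-sigma clip around the median (a crude stand-in for PI's rejection).
    med = np.median(frames)
    sigma = 1.4826 * np.median(np.abs(frames - med))   # robust sigma from the MAD
    kept = frames[np.abs(frames - med) < 2.5 * sigma]
    return naive_mean, kept.mean(), noise / np.sqrt(len(kept))

for n in (5, 15):
    naive, clipped, stack_noise = stack_one_pixel(n)
    print(f"N={n:2d}: naive mean={naive:7.1f}  clipped mean={clipped:6.1f}  "
          f"stack noise ~{stack_noise:.2f}")
```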
In terms of a sub-exposure length, this is a technical calculation that includes the noise characteristics of your camera, the brightness of your sky, the filter(s) you are looking through, AND your optomechanical performance. Under a dark sky with a good mount, filtered exposures in a read-noise-limited situation can be quite long, perhaps 15-20 minutes in length. And now you know my minimum total integration time... 15 minutes times 15 measurements (exposures), so 3-4 hours per filter... and narrowband is more, of course (probably triple).
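[Editor's aside] For readers who want to see the shape of that calculation, below is a sketch of one common "sky-limited sub-exposure" rule of thumb (not necessarily the exact calculation Adam uses): expose long enough that the accumulated sky signal swamps the camera's read noise. All parameter values are illustrative assumptions.

```python
def min_sub_exposure(read_noise_e, sky_rate_e_per_s, k=10.0):
    """Shortest exposure (s) for which sky noise swamps read noise:
    sky_rate * t >= k * read_noise**2, so read noise adds only a few percent."""
    return k * read_noise_e ** 2 / sky_rate_e_per_s

read_noise = 3.5     # e- RMS, a typical cooled-CCD value (assumed)
sky_rate   = 0.135   # e-/pixel/s through a broadband filter at a dark site (assumed)

t_sub = min_sub_exposure(read_noise, sky_rate)
n_subs = 15          # the statistical minimum from the text

print(f"Minimum sub-exposure: {t_sub:.0f} s (~{t_sub / 60:.0f} min)")
print(f"Minimum total integration: {n_subs * t_sub / 3600:.1f} h "
      f"({n_subs} x {t_sub / 60:.0f} min subs)")
```

With those assumed numbers the rule of thumb lands right around the 15-minute subs and 3-4 hours per filter mentioned above; a brighter sky shortens the required subs, while a narrowband filter (much lower sky rate) pushes both the sub length and the total time up, consistent with the comment that narrowband needs more.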
So assume you will take whatever number of nights is necessary to acquire the required number of subs.
If you are imaging a medium-bright target with a luminance filter from a dark-sky site, and assume tracking is great no matter the exposure length you choose, with a Sony chip with low read/dark noise:
- is there a downside, from an image processing/quality standpoint, to a longer exposure (20 min vs. 10 min) with respect to bloated stars, etc.?
If you have a brighter target, let's say a globular cluster, are you shooting yourself in the foot by trying a longer 10 min exposure with respect to star resolution and not blowing out stars, versus sacrificing detail with shorter exposures? Ultimately the question is: can I take the exposure length I desire and worry during processing about bloated, saturated stars... versus never letting the exposure length get long enough to put me in a position where I cannot compensate during processing for bloated or over-saturated stars?
See my dilemma? If I have the capability of doing longer tracked exposures and eventually getting enough of them, is there still a downside to longer exposures for medium-bright targets?
The longer-exposure bloating thing is a non-starter. The "bloat" due to seeing occurs within just a few seconds. A longer exposure made up of *well guided* images should not increase the FWHM significantly (assuming the seeing is consistent, whatever the degree). What you will see is that longer exposures show more of the PSF (the faint wings) of stars. This is not bloat; the FWHM is approximately the same in both cases. So..... I am not a bloat fan.
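[Editor's aside] A small illustrative sketch of that point, assuming an idealized Gaussian PSF and made-up brightness and noise numbers (not measurements from real data): doubling the exposure scales the whole stellar profile, so the half-maximum width is unchanged, but the radius at which the profile sinks below a fixed noise floor grows, which is exactly the "more of the wings" effect described above.

```python
import numpy as np

fwhm_arcsec = 2.5                       # seeing-limited FWHM (assumed constant)
sigma = fwhm_arcsec / 2.3548            # Gaussian sigma corresponding to that FWHM
noise_floor = 5.0                       # sky/read noise floor, arbitrary flux units

def visible_radius(peak):
    """Radius (arcsec) at which a Gaussian of the given peak meets the noise floor."""
    return sigma * np.sqrt(2.0 * np.log(peak / noise_floor))

for minutes, peak in [(10, 1.0e4), (20, 2.0e4)]:   # doubling exposure doubles the peak
    print(f"{minutes:2d} min sub: FWHM = {fwhm_arcsec} arcsec (unchanged), "
          f"wings visible out to ~{visible_radius(peak):.1f} arcsec")
```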
Saturation is an issue for stars. But for me it is only an issue if the subject is the stars themselves (like a globular cluster). For most deep-sky objects it is almost impossible to saturate them, even with a large telescope. With the 0.8m I only had to worry about M42, M31, M77, and globular clusters. That is pretty much it. Most objects benefit from improving the S/N of the faint signal, so that is the priority, and if the field stars become saturated... that is OK. So the point here is that I do not worry about something that you might.

When the subject is bright (stars or another object) there are two basic ways to deal with it during acquisition. One is to simply acquire all data filtered at full resolution, with no separate luminance image. Luminance data is only necessary when you are trying to color faint luminance data and do not want to spend the extra time to get the S/N in color, so you bin the color data. By looking through a filter you are cutting out a third or more of the light from the subject... so it offers enough control to take images of even the brightest things.
Planetary nebulae (PNe) are a great example.
IF the subject is crazy bright (like a globular) and not even the filtered images allow for long exposures without saturation of the subject (the stars)... then the best method is to acquire short and long data and create an HDRComposition. It is simple enough since the short exposures are trivial to acquire... the main issue is on the processing side.
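[Editor's aside] For the curious, here is a minimal sketch of the basic idea behind combining short and long data. This is the concept only, not PixInsight's actual HDRComposition implementation, and the numbers are invented: wherever the long exposure is saturated, substitute the short-exposure data scaled by the exposure ratio.

```python
import numpy as np

def hdr_combine(long_img, short_img, exp_ratio, saturation=0.98):
    """Return a linear HDR image; exp_ratio = long exposure time / short exposure time."""
    out = long_img.astype(np.float64).copy()
    clipped = long_img >= saturation               # pixels blown out in the long data
    out[clipped] = short_img[clipped] * exp_ratio  # short data scaled to the long level
    return out

# Toy 1x3 "image": the middle pixel (a star core) is clipped in the long exposure.
long_img  = np.array([0.20, 1.00, 0.40])   # normalized [0..1]
short_img = np.array([0.02, 0.35, 0.04])   # 10x shorter exposure, core not clipped
print(hdr_combine(long_img, short_img, exp_ratio=10.0))   # -> [0.2 3.5 0.4]
```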
So... I do not see a strong case for your dilemma. I would expose for as long as you can, as long as you meet the minimum number of measurements for the statistical part (say 15 exposures). The downside to longer exposures is that you are investing more of your image quality in each frame. This can be problematic if there are issues with the data, since you may not have enough frames to be able to throw some out.
— “NB stuff will probably have to wait. It isn't really in my wheelhouse at the moment. However, I do spell out a technique a number of times in my tutorials that is good for star replacement... and *no one* does it. Perhaps there is a drawback to it... “
I wasn't able to locate this, at least as a separate subject. If you get a chance, could you point me to the video(s) where you present this technique?
Thanks very much!
Craig