When I combine an LRGB in PixInsight following the typical workflow (preprocessing, DBE on the RGB and luminance, combination of the RGB channels, background correction and color calibration, denoising, stretching, and finally the LRGB combination and curves adjustment), I find that most of the small and medium stars come out multicolored: that is, the individual pixels of these stars show different colors, which makes them look unnatural. I attach a collage with an enlarged area of the R, G and B channels, the luminance, and the final result of the whole process, where you can see the multicolored stars I am describing. I do not understand why this happens, although it is true that in the registered masters of each channel the illuminated pixels of the stars do not coincide with each other, which I attribute to tracking and autoguiding errors (and seeing) during capture. I hope you can give me some tips to avoid this and to obtain stars with more uniform, realistic colors.
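To illustrate the mechanism being described, here is a minimal numpy sketch (not anyone's actual data or pipeline): the same Gaussian star rendered with a small per-channel centroid offset, as guiding drift or seeing between the R, G and B exposures would produce. The star's core stays roughly white, but opposite flanks end up with opposite color casts, exactly the multicolored-edge effect in the question.

```python
import numpy as np

def gaussian_star(size=15, sigma=2.0, dx=0.0, dy=0.0):
    """Render a Gaussian star profile with a sub-pixel centroid offset."""
    y, x = np.mgrid[:size, :size]
    cx, cy = size // 2 + dx, size // 2 + dy
    return np.exp(-((x - cx) ** 2 + (y - cy) ** 2) / (2 * sigma ** 2))

# Same star, but each channel's centroid lands slightly differently
# (tracking/guiding drift and seeing between the R, G and B exposures).
r = gaussian_star(dx=+0.6)
g = gaussian_star(dx=0.0)
b = gaussian_star(dx=-0.6)

rgb = np.stack([r, g, b], axis=-1)

# Hue varies across the star: the left flank is blue-heavy,
# the right flank red-heavy, even though each channel alone looks fine.
left  = rgb[7, 4]   # pixel on the left flank
right = rgb[7, 10]  # pixel on the right flank
print(left, right)
```

Stacking many frames tends to average these random offsets out, which is why a consistent residual color pattern points at something systematic rather than seeing alone.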
Unfortunately this kind of issue has a number of causes. You are correct that differences between color channels will show up in the halos (edges) of stars; however, this tends to average out over a significant number of frames. Consistent chromatic issues that remain might be due to several things:
1. Optical aberrations. Is there a pattern to the colors across the field? Is there a radial dependence? Is one quadrant different from another?
2. Is the registration of the images really precise? Small high-order terms that need compensation (distortion) will cause this issue. It could be that they are not being compensated for... OR that the registration is applying a distortion where none is needed.
3. Real differences in the shape of stars between channels.
4. Atmospheric dispersion. If you were imaging at high airmass, the dispersion of colors can cause what you are seeing. Registration does not necessarily fix this completely.
So, some of these causes are opto-mechanical, others are processing effects. Even the choice of interpolation for registration can make these effects worse or better.
The effect you are showing in your data does appear small (whatever the cause). One of the benefits of LRGB is that the RGB image can be blurred a bit without any loss of detail, because detail is controlled by the luminance image. You might consider applying any number of small smoothing/noise-reducing processes to the RGB. Even using MLT and removing the first (and perhaps second) layer/scale might do some good.
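The idea of smoothing the RGB while keeping detail in the luminance can be sketched in a few lines of numpy/scipy. This is a simplified illustration, not PixInsight's LRGBCombination (which works in CIE L*a*b*); here a plain channel mean stands in for luminance, and a Gaussian blur stands in for MLT layer removal:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def lrgb_combine(L, rgb, chroma_sigma=1.5, eps=1e-6):
    """Blur the color data, then re-impose the sharp luminance.

    L:   2-D sharp luminance image
    rgb: H x W x 3 color image on the same scale as L
    """
    # Smooth each color channel; this suppresses pixel-scale color noise,
    # similar in spirit to removing the first MLT layers from the RGB.
    smooth = np.stack(
        [gaussian_filter(rgb[..., c], chroma_sigma) for c in range(3)],
        axis=-1,
    )
    # Per-pixel luminance of the smoothed color image (simple mean proxy).
    lum = smooth.mean(axis=-1)
    # Rescale the smoothed color so its brightness follows the sharp L:
    # detail comes from L, hue comes from the blurred RGB.
    return smooth * (L / (lum + eps))[..., None]
```

The key point the sketch makes explicit: because the output brightness is forced back to L, you can blur the chrominance fairly aggressively and the stars stay sharp, only their color becomes smoother.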
Ultimately a bit more detective work would be necessary to figure out the cause.
My telescope is a Takahashi FSQ-106, so the optics are excellent and I don't think the problem comes from there.
I think it is most likely due to the bad seeing in my area, which makes the profiles of the stars very different between images and filters.
I have tried using MLT to suppress layer 1 and even layer 2, and it certainly reduces the effect, but another problem arises: the stars get very fat, and when combining with the luminance, colored rings appear around the stars if I stretch the background a lot.
I would like you to see my latest work, where the biggest stars have a very yellow tone on their right side. Even using ChannelMatch I can't avoid this effect.
Yes... you might consider using MaskedStretch and HistogramTransformation instead of ArcsinhStretch.
I think you'll find your stars are somewhat less objectionable. It would be a good experiment. In the latest video I posted on a data set I am processing in Horizons (NGC 3614), I show the difference between the two methods.
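To see why the choice of stretch matters for star color, here is a simplified numpy sketch. It is not PixInsight's actual implementation: a channel mean stands in for luminance, MaskedStretch's iterative masking is omitted, and the stretch parameters are arbitrary. The point it demonstrates is that an ArcsinhStretch-style color-preserving stretch keeps (and makes visible) whatever channel imbalance exists in star edges, while a per-channel midtones transfer function, as in HistogramTransformation, compresses highlights and desaturates bright stars:

```python
import numpy as np

def arcsinh_stretch(rgb, s=50.0):
    """ArcsinhStretch-style curve: one luminance-based factor applied to
    all three channels, so R:G:B ratios (and any color artifacts at star
    edges) are preserved -- bright cores may even clip."""
    L = rgb.mean(axis=-1, keepdims=True)              # simple luminance proxy
    factor = np.arcsinh(s * L) / (L * np.arcsinh(s))  # same gain per channel
    return np.clip(rgb * factor, 0.0, 1.0)

def mtf(x, m=0.15):
    """Midtones transfer function (the curve behind HistogramTransformation),
    applied per channel: it compresses highlights, which desaturates
    bright stars."""
    return ((m - 1.0) * x) / ((2.0 * m - 1.0) * x - m)

pixel = np.array([[0.9, 0.5, 0.5]])   # bright star pixel with a red excess

a = arcsinh_stretch(pixel)[0]
h = mtf(pixel)[0]
# Channel spread = how strongly colored the stretched pixel looks;
# the arcsinh result keeps a wider spread than the MTF result.
print(a.max() - a.min(), h.max() - h.min())
```

This matches the advice above: if the unstretched data has slight color artifacts at star edges, a color-preserving stretch carries them through, while MaskedStretch plus HistogramTransformation tends to mute them.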