Hi,
When I first started astrophotography, I believed longer exposure times would result in deeper images: more nuance, fainter nebulosity, and so on.
After thinking about it more deeply and reviewing an excellent talk by Dr. Robin Glover that circulates on YouTube, I realized that 1 hour of exposure results in the same number of photons captured and converted into electrons whether you take 1 exposure of 60 minutes or 60 exposures of 1 minute. (Of course there are issues like increased read noise, full-well capacity, the risk of satellite trails, and so on to keep in mind, but I'm talking mainly about image detail here.)
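To convince myself, I ran a toy calculation. In this simple model the only per-sub penalty is read noise, which you pay once per frame, while the shot noise depends only on the total photons collected. The sky flux and read noise values below are made up purely for illustration:

```python
# Toy comparison of stack noise: same total integration time, different sub lengths.
# All numbers are hypothetical, for illustration only.
import math

sky_rate = 2.0      # e-/pixel/s, hypothetical sky background flux
read_noise = 1.6    # e- RMS per frame, hypothetical camera spec
total_time = 3600   # s, fixed total integration time

for sub_length in (3600, 600, 60, 10):
    n_subs = total_time // sub_length
    signal = sky_rate * total_time           # same total photons either way
    shot_noise_sq = signal                   # Poisson: variance equals signal
    read_noise_sq = n_subs * read_noise**2   # read noise is paid once per sub
    total_noise = math.sqrt(shot_noise_sq + read_noise_sq)
    print(f"{n_subs:4d} x {sub_length:4d}s  ->  stack noise = {total_noise:6.1f} e-")
```

With a bright enough sky background, the read-noise term barely moves the total, which (as I understand it) is the core of Glover's argument.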
With that information I calculated that my ideal exposure length is 66 seconds for OSC images and 1800 seconds for Ha and OIII, considering my camera and my sky conditions. I create images with roughly 20-25 hours of integration time this way.
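In case it helps anyone check my numbers: the rule of thumb I used, as I understand it from the talk, is to pick the sub length at which the sky background swamps the read noise, accepting a small stacking noise penalty. The sky rates below are back-calculated placeholders that reproduce my 66 s and 1800 s figures, not actual measurements:

```python
# Sub-exposure length from the "swamp the read noise" rule of thumb.
# All inputs are hypothetical; plug in your own camera's read noise
# and your measured sky background flux.
def optimal_sub_length(read_noise_e, sky_rate_e_per_s, noise_penalty=0.05):
    """Sub length (s) so that stacking costs at most `noise_penalty`
    (e.g. 0.05 = 5%) extra noise versus one single long exposure."""
    e_factor = 1.0 / ((1.0 + noise_penalty) ** 2 - 1.0)   # ~9.76 for 5%
    return e_factor * read_noise_e**2 / sky_rate_e_per_s

# Broadband OSC under a brighter sky: high sky flux -> short subs suffice.
print(optimal_sub_length(read_noise_e=1.6, sky_rate_e_per_s=0.38))    # ~66 s
# Narrowband: the filter cuts the sky flux drastically -> much longer subs.
print(optimal_sub_length(read_noise_e=1.6, sky_rate_e_per_s=0.0139))  # ~1800 s
```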
Recently, I've heard more people claim that longer exposures capture more faint detail, like faint stars and nebulosity, and I've started to second-guess myself.
Is there someone here who can help me out? Will increasing my exposure length improve my images (provided the total integration time stays the same)?
Thank you in advance.
Comments
There are more things in heaven and earth, Dr. Glover, than are dreamt of in your philosophy.
That helps a lot! With my wide-field setup I can do both long and short exposures. I regularly do 1200 seconds for my H-alpha. Obviously there are some benefits to shorter exposures, so all things being equal I would choose shorter over longer. But it seems all things are not equal, and I may be hurting my images. I'll start a 300-second project tonight and test whether I like it better, although that will be fairly subjective.
For my new RC8, it will be a while before I master long exposures. But now I have a good reason to try.
Thanks again,
Clear skies
I find many images are overexposed. The stars are oversaturated to the point they lose all color.
My suspicion is that many overexposures result from a one-exposure-time-fits-all approach, like 300 seconds for everything. Then people turn to something like HDRMT to bail them out when they end up overexposing detail.
Expose the Orion Nebula at just 300 sec and you will almost certainly blow out the Trapezium. I almost always take more than one set of exposures: say, 15 or 30 sec for the Trapezium, 180 sec for most of Orion, and 300 sec for the faintest details.
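If you want to experiment with blending those sets yourself, here is a minimal sketch of the simplest possible approach: keep the long stack, and wherever it saturates, fill in from the short stack rescaled to the same flux. This is just an illustration, not what HDRMT actually does; the saturation threshold and the names in the usage comment are placeholders:

```python
# Minimal sketch of merging two stacked masters of different sub lengths:
# wherever the long stack saturates, fall back to the (rescaled) short stack.
import numpy as np

def hdr_merge(long_img, short_img, long_exp, short_exp, sat_level=0.98):
    """Both inputs are linear (unstretched) arrays normalized to [0, 1]."""
    scale = long_exp / short_exp          # bring short data to the long stack's flux scale
    merged = long_img.copy()
    saturated = long_img >= sat_level     # pixels blown out in the long stack
    merged[saturated] = short_img[saturated] * scale
    return merged

# e.g. a 300 s master for faint nebulosity, a 30 s master for the Trapezium core:
# merged = hdr_merge(master_300s, master_30s, long_exp=300, short_exp=30)
```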
Another reason there is no such thing as an ideal exposure: exposure time can vary depending on the intended output, whether a 16-bit HDR monitor, an 8-bit RGB monitor, or the web. I use a Canon 1000 Pro. It has great D-Max for a printer, though that varies with paper, and it's not the same as a monitor, especially a 16-bit HDR one.
People will say their camera has 14 stops of dynamic range. ZWO has a pretty graph for both of my cameras that says 14 stops at Gain 100. That's from a test suite under controlled conditions. What a test suite reports as a detectable difference between high and low is not what I, as a photographer, would consider a meaningful difference.
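For what it's worth, that kind of spec usually comes straight from the ratio of full-well capacity to read noise, expressed in powers of two. With illustrative numbers (not any real camera's specs):

```python
# Where a "14 stops" spec typically comes from: the ratio of full-well
# capacity to read noise, in powers of two. Numbers below are illustrative.
import math

full_well = 20000    # e-, hypothetical full-well capacity at a given gain
read_noise = 1.2     # e- RMS, hypothetical read noise at the same gain
stops = math.log2(full_well / read_noise)
print(f"engineering dynamic range: {stops:.1f} stops")   # ~14.0 stops
```

A one-electron step sitting just above the read-noise floor counts toward that ratio, but it is nowhere near a photographically meaningful tonal difference.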