What is state of the art for optimal sub-frame exposures?

astrophotography CMOS dslr imaging

#1 nathanm

    Explorer 1

  • topic starter
  • Posts: 57
  • Joined: 28 Jun 2020

Posted 10 July 2020 - 12:55 PM

A key method in astrophotography is taking multiple exposures (sub-frames, or subs) and stacking them to improve SNR.  Which raises the question: given a desired total exposure time T, how do you divide it up into multiple exposures?

 

I should state up front that I am mostly interested in recent DSLRs, but of course this applies to dedicated astro cameras too.  I use an astro-modified Canon 5D Mark IV and a QHY128C, which, although cooled, uses a Sony one-shot-color chip of the kind found in DSLRs.  These are both low-read-noise cameras.

 

In addition to optimal exposure time, there is the related question of optimal ISO setting.

 

I recently got interested in this issue.  I somewhat naively assumed it would have been figured out and settled long ago.

 

Some aspects of the exposure constraints are fairly obvious.  If you have a fixed mount and you don't want star trails, then your exposure time is limited by the apparent motion of the stars.  You can use one of the "rule of hundreds" guidelines - e.g. divide 600 by the focal length in mm to get a maximum exposure in seconds.  Or you can calculate the transit time of a star image across a pixel.
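
For illustration, here is a minimal sketch of both limits in Python; the focal length, pixel pitch, and declination are example values of my own, not numbers from any of the articles cited below.

    import math

    # Assumed example values, not from this thread.
    FOCAL_LENGTH_MM = 200.0      # lens focal length
    PIXEL_PITCH_UM = 5.36        # pixel pitch, roughly that of a 5D Mark IV
    DECLINATION_DEG = 0.0        # worst case: star on the celestial equator

    # "Rule of 600": rough maximum untrailed exposure in seconds.
    rule_of_600 = 600.0 / FOCAL_LENGTH_MM

    # Pixel-transit time: how long a star takes to drift across one pixel.
    plate_scale = 206.265 * PIXEL_PITCH_UM / FOCAL_LENGTH_MM        # arcsec/pixel
    drift_rate = 15.04 * math.cos(math.radians(DECLINATION_DEG))    # arcsec/second of time
    pixel_transit = plate_scale / drift_rate

    print(f"600 rule:      {rule_of_600:.1f} s")
    print(f"pixel transit: {pixel_transit:.1f} s")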

 

Another obvious aspect is that under light pollution you must not expose so long that the image washes out, or you will lose detail.

 

More generally, there are different sources of noise - read noise, photon noise, dark current noise, and potentially others - that one has to worry about.

 

This issue has been discussed a lot in this forum.   Here is an article from 2006  https://www.cloudyni...-in-dslrs-r1543 by Samir Kharusi.

 

This 2007 article https://www.cloudyni...-exposure-r1571 by Chuck Anstey references an earlier study by John C. Smith, but the link to the Smith paper is dead and I haven't found the original.  Anstey gives a formula for the sub-exposure time: S = lambda * sqrt(T_total) / (2 * E_total),

 

where lambda = 15, T_total is the total exposure time in minutes, and E_total is the total noise.
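
Transcribing that formula literally into a short Python function (the symbols are as quoted above; what exactly counts as "total noise" is defined in Anstey's article, and the example numbers here are hypothetical):

    import math

    def anstey_sub_exposure(t_total_min, e_total, lam=15.0):
        """S = lam * sqrt(T_total) / (2 * E_total), per the formula quoted above."""
        return lam * math.sqrt(t_total_min) / (2.0 * e_total)

    # Hypothetical example: 120 minutes total, E_total = 10.
    print(anstey_sub_exposure(120.0, 10.0))   # about 8.2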

 

Roger Clark's website has this analysis of sub-frame exposure https://clarkvision....y.and.exposure/, which appears to date from 2016.  Clark gives some spreadsheets, but his bottom-line recommendation is very simple - choose an exposure such that the histogram peak on the back of the camera (i.e. after gamma correction) falls between 1/4 and 1/3 of full scale, which he states is roughly 3% of the RAW-reported ADU.  This assumes "modern" cameras with 2 to 4 electrons of read noise.

 

This analysis was also presented in a 2016 thread on dpreview http://www.dpreview....m-post-57610694.  The thread starts off technical and then devolves into a bitter argument, chiefly between Clark and Jon Rista.  The argument mostly focuses on issues tangential to sub-exposure, so it is not actually very useful - to me, anyway - in understanding where they differ on optimal sub-frame exposures.

 

So, it was a raging debate in 2016, at least for some.  More recent accounts, like this one https://www.amateura...which-is-better, lay out the issues but don't say very much.

 

I found this 2020 thread https://www.cloudyni...e#entry10302433 , which recommends a sort-of-formula in words:

 

Shoot a light and a bias.  Subtract (either the average value or the obvious skyfog peak, doesn't matter), get the corrected analog to digital units.  Using camera data, convert to electrons.  Get the read noise, which will be in electrons.  Square it.  You want the first number to be between 5-10X the second.

This references The Astrophotography Manual by Chris Woodhouse.
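
As a minimal sketch of that recipe (the gain, read noise, and ADU values are placeholders of mine, not numbers from Woodhouse):

    def sky_to_read_noise_ratio(light_adu, bias_adu, gain_e_per_adu, read_noise_e):
        """Ratio of the sky signal (in electrons) to the read noise squared.
        The quoted rule of thumb wants this to land between roughly 5 and 10."""
        sky_electrons = (light_adu - bias_adu) * gain_e_per_adu
        return sky_electrons / (read_noise_e ** 2)

    # Placeholder numbers: sky peak 600 ADU, bias 512 ADU, gain 1.1 e-/ADU, 3 e- read noise.
    ratio = sky_to_read_noise_ratio(600, 512, 1.1, 3.0)
    print(f"sky / RN^2 = {ratio:.1f}   (below ~5: lengthen subs; above ~10: shorten them)")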

 

Another post references this article series on Cloudy Nights  https://www.cloudyni...ur-camera-r1929

 

Finally, one of the posts on that thread essentially repeats Clark's recommendation of histogram peak at about 1/3 of range.

 

So, is that the state of the art?  

 

 

 



#2 bobzeq25

    ISS

  • Posts: 20,445
  • Joined: 27 Oct 2014

Posted 10 July 2020 - 01:19 PM

I'm responsible for the Woodhouse citation and 5-10X the read noise squared recommendation.  It's what I generally use, with some variation due to target.

 

This is a matter where reasonable people can disagree reasonably.  PixInsight has a subexposure calculator with 3 options, which give significantly different results, and I can't really say which is "best".

 

But the 1/3 rule, while it may be OK, doesn't allow for the variability of read noise between, say, an old Canon and a new Nikon, so it's definitely less accurate - not in any way your requested "state of the art".  Read noise is a _very_ big deal (note that it's squared in the 5-10X read noise squared rule), which is why many people overexpose with the new very-low-read-noise CMOS astro cameras.  Compared to DSLRs, those cameras favor more, shorter subs.

 

Note that in your 2-4 electron read noise example, the difference in "optimal" subexposure would be a factor of 4, since the required sky signal scales with read noise squared.  That's like 1 minute for the low-read-noise case versus 4 minutes for the high-read-noise case.  A _big_ difference.
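
The factor of 4 falls straight out of the read-noise-squared scaling; a quick check, with the multiplier and sky rate below chosen purely for illustration:

    K = 5.0                   # "few times read noise squared" multiplier
    SKY_RATE = 1.0            # hypothetical sky flux, electrons/pixel/second

    def sky_limited_sub_s(read_noise_e, k=K, sky_rate=SKY_RATE):
        # Expose until the sky contributes k * RN^2 electrons per pixel.
        return k * read_noise_e ** 2 / sky_rate

    low, high = sky_limited_sub_s(2.0), sky_limited_sub_s(4.0)
    print(low, high, high / low)   # 20.0 80.0 4.0 -> always a 4x ratio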

 

It's simply not possible to give you a good answer as to the "state of the art" in a short post here; the issue is just too complex for that.  I can only, once again, point you to Woodhouse, a very much "state of the art" expert, who gives several pages of explanation (warning - math ahead <smile>) that can't be shrunk down effectively.  Among other things, he talks about exceptions to the general 5-10X read noise squared rule of thumb, and how to deal with them.

 

There is no simpler answer that's remotely as good.  Here's a 4 page thread with some good information, which shows some of the complexity.

 

https://www.cloudyni...yfog-histogram/

 

DSO AP is rarely simple.  1/3 may be "good enough", but it's not very sophisticated.

 

And we haven't even gotten into low ISO and longer subexposures versus higher ISO and shorter subexposures.  <smile>  Or how the optimal subexposure is different for different targets (which Woodhouse discusses).

 

Minor point: I'm personally not fond of Anstey's approach, but I believe it's one of the 3 possibilities in PixInsight.


Edited by bobzeq25, 10 July 2020 - 01:36 PM.


#3 nathanm

    Explorer 1

  • topic starter
  • Posts: 57
  • Joined: 28 Jun 2020

Posted 10 July 2020 - 07:20 PM

Ok, to Woodhouse it is then....

 

One factor for me is what I call "unlucky imaging".   Lucky imaging is where you overshoot, and cull out all but the best shots, and is really more of a planetary technique.

 

Unlucky imaging is me hoping to use all of the shots, but knowing that there is a chance that I will get an airplane fly through, or a satellite glint, or my friend accidentally sweeping a flashlight beam across the lens.   In those cases it is nice to be able to throw away an image without worrying about a huge reshoot because I have N shots and could make do with N-1.

 

Nathan 



#4 Alen K

    Surveyor 1

  • Posts: 1,504
  • Joined: 25 Nov 2009

Posted 10 July 2020 - 08:10 PM

Unlucky imaging is me hoping to use all of the shots, but knowing that there is a chance that I will get an airplane fly through, or a satellite glint, or my friend accidentally sweeping a flashlight beam across the lens.   In those cases it is nice to be able to throw away an image without worrying about a huge reshoot because I have N shots and could make do with N-1.

If you dither and use kappa-sigma rejection during stacking, you pretty much don't need to worry about satellites or airplane lights. Only the affected pixels are omitted from the stack, not the entire frame. Fiends with flashlights, on the other hand...
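
For anyone curious what kappa-sigma rejection actually does per pixel, here is a generic sketch (the textbook algorithm, not any particular stacking program's implementation):

    import numpy as np

    def kappa_sigma_stack(frames, kappa=2.5, iterations=3):
        """Average aligned frames, iteratively rejecting per-pixel outliers
        (satellite trails, airplane lights) more than kappa sigma from the mean.
        frames: array-like of shape (n_frames, height, width)."""
        data = np.asarray(frames, dtype=np.float64)
        keep = np.ones(data.shape, dtype=bool)            # True = pixel still contributes
        for _ in range(iterations):
            w = keep.astype(np.float64)
            n = np.maximum(w.sum(axis=0), 1.0)
            mean = (data * w).sum(axis=0) / n
            std = np.sqrt((w * (data - mean) ** 2).sum(axis=0) / n)
            keep &= np.abs(data - mean) <= kappa * std    # clip the outliers
        w = keep.astype(np.float64)
        return (data * w).sum(axis=0) / np.maximum(w.sum(axis=0), 1.0)

Dithering shifts the pointing slightly between subs, so fixed-pattern defects such as hot pixels don't land on the same sky position in every frame and can be rejected the same way.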



#5 nathanm

    Explorer 1

  • topic starter
  • Posts: 57
  • Joined: 28 Jun 2020

Posted 10 July 2020 - 11:29 PM

Ah yes, but stacking requires there being enough other images to stack, so I never want to have one or a small number of images.  And yes, friends with flashlights or other calamities are a non-zero occurrence.



#6 Alen K

    Surveyor 1

  • Posts: 1,504
  • Joined: 25 Nov 2009

Posted 10 July 2020 - 11:34 PM

Ah yes, but stacking requires there being enough other images to stack, so I never want to have one or a small number of images.  

Yes, you need a few frames. Some programs (or just one program, I haven't checked) can do it with as few as eight frames, others need more. 



#7 bobzeq25

    ISS

  • Posts: 20,445
  • Joined: 27 Oct 2014

Posted 10 July 2020 - 11:59 PM

Ok, to Woodhouse it is then....

 

One factor for me is what I call "unlucky imaging".   Lucky imaging is where you overshoot, and cull out all but the best shots, and is really more of a planetary technique.

 

Unlucky imaging is me hoping to use all of the shots, but knowing that there is a chance that I will get an airplane fly through, or a satellite glint, or my friend accidentally sweeping a flashlight beam across the lens.   In those cases it is nice to be able to throw away an image without worrying about a huge reshoot because I have N shots and could make do with N-1.

 

Nathan 

Stacking using various data outlier rejection methods takes out airplanes and satellites.   But I still routinely dump 10+% of my subs for various problems, not related to artifacts like those.  I think most people do that.



#8 whwang

    Gemini

  • Posts: 3,175
  • Joined: 20 Mar 2013

Posted 11 July 2020 - 04:31 AM

The bottom line is that you want the photon noise from the sky background to be a few times higher than the read noise in each sub.  What "a few" means depends on the person; it could be as small as 2 or as high as 10.

 

There are formulas, but I am not sure many people (or any people) would like to use them.
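
For what it's worth, the criterion in the first sentence reduces to a one-liner; a sketch, with placeholder numbers you would replace by values measured from a test sub:

    def min_sub_length_s(read_noise_e, sky_rate_e_per_s, noise_ratio=3.0):
        """Shortest sub for which sky shot noise exceeds read noise by the given factor:
        sqrt(sky_rate * t) >= ratio * RN   ->   t >= ratio**2 * RN**2 / sky_rate."""
        return (noise_ratio ** 2) * (read_noise_e ** 2) / sky_rate_e_per_s

    # Placeholder numbers: 3 e- read noise, 0.5 e-/pixel/s sky, want sky noise 3x read noise.
    print(min_sub_length_s(3.0, 0.5))   # 162.0 seconds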



#9 Huangdi

    Viking 1

  • Posts: 646
  • Joined: 24 Jul 2019

Posted 11 July 2020 - 07:57 AM

I think using formulae such as these would be more work than simply going out and trying a few different things.

 

To me there are a bunch of factors to consider before choosing the right exposure time:

1. Light Pollution -> This is going to be your biggest limiting factor if you don't have dark skies. If you do have dark skies, this is of no concern.

 

2. ISO performance of your camera. If you have dark skies, you can start with your ISO as a reference; this is what I do.

 

3. Imaging speed. Are you shooting at f/3 or f/10? I have searched for formulae that specifically determine the speed of your optical system, but honestly it's more work than it's worth.

 

4. Goals. Do you want to pick up IFN or shoot a star cluster? These will require different exposure times as well. Blowing out a few more stars to catch the faint stuff is fine. But then again, if you have to expose that long, your system is probably too slow to get a good amount of faint stuff anyway.

 

To sum it up: if you're light-pollution limited, use the lowest ISO that doesn't completely overwhelm your image with read noise; if you're exposure-time limited, clip stars to your liking, depending on what you're shooting.



