


Calculating exposure time and gain from data

14 replies to this topic

#1 unimatrix0

unimatrix0

    Ranger 4

  • -----
  • topic starter
  • Posts: 344
  • Joined: 03 Jan 2021

Posted 06 May 2021 - 11:38 AM

I understand there is a way to get an idea of where to set the offset by looking at the histogram of a bias or dark frame. Now I am looking for a way to find the right exposure/gain settings by looking at numbers, not shapes and visual clues. I figured that staring at the histogram or stretching the image can be very misleading if you just eyeball it and don't consider the numbers that should go along with it. I know it's a rookie thing, coming from DSLRs, where most of the info people seem to use is just the graph's position on the scale; they "eyeball" things and hope for the best.

 

Here is what I'm really asking: taking one single exposure - let's say gain 101 / offset 30 for an ASI533MC Pro - at 120 seconds and stretching the image, what information do I need to look at if the star shapes or the object I'm imaging aren't telling me the whole story? I learned FWHM can help me with focusing, and that's what autofocusers use to calculate optimal focus, but does it help with exposure time?

 

I know stars in images grow larger with more exposure time, but what's the starting point?

 

What size are they supposed to be under the "optimal" amount of exposure time? Or is there another number I should be looking out for?

 

These are probably very beginner questions, I realize, and probably funny too for people who have been doing this since the beginning of time. But consider that I've been doing this for less than 6 months, and out of those 6 I've only been using CMOS cameras for 4. There are a limited number of clear nights, and not all of them are available for scope time (I have a wife and an 8-year-old daughter).

 

Also, various software auto-stretches the image in various ways, and I think the curve without actual numbers around it (like N.I.N.A.'s cartoonish histogram) isn't the most precise information, if you ask me.

 

What I've been doing is starting out with SharpCap nearly every single time and trying to get some info by turning on the histogram and doing a measurement. Most of the time it tells me a very low offset setting, which I never follow because it seems too low (3 or 5, etc.) and I'm afraid I'm going to be clipping data, so I keep the offset at 30 regardless of what SharpCap tells me. This just comes from my limited experience; sometimes I follow what looks right, not what's written down. That doesn't necessarily mean I'm correct, though; I realize that too.

 

But I do consider the exposure time SharpCap tells me (interestingly, it's never more than 52 seconds), though I end up adding time on top of it, because the single subs I get with the suggested exposure/gain/offset just look very faint, and many times the suggested number of subs is something I can't reach anyway, due to trees, clouds, guiding issues, etc.

 

It's the same issue with NINA's exposure calculator, which gives me obviously too-low numbers (13-second exposures??), despite my light pollution not being that bad (Bortle 5). My scope is at the darkest possible location in my backyard, with no floodlights shining on it, and I trip over stuff if I don't bring a flashlight with me.

 

Also to consider: while the measurement gives me the "best dynamic range" possible with my current setup and sky quality, that doesn't always translate into the best image (in my experience), especially if you won't have enough subs to stack.

 

I also take into consideration that I'm probably trading dynamic range for more noise (I guess?), but at least I have something showing up once I stack the images. That means I deviate from the suggested settings by bumping up the exposure time, but I usually leave the gain alone, or actually lower it a bit, regardless of what the unity gain or read noise charts tell me.

Just yesterday I double-checked my older subs, and I do have sets of 300s, 160s, 120s, 60s, and 30s exposures made a while ago with the same camera, similar settings, and the same scope. While the measurement info may indicate that my best setting should have been 60 seconds, post-processing was much easier and revealed a better stacked/adjusted image with the 120-second exposures. 300 seconds also wasn't too bad; although star bloating was obvious, the stacked image looked acceptable, it just needs some cosmetic work in Photoshop.

 

OK, I'm rambling too long now. My question is still about using information (in numbers) gained from taking a picture at a certain exposure: how does that help me figure out whether I'm overexposing, or whether there is room for longer exposures without "overfilling" the photon-capturing wells of the camera sensor?


Edited by unimatrix0, 06 May 2021 - 11:43 AM.


#2 P_Myers

P_Myers

    Ranger 4

  • -----
  • Posts: 293
  • Joined: 29 Feb 2016

Posted 06 May 2021 - 12:07 PM



https://www.cloudyni...steve bellavia

#3 fewayne

fewayne

    Apollo

  • *****
  • Posts: 1,341
  • Joined: 10 Sep 2017
  • Loc: Madison, WI, USA

Posted 06 May 2021 - 01:12 PM

TL;DR: Don't sweat the numbers. Good enough will be good enough; total integration time rules.

 

Offset: My advice is to minimize the (already intimidating) number of variables in play, pick a value that's "good enough", and only use that.

 

Evaluation: Dimness of single sub-exposures, by eye, is unlikely to be a useful metric. Again, what counts is integration time -- stacking can dig details out of data that looks like a dead loss to the eye.

 

Sub length: Grossly speaking, the relationship between sub length, number of subs, and total integration time is linear. So 1000x1" gives you the same number of photons detected as 1x1000". Of course, it's a little more complicated than that. Read noise biases us toward fewer, longer subs; tracking errors, airplanes, tripping over the tripod, and filling the photosite well all argue the other way. Perfect optimization is unnecessary, but as you ask, how to tell when good enough is good enough?
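That linearity, and the read-noise penalty on short subs, can be sketched with a toy SNR model. All the flux and noise numbers below are made up purely for illustration, not measurements from any real camera:

```python
import math

def stack_snr(signal_rate, sky_rate, read_noise, sub_len, n_subs):
    """Approximate SNR of a stack under a simple shot-noise model.

    signal_rate and sky_rate are in electrons/sec/pixel; read_noise is
    electrons RMS per sub. All values used here are illustrative.
    """
    signal = signal_rate * sub_len * n_subs
    # Variance: shot noise from target + sky, plus read noise once per sub.
    variance = signal + sky_rate * sub_len * n_subs + n_subs * read_noise**2
    return signal / math.sqrt(variance)

total = 3600  # one hour of total integration, split different ways
for sub_len in (10, 60, 300):
    n = total // sub_len
    print(sub_len, round(stack_snr(0.5, 2.0, 1.5, sub_len, n), 1))
```

With a sky bright enough to swamp read noise, an hour of 10-second subs lands within a few percent of the SNR of an hour of 300-second subs, which is the point: total integration dominates, and the split matters only at the margin.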

 

First off, avoid overexposure. Nothing can be done for maxed-out pixels. So examining a single unstretched sub quantitatively is helpful. Look for maximum values (invariably stars) and ensure that they're not blown out. Many astro processing packages do statistics for you, or you can just hover a cursor over bright-looking stars with other image processing tools.
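As a sketch of that quantitative check, here is one way to count blown-out pixels in a raw sub with NumPy. The synthetic frame and the 98%-of-full-scale cutoff are just stand-ins for your own data and your own threshold:

```python
import numpy as np

def saturation_report(sub, bit_depth=16, threshold=0.98):
    """Report how many pixels are at or near full scale in a raw sub.

    `sub` is a 2-D array of raw ADU values; 16-bit output is assumed.
    The 0.98 threshold is an arbitrary 'nearly clipped' cutoff.
    """
    full_scale = 2**bit_depth - 1
    near_max = int(np.count_nonzero(sub >= threshold * full_scale))
    return {"max_adu": int(sub.max()),
            "full_scale": full_scale,
            "clipped_pixels": near_max}

# Fake sub: mild background plus a saturated 2x2 star core.
rng = np.random.default_rng(0)
frame = rng.normal(800, 50, size=(100, 100)).astype(np.uint16)
frame[40:42, 40:42] = 65535
print(saturation_report(frame))
```

If `clipped_pixels` is more than a handful of star cores, the sub is too long (or the gain too high) for that target.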

 

Next, ensure that your subs are short enough to be within your equipment's capabilities.  Be merciless here: The answer to too short an exposure is "take more subs", the answer to trailed stars is "throw that exposure out".

Finally, if you like, you can run the numbers on the read noise from your sensor at the gain you're using and figure out how long each sub has to be  to swamp read noise. Unless you're doing narrowband, or you do tracking by mounting your camera to a bicycle wheel and pedaling very slowly, the answer is almost certainly "my subs are long enough already".
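Running those numbers reduces to a one-liner: the sub is long enough once the accumulated sky signal reaches some multiple of read noise squared. The sky-flux values below are invented examples, not measurements from any particular site or camera:

```python
def min_sub_length(sky_eps, read_noise_e, factor=10):
    """Shortest sub (seconds) at which accumulated sky signal reaches
    `factor` x read_noise^2 electrons, i.e. read noise is 'swamped'.

    sky_eps: assumed sky background flux in electrons/sec/pixel.
    """
    return factor * read_noise_e**2 / sky_eps

# Illustrative numbers only: a moderately light-polluted broadband sky
# might put a few electrons/sec into each pixel; narrowband cuts that
# flux dramatically, so the required sub length balloons.
print(min_sub_length(sky_eps=3.0, read_noise_e=1.5))   # broadband: 7.5 s
print(min_sub_length(sky_eps=0.05, read_noise_e=1.5))  # narrowband: ~450 s
```

This is why the answer for broadband imaging with a low-read-noise CMOS camera is almost always "my subs are long enough already", while narrowband genuinely wants minutes.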

 

BTW I also counsel that you pick one or two gain/exposure combos and just use those routinely. A foolish tendency to over-optimize is the hobgoblin of new astrophotographers, said Emerson. (Or something like that.) For example, when I'm doing narrowband, I max out the gain and run the exposure at the most my rig will consistently allow -- five or ten minutes. LRGB, minimum gain and short exposures to avoid blowing out pixels. That way I only have to have a few master darks in my library, and processing is more or less consistent.

 

Hope this helps. It's really hard and complicated, so Astrophotographers of Very Little Brain such as myself have to simplify as hard as we can!


  • dswtan and Desertanimal like this

#4 fewayne

fewayne

    Apollo

  • *****
  • Posts: 1,341
  • Joined: 10 Sep 2017
  • Loc: Madison, WI, USA

Posted 06 May 2021 - 01:13 PM

(Did I really paraphrase Emerson and Pooh in the same post?)

 

I forgot to include a reference recommendation: Lodriguss, Bracken, and Phillips all explain this much better than I can. I'm a particular fan of Bracken's The Deep-Sky Imaging Primer; his discussion of how noise and sub-exposures work is excellent.


Edited by fewayne, 06 May 2021 - 01:23 PM.


#5 bobzeq25

bobzeq25

    ISS

  • *****
  • Posts: 24,653
  • Joined: 27 Oct 2014

Posted 06 May 2021 - 01:24 PM

Your questions were in no way funny.  They're common questions.

 

Basics.  Really basic.  Important stuff.  After that, how to use numbers.  It's not trivial, and it's less important than the basics.

 

Don't expect to see much in one sub, that's why we take a lot of them.

 

Beginners often overexpose subs (because they want to see something).  That ruins both dynamic range and star color.

 

What counts is the total imaging time, how many total photons we capture.  How you divide it into subs is much less important.

 

The fact that you find it difficult to do adequate total imaging time (my rule of thumb - one hour minimum, 2 is better, 4 is good) doesn't make it less important.  You won't fix that with longer subs.  You won't make it _any_ better, at all.  Not at all.  Zip.

 

Look at this video.

 

https://www.youtube....h?v=3RH93UvP358

 

The chart after 50 minutes will give you an excellent starting point for subexposure.

 

But total imaging time is way more important.  What counts is how many photons you capture.  Period.  End of story.  Higher gain, longer subs, do not capture more photons.

 

If that doesn't work for you, look into electronically assisted astronomy.

 

Now, the less important stuff.  Doing this by the numbers.  Requires some research.  Much less important.

 

Find a program that will let you measure the average ADU of an unstretched light (I've used PixInsight and IRIS, there are others), corrected by subtracting the average ADU of a bias.

 

Using data from the camera manufacturer convert the ADU into electrons.

 

Also using data from the camera manufacturer, find the read noise in electrons, and square it.

 

The first number should be 5-10 X the second.
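The recipe above can be wrapped in a few lines. Every input here (the ADU means, the e-/ADU conversion, the read noise) is a placeholder you would replace with statistics from your own frames and the manufacturer's charts:

```python
def exposure_check(mean_light_adu, mean_bias_adu, e_per_adu, read_noise_e):
    """Apply the rule described above: the bias-corrected light signal,
    in electrons, should be 5-10x the read noise squared.

    All inputs are example values read off your own frames and the
    camera maker's gain/read-noise charts.
    """
    sky_e = (mean_light_adu - mean_bias_adu) * e_per_adu
    target_lo, target_hi = 5 * read_noise_e**2, 10 * read_noise_e**2
    if sky_e < target_lo:
        return "too short: lengthen subs or raise gain"
    if sky_e > target_hi:
        return "longer than needed: shorter subs would do"
    return "in the 5-10x sweet spot"

# e.g. mean light 900 ADU, mean bias 500 ADU, 0.05 e-/ADU, RN 1.5 e-
print(exposure_check(900, 500, 0.05, 1.5))
```

Because the target is a range (5x to 10x), a fairly wide band of sub lengths passes the check, which matches the point made later in the thread.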

 

If you want to know why this works, and what to do about exceptional cases, this book has an excellent discussion.  Several pages, not easily summarized here.  The basic idea is that your images contain both read noise, and sky noise (larger in light pollution) and you want to try to compensate for each, while keeping good dynamic range and star color.  Lots of short subs, too much read noise.  Few long subs, bad dynamic range and star color.

 

https://www.amazon.c...h/dp/1138055360

 

The pretty pictures do not come easy.  <smile>  There's always something more to learn, part of the charm for many of us.


Edited by bobzeq25, 06 May 2021 - 02:06 PM.

  • Desertanimal likes this

#6 bobzeq25

bobzeq25

    ISS

  • *****
  • Posts: 24,653
  • Joined: 27 Oct 2014

Posted 06 May 2021 - 02:03 PM

An extreme example (F2, Bortle 7) of total imaging time being more important than subexposure.  A concept many beginners find difficult.

 

At that F number and light pollution level, the exposure that got (corrected) light ADU to 10 X read noise squared was 10 seconds.  Needless to say, you didn't see much in a 10 second sub.

 

But do 10 second subs for a little less than 2 hours total imaging time, and I got this.  Photons.  <smile>

 

Pleadies 2019 V3_smaller.jpg

  • 06AwzIyI and unimatrix0 like this

#7 dswtan

dswtan

    Viking 1

  • *****
  • Posts: 622
  • Joined: 29 Oct 2006
  • Loc: Morgan Hill, CA

Posted 06 May 2021 - 02:16 PM

Totally agree with fewayne. I use unity gain, 60 sec, offset 40 (or whatever, just be consistent), 0C. That is all I need to take pictures.

 

The quest for "optimal" in this hobby is a potential path if that's what drives the OP, but if the hobby is about taking photos, then you don't need to optimize to anything like the extent described. It's a "not seeing the wood for the trees" situation in Beginning Deep Sky Imaging. It's a discussion that is potentially relevant in Experienced Deep Sky Imaging. 

 

I personally found this helpful, and shorter than getting into a whole book about it (again, unless that's the interest you want to pursue rather than just take pictures). 

https://astronomy-im...com-driver.html



#8 unimatrix0

unimatrix0

    Ranger 4

  • -----
  • topic starter
  • Posts: 344
  • Joined: 03 Jan 2021

Posted 06 May 2021 - 02:45 PM

Thanks for the responses. I did read that link regarding the testing of the 533MC Pro and exposure settings. I also watched that YouTube video from the SharpCap creator.

 

As for why I don't go with the low offset: even though I'm not clipping, I get a serious banding problem that clears right up above offset 30.

 

I think what confuses me, going through Astrobin and other sites, is the data I see on what people did to get the image. I see very long exposures and I'm not sure how they do it, if they're only using an IR-cut filter and have a worse sky than I do. I understand narrowband filter exposures require A LOT of time; I'm not talking about mono CMOS, just one-shot color.

Well, according to SharpCap's creator Dr Robin Glover, a lot of people are, to put it frankly, wasting a lot of time, because beyond a certain gain/exposure the benefit of doing 2x-longer exposures is only a very small amount of extra data.


Edited by unimatrix0, 06 May 2021 - 02:46 PM.


#9 dswtan

dswtan

    Viking 1

  • *****
  • Posts: 622
  • Joined: 29 Oct 2006
  • Loc: Morgan Hill, CA

Posted 06 May 2021 - 03:34 PM

  Well, according to Sharpcap's creator Dr Robin Glover, a lot of people - to put it frankly- wasting a lot of time, because after a certain amount of gain/exposure, the benefits of doing 2x as long exposures are very low amount extra data by doing so. 

Assuming the data on Astrobin is accurate (not necessarily - it's manually entered), "wasting time" is arguable, because those who take 5-min exposures vs. 1-min are still trying to achieve the same total integration, as Bob mentioned. Integration is (almost) everything.

 

But us 1-minute folks (vs. say 5-min) have 5x more files and storage to wade through — is that wasting time? Personally it's a good tradeoff for me because I have mostly had shallow-well cameras, brightish skies, and vibration issues. I also don't care about the storage impact. And PI's Blink is pretty helpful. So 1-min works for me, and I never see myself taking 5-min (or whatever) with today's technologies and my current rigs.

 

Bottom line: optimal is local and ultimately personal. Pick a set of numbers and see if that works for what you want to achieve; iterate from there by your own experience. And enjoy! :-)



#10 bobzeq25

bobzeq25

    ISS

  • *****
  • Posts: 24,653
  • Joined: 27 Oct 2014

Posted 06 May 2021 - 03:41 PM

Thanks for the responses, I did read into that link regarding the testing of the 533 mc pro and exposure settings.  I also watched that youtube video from the Sharpcap creator. 

 

Why I don't go with the low offset though, even though I'm not clipping, I got a serious banding problem that clears right up above offset 30. 

 

I think what confuses me is going through Astrobin and other sites is the data I see what people are doing to get the image. I see very long exposures and I'm not sure how they do it, if they are only using an IR-cut filter and they got a worse sky than I do. I understand narrowband filter exposures require  A LOT of time and I'm not talking about mono cmos, just one shot color. 

  Well, according to Sharpcap's creator Dr Robin Glover, a lot of people - to put it frankly- wasting a lot of time, because after a certain amount of gain/exposure, the benefits of doing 2x as long exposures are very low amount extra data by doing so. 

Offset (like subexposure) is not a big deal.  40 or 50 is fine.

 

Hear me now, and believe me later.  <smile>

 

What subexposure other people use has _no_ relevance to you.  (very common beginner mistake to think it does)  It depends on many things.  You need a site specific determination of what's correct for you.  I'm certainly not trying to say you should take 10 second subs.

 

The virtue of (corrected) lights ADU equals 5-10 X read noise squared is that it includes all the important site specific factors.

 

Note that too long subexposures don't waste time.  They produce bad data.

 

Note also that 5-10 X means there's a wide range of subexposures that are "good".


Edited by bobzeq25, 06 May 2021 - 03:44 PM.

  • dswtan likes this

#11 06AwzIyI

06AwzIyI

    Vostok 1

  • *****
  • Posts: 194
  • Joined: 14 Jul 2020

Posted 06 May 2021 - 03:44 PM

An extreme example (F2, Bortle 7) of total imaging time being more important than subexposure.  A concept many beginners find difficult.
 
At that F number and light pollution level, the exposure that got (corrected) light ADU to 10 X read noise squared was 10 seconds.  Needless to say, you didn't see much in a 10 second sub.
 
But do 10 second subs for a little less than 2 hours total imaging time, and I got this.  Photons.  <smile>
 
Pleadies 2019 V3_smaller.jpg


Nice, was this with 0 or unity gain?

#12 bobzeq25

bobzeq25

    ISS

  • *****
  • Posts: 24,653
  • Joined: 27 Oct 2014

Posted 06 May 2021 - 04:21 PM

Nice, was this with 0 or unity gain?

Zero.  Unity would have required even shorter subs, and 650 subs was enough.  <smile>


  • 06AwzIyI likes this

#13 06AwzIyI

06AwzIyI

    Vostok 1

  • *****
  • Posts: 194
  • Joined: 14 Jul 2020

Posted 06 May 2021 - 05:22 PM

Zero.  Unity would have required even shorter subs, and 650 subs was enough.  <smile>


Have you tried unity and if so, does it result in a better image? The reason I ask is, I have a relatively fast scope (f/4) and would like to increase exposure time to cut down the processing time. I was thinking about going zero gain but I don't want to sacrifice image quality...

#14 bobzeq25

bobzeq25

    ISS

  • *****
  • Posts: 24,653
  • Joined: 27 Oct 2014

Posted 06 May 2021 - 07:03 PM

Have you tried unity and if so, does it result in a better image? The reason I ask is, I have a relatively fast scope (f/4) and would like to increase exposure time to cut down the processing time. I was thinking about going zero gain but I don't want to sacrifice image quality...

I wouldn't use unity for broadband, F2 and Bortle 7, I want the longest subs I can possibly get.  Zero gain.  The quality is fine for me, it doesn't seem to be camera limited.

 

I've used unity for narrowband, slower scopes...  I wouldn't say there's a noticeable gain in quality, but these are different images, different targets, different processing....  No true comparison is possible.



#15 unimatrix0

unimatrix0

    Ranger 4

  • -----
  • topic starter
  • Posts: 344
  • Joined: 03 Jan 2021

Posted 06 May 2021 - 08:38 PM

Personally it's a good tradeoff for me because I have mostly had shallow-well cameras, brightish skies, and vibration issues. I also don't care about the storage impact. And PI's Blink is pretty helpful. So 1-min works for me, and I never see myself taking 5-min (or whatever) with today's technologies and my current rigs.

 

 

My other camera has a small well (QHY183C) and it does take excellent pictures, but I also have over 400 GB worth of subs that I produced in the last 4 months and am still wading through.
Here is what I tried with this camera; these aren't based on calculations, just my own experimentation:


15 seconds (gain 30, offset 20)

30 seconds (gain 11, offset 30)

45 seconds (gain 11, offset 30)

60 seconds (gain 11, offset 30)

120 seconds (gain 11, offset 30)

300 seconds (gain 11, offset 30)

(the gain 11 comes from a recommendation from some users, but now I'm doubting whether I should use it, except maybe with an LP filter)

equipment used: 

W.O. Z61 (61mm / 360mm APO)

The camera has 2.4um x 2.4um pixels (specs link).
Full well is at 15k, so it's easy to oversaturate, especially with the scope I use.

I still haven't processed all these images; I'm missing darks and bias files (currently producing them). These are just leftover images that I put aside to process later, because I doubt they are usable.

 

According to what I've learned with this particular camera, next time I should try gain 5, or gain 0, with maybe 45-60 second subs; there's actually no point making much longer subs, maybe 90s max?

The e-well drops sharply beyond gain 2 or 3, and the "unity gain" at gain 10 does not give any advantage, unlike some ZWO cameras whose mode switch makes the dynamic range jump back up. This camera does not do that: the dynamic range decreases linearly, and the readout noise decreases only very gradually beyond gain 5.
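The dynamic-range tradeoff described here is easy to compute yourself: DR in stops is just log2(full well / read noise). The full-well and read-noise figures below are illustrative placeholders, not actual QHY183C specs:

```python
import math

def dynamic_range_stops(full_well_e, read_noise_e):
    """Dynamic range in photographic stops: log2(full well / read noise).

    Both inputs are in electrons; the values used below are invented
    examples, not measured camera figures.
    """
    return math.log2(full_well_e / read_noise_e)

print(round(dynamic_range_stops(15000, 2.7), 1))  # low gain: 12.4 stops
print(round(dynamic_range_stops(4000, 1.6), 1))   # higher gain: 11.3 stops
```

When a camera has no dual-gain mode switch, raising the gain shrinks the well faster than it shrinks the read noise, so the dynamic range falls steadily, which is exactly the linear decrease described above.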



