Should we stack only light frames and forget the others, I say yes and I've got the data to prove it?

587 replies to this topic

#326 Jon Rista

Jon Rista

    ISS

  • *****
  • Posts: 25,370
  • Joined: 10 Jan 2014
  • Loc: Colorado

Posted 28 December 2020 - 12:38 PM

It is quite obvious this is not really a nuanced debate... it is more about us vs. them, which is quite ridiculous. The theory is all fine. I understand it just as well as anyone here... why don't we apply it to the real world?

 

Let me ask the question to the experts in a different way.

 

Let's say I am imaging from my SQM 18.50-19.00 backyard with the 2600MC. How much total contribution will the dark current have relative to LP and other unwanted signals? What about its noise contribution vs. other sources? Do you think it is material?

 

Let's try the above from an SQM 21.4 site. How much does the 2600 dark current matter?

It depends on the sensor temp. How deeply CAN you cool it? Deep? Or not? If you are imaging on 85 degree nights, you might find it's a bit tough to achieve those ultra-low dark current levels. Something I've learned about CMOS is that its dark current doubles more frequently with temperature than CCD most of the time... where a CCD might double every 6-7C, a CMOS might double every 4-5C. Many of the current generation CMOS sensors cannot cool more deeply than a ~30C delta-T or thereabouts, which can have a definite impact on the amount of dark current you have... and thus, of course, the magnitude of the DFPN.
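To put rough numbers on that doubling rule, here is a minimal sketch. The reference dark current, doubling temperatures, and cooling capacity are illustrative assumptions, not measured specs for any of the cameras discussed in this thread.

# Minimal sketch of the "dark current doubles every N degrees C" rule of thumb:
# the sensor temperature you can actually reach sets the dark signal
# (and hence the DFPN) that ends up in each sub.
def dark_current(temp_c, ref_rate_e_s=0.02, ref_temp_c=0.0, doubling_c=5.0):
    """Assumed dark current in e-/px/s at temp_c (all values illustrative)."""
    return ref_rate_e_s * 2 ** ((temp_c - ref_temp_c) / doubling_c)

delta_t = 30.0  # assumed maximum cooling below ambient
for label, ambient_c in [("85 F night", 29.0), ("50 F night", 10.0)]:
    sensor_c = ambient_c - delta_t
    rate = dark_current(sensor_c)
    print(f"{label}: sensor at {sensor_c:+.0f} C -> {rate:.4f} e-/px/s "
          f"({rate * 300:.1f} e-/px in a 300 s sub)")

With these assumed numbers the same camera collects roughly an order of magnitude more dark signal per sub on the hot night, which is the point about limited delta-T.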


  • bobzeq25 likes this

#327 Astrojedi

Astrojedi

    Fly Me to the Moon

  • *****
  • Posts: 5,804
  • Joined: 27 May 2015

Posted 28 December 2020 - 12:41 PM

Whether you calibrate only with biases, or calibrate with darks...and of course calibrate with flats in either case...you MUST calibrate. This is not an option. If you do not calibrate, then the ESSENTIAL and critical normalization process that occurs during image integration to make all the frames optimally compatible with each other from a signal alignment and dispersion standpoint WILL NOT WORK PROPERLY!! 

 

Calibration is not an option. Period. Doesn't matter how good the technology is. 

 

Now, if the debate is just about whether for some particular sensor A you "can" calibrate with "just" a master bias rather than a master dark...well, you need to determine that on a sensor-by-sensor (and I'd say even camera-by-camera) basis. In some cases, you may need to characterize a sensor more than once over its lifetime to answer that question. If there is no notable difference in DFPN between a dark and a bias, then you may be able to calibrate with only a master bias. You MUST, however, CALIBRATE! Otherwise, the rest of the math for integration will not work properly. Dithering alone is not going to remove the offsets added to each pixel by the camera circuitry. 

You are mis-characterizing the argument. No one is arguing the validity of the imaging theory.

 

 

The question is how much the dark current contribution matters in relation to the other unwanted signal and noise terms in a real-world situation for this particular sensor architecture (533/2600). From my SQM 18.5 backyard (or typical skies) I would say not much, subjectively or objectively.


Edited by Astrojedi, 28 December 2020 - 12:46 PM.


#328 skysurfer

skysurfer

    Apollo

  • -----
  • Posts: 1,212
  • Joined: 05 Oct 2009
  • Loc: EU, N 52 E 6

Posted 28 December 2020 - 12:42 PM

This depends on the user's requirements; if you are a real pixel peeper, I'd say yes.

 

I don't need darks for the EOS 6D, as it has very low dark current.

I made flats in the past, but they made things worse in many cases.

 

But I learned from Roger Clark's site that I can also use the lens profile for vignetting to flatten the image, and even the individual frames, with Adobe Raw.

That does not, however, apply to telescope objectives, as there is no precompiled lens profile.

Later on I found an even better idea: synthetic flats.  I followed this tutorial from astrobackyard.com and it works awesome.

 

So now I preprocess the frames with Adobe Raw beforehand (sometimes correcting contrast / brightness / dehaze / blacks / lens profile, including chromatic aberration) and save them as TIFFs, stack the frames using Siril (or sometimes Photoshop), and then postprocess, starting with synthetic flats. Even in my moderate (Bortle 5) backyard, it works.
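For readers curious what a synthetic flat amounts to in practice, a minimal sketch follows. It is a generic approach (estimate the smooth background from the stacked image itself and divide it out), not the specific astrobackyard.com recipe, and the filter sizes are arbitrary assumptions.

# Minimal sketch of a synthetic flat: model the smooth background
# (vignetting + gradient) from the stack itself, then divide it out.
import numpy as np
from scipy.ndimage import median_filter, gaussian_filter

def synthetic_flat(stacked_image, star_box=64, blur_sigma=128):
    """Build a star-free, heavily smoothed model of the background,
    normalized like a flat."""
    background = median_filter(stacked_image, size=star_box)    # suppress stars
    background = gaussian_filter(background, sigma=blur_sigma)  # keep only low frequencies
    return background / np.median(background)

# usage, assuming `img` is the linear stacked image as a 2D float array:
# flattened = img / synthetic_flat(img)

This only works when the frame is dominated by smooth background; dense nebulosity gets flattened along with the vignette, which is the usual caveat with any data-derived flat.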


Edited by skysurfer, 28 December 2020 - 12:44 PM.


#329 Higgsfield

Higgsfield

    Ranger 4

  • -----
  • Vendors
  • topic starter
  • Posts: 387
  • Joined: 10 Sep 2020

Posted 28 December 2020 - 12:55 PM

ASI533MC Pro and ASI2600MC Pro Dark Frames taken from the ZWO website.

 

https://astronomy-im...no-amp-glow.jpg

 

https://astronomy-im...no-amp-glow.png

 

The white pixels do not seem to correlate frame to frame in my experience. My darks look just like the one above for the 533. A super bias is entirely a dark, featureless, solid grey.

 

On the issue of light pollution removal: I would suggest that it is more effectively removed, along with vignetting, in an uncalibrated master light frame. The reason is that dividing the master light by the master flat creates high-frequency highs and lows in the residual gradient that are subsequently hard to remove. At least this is what I've found. The more one tries to remove them, the higher in frequency they become. An uncalibrated light frame has basically two very low frequency components (vignette + light pollution). This all assumes dust motes and donuts are not a problem.



#330 Jon Rista

Jon Rista

    ISS

  • *****
  • Posts: 25,370
  • Joined: 10 Jan 2014
  • Loc: Colorado

Posted 28 December 2020 - 12:58 PM

Flats: This is where it gets tricky. Flats by definition capture the transfer function of the imaging setup. A master flat includes both linear and non-linear components, a dust mote being an example of a non-linear component. This is really a cleanliness issue! A dust speck on the outermost surface of the telescope will cast a very faint shadow and not likely be visible in a stacked light master, while a water spot on the glass cover of the camera will cast a very deep shadow. That said, if after calibration and integration there are no non-linear imperfections, and one finds DBE necessary, one will achieve better results from not using flats. Instead, DBE, if done correctly, will precisely remove vignetting (a radial gradient), even with a lot of nebulosity. To do this precisely with a master flat is very difficult, because the master light is being divided by the master flat, and all sorts of additional problems of a non-linear nature can be introduced by this operation. Again, if after calibration using flats DBE is still required to remove gradients, then the image has likely been degraded by the calibration process.

Not sure who said you can get better results with just DBE and no flats...but I VERY, VERY, VERY STRONGLY disagree! Having had to deal with correcting issues like dust motes and even vignetting with DBE in the past, along with LP gradients, I speak from experience and can tell you it is NOT an easy task, and 99% of the time you will lose your data set and be unable to get a good result if you do not calibrate with flats. 

 

Yes, it CAN be possible to correct dust motes with just DBE...it is extremely difficult and usually requires an extremely, extremely dense net of samples, along with multiple passes of correction with aggressive settings. Such a DBE-only correction is usually only possible with a largely empty field...i.e. globs, galaxies, and only when they do not fill the frame. Even in such images, the level of DBE gradient removal that has to be applied is usually devastating to any other structure in the field, and tends to make the field look unrealistic in the background signal (I've done it on more than one occasion). To be quite blunt: any field full of object structure that also has transfer issues that flats normally correct for, is IMPOSSIBLE to fix with just DBE. DBE cannot on its own separate dust motes, vignetting and other shading from other structure in a packed field. 

 

Consider:

 

Original field (mis-corrected dust motes...flats were from a slightly rotated setup, accidental! The ONLY time I would recommend even trying this!):

aKuaXQ2.jpg

 

Field with extremely dense net of DBE samples applied:

IeRYAPN.jpg

 

(Imperfectly) Corrected field:

QrjWt5w.jpg

 

The field is corrected, but it is too flat overall and still not fully corrected in several areas, even after all that (immense) effort to build that dense network of samples. 

 

 

There is another reason to use flats as well. Flats correct for another kind of FPN. Where darks correct for DFPN, which is the result of DSNU (dark signal non-uniformity), flats correct for FPN that is the result of PRNU (photo response non-uniformity). Flats, and only flats, can correct for this kind of FPN. Note that as you stack more, this kind of FPN, if left uncorrected, can limit SNR much sooner than DFPN. A lot of the time, this kind of FPN is very Gaussian, so it is one of those "unseen restrictions" on uncalibrated data. Again, dithering can randomize this pattern...but then you have another source of noise (in addition to the normal noises, as well as DFPN).

 

Calibration is not an optional thing IMHO. Calibration is an essential part of the process of CORRECTING your raw data so that it can be processed correctly, and so that you can minimize the amount of noise in your data and maximize your SNR. DBE is not even remotely an alternative to flat frames for correcting field issues. Flat frames are exact and know everything about exactly what they are intended to correct; DBE can only model based on the data, it has no prior knowledge of the issues you wish to correct with it, and thus it is your job, the user's, to guide it.
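For reference, the correction being described here boils down to the standard calibration arithmetic. A minimal sketch; the function and variable names are mine, and subtracting a matching flat-dark is one common convention rather than the only one.

# Standard calibration: subtract the master dark (bias/offset + dark signal,
# i.e. the DFPN), then divide by a normalized master flat (vignetting,
# dust shadows, PRNU).
import numpy as np

def calibrate(light, master_dark, master_flat, master_flat_dark):
    """All inputs are 2D float arrays with matching exposure/temperature
    assumptions: master_dark matches the lights, master_flat_dark matches the flats."""
    flat = master_flat.astype(np.float64) - master_flat_dark
    flat /= np.mean(flat)                       # normalize so division preserves flux
    return (light.astype(np.float64) - master_dark) / flat

# If a given sensor's DFPN really is negligible, a master bias could stand in
# for master_dark here -- but some form of calibration still happens.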


  • psandelle, ezwheels and endless-sky like this

#331 Jon Rista

Jon Rista

    ISS

  • *****
  • Posts: 25,370
  • Joined: 10 Jan 2014
  • Loc: Colorado

Posted 28 December 2020 - 01:06 PM

You are mis-characterizing the argument. No one is arguing the validity of the imaging theory.

 

 

The question is how much the dark current contribution matters in relation to the other unwanted signal and noise terms in a real-world situation for this particular sensor architecture (533/2600). From my SQM 18.5 backyard (or typical skies) I would say not much, subjectively or objectively.

That's all just assumption.

 

As I've said several times recently in this thread...each user should model (characterize) their sensor's DFPN and figure out what the impact is themselves. Further...how bright your skies are is not in and of itself the determining factor in whether DFPN will affect the data or not. How deeply you are exposing the background sky is really what matters there...and, you can underexpose under bright skies just as much as you can underexpose under dark skies. It's easier to underexpose under dark skies, but it's not impossible to underexpose under bright skies. In fact, with certain setups...big aperture setups...I would actually say it is easy to underexpose under any skies, bright or dark, with CMOS sensors at higher gains/HCG modes, due to the rate of stellar saturation. Stellar saturation increases with both reduction in f-ratio AND increase in aperture, whereas extended object saturation is only affected by f-ratio. So stellar saturation accelerates as you change both...short, big-aperture scopes (i.e. RASA, Hyperstar, and similar) are usually the WORST scopes to pair with high gain CMOS sensors, as the stars will rip through the limited FWC in short order, well before the background sky has reached a reasonable signal level.
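"How deeply you are exposing the background sky" can be put in rough numbers. A minimal sketch; the sky flux values, sub length, and the 10x swamping criterion are common rules of thumb assumed for illustration, not figures taken from this thread.

# Rough check of background sky depth against read noise.
import math

def background_check(sky_e_per_px_s, exposure_s, read_noise_e, swamp_factor=10.0):
    """Return sky electrons per pixel per sub and whether they 'swamp' the read
    noise, using the common sky >= swamp_factor * read_noise^2 criterion."""
    sky_e = sky_e_per_px_s * exposure_s
    return sky_e, sky_e >= swamp_factor * read_noise_e ** 2

# Same camera (3.5 e- read noise), 120 s subs, illustrative sky fluxes;
# actual fluxes depend heavily on scope, f-ratio, filters and pixel scale.
for label, flux in [("SQM ~18.5 backyard", 5.0), ("SQM ~21.4 dark site", 0.35)]:
    sky_e, deep_enough = background_check(flux, 120, 3.5)
    print(f"{label}: {sky_e:.0f} e- sky/px per sub, read noise swamped: {deep_enough}")

With these assumed numbers the bright-sky sub is comfortably swamped while the dark-site sub is not, which is the "easier to underexpose under dark skies" point in concrete form.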

 

People often come to me with image processing problems. On more than one occasion, I've had people come to me with problems that ultimately resulted from having so little actual signal of interest in their field that they could hardly get anything out of the data. Most of those cases were short, fast scopes, although not all (sometimes people just don't expose long enough). In all cases, the DFPN was clearly apparent through the extremely thin veneer of object and sky signal above it. 

 

There have been many threads on these forums over the years on this very topic, covering the issues with star clipping and poor quality signal when using higher gain cameras, or simply cameras with smaller FWC, on big, fast scopes. It is a well documented problem. One of the main reasons why the object signal looks like crap in such situations? It's extremely shallow, and the DFPN is, in RELATIVE terms, significant as a result.

 

So I would say that both subjectively and objectively, it DEPENDS. tongue2.gif  


Edited by Jon Rista, 28 December 2020 - 01:18 PM.

  • psandelle likes this

#332 sn2006gy

sn2006gy

    Vendor - Rockchuck Summit Observatory

  • ****-
  • Vendors
  • Posts: 1,542
  • Joined: 04 May 2020
  • Loc: Austin, TX

Posted 28 December 2020 - 01:12 PM

I am not "implying" anything...I've been extremely explicit, with exact math, in my posts. There is no implication here. There is simple mathematical fact and clear, exact visual demonstrations. shrug.gif

You're implying math at all costs, i.e., math is the bottom line. Your visual demonstrations are not on the 533/2600/6200 sensors and come from demonstrations where dark calibration is absolutely necessary, and more importantly they come from situations where not dark calibrating absolutely impacts the final master lights, right?

 

If my 40-dark calibration was enough, why did the discussion turn into walking noise, and I quote, "if I dark calibrated and dithered more I wouldn't have walking noise" - in which case my final master lights have no walking noise either.

 

I'm curious why the argument feels like entrapment rather than discussion. Everything is framed as if no one can do anything other than agree with maths.

 

Even if we agree with the math - at which point is it enough?

 

(Edit: this is in regard to darks only... I have no skin in the "not calibrating, period" debate, lol - other than that it's quite possible on the 533 and the OP isn't wrong for citing it... and please know that I value your work extremely, Jon)

 

I.e., would I have been fine with 5 darks? 10 darks? 20 darks? Is 100 darks mathematically superior?

 

And to what end, when none of those calibrations has any bearing on my final master light that I can see? 

 

The only difference I can see from dark calibration was in my rejection subs, and those rejection subs made me ask more questions rather than see answers, since the rejection from the dark-calibrated subs looks exactly like what a non-dark-calibrated rejection did, only "softer" - and if it's "softer", does that mean not enough darks were applied?


Edited by sn2006gy, 28 December 2020 - 01:22 PM.


#333 Higgsfield

Higgsfield

    Ranger 4

  • -----
  • Vendors
  • topic starter
  • Posts: 387
  • Joined: 10 Sep 2020

Posted 28 December 2020 - 01:13 PM

Not sure who said you can get better results with just DBE and no flats...but I VERY, VERY, VERY STRONGLY disagree! Having had to deal with correcting issues like dust motes and even vignetting with DBE in the past, along with LP gradients, I speak from experience and can tell you it is NOT an easy task, and 99% of the time you will lose your data set and be unable to get a good result if you do not calibrate with flats. 

 

Yes, it CAN be possible to correct dust motes with just DBE...it is extremely difficult and usually requires an extremely, extremely dense net of samples, along with multiple passes of correction with aggressive settings. Such a DBE-only correction is usually only possible with a largely empty field...i.e. globs, galaxies, and only when they do not fill the frame. Even in such images, the level of DBE gradient removal that has to be applied is usually devastating to any other structure in the field, and tends to make the field look unrealistic in the background signal (I've done it on more than one occasion). To be quite blunt: any field full of object structure that also has transfer issues that flats normally correct for, is IMPOSSIBLE to fix with just DBE. DBE cannot on its own separate dust motes, vignetting and other shading from other structure in a packed field. 

 

Consider:

 

Original field (mis-corrected dust motes...flats were from a slightly rotated setup, accidental! The ONLY time I would recommend even trying this!):

aKuaXQ2.jpg

 

Field with extremely dense net of DBE samples applied:

 

 

I think this is an excellent example of what I've been saying. The poor calibration due to the rotation resulted in high frequency gradients that are hard if not impossible to remove with DBE. Low frequency gradients are not so hard to remove with DBE and do not require lots of samples, just the right sample pattern. Do you by chance have an uncalibrated light master (or just calibrated with bias/darks, no flats)?



#334 Jon Rista

Jon Rista

    ISS

  • *****
  • Posts: 25,370
  • Joined: 10 Jan 2014
  • Loc: Colorado

Posted 28 December 2020 - 01:15 PM

I think this is an excellent example of what I've been saying. The poor calibration due to the rotation resulted in high frequency gradients that are hard if not impossible to remove with DBE. Low frequency gradients are not so hard to remove with DBE and do not require lots of samples, just the right sample pattern. Do you by chance have an uncalibrated light master (or just calibrated with bias/darks, no flats)?

I have the data somewhere, I never delete anything. I don't know where at the moment, though. I can say though that the field without flat calibration was a horror. It always is when dust motes are in play.


  • bobzeq25 likes this

#335 Astrojedi

Astrojedi

    Fly Me to the Moon

  • *****
  • Posts: 5,804
  • Joined: 27 May 2015

Posted 28 December 2020 - 01:20 PM

That's all just assumption.

 

As I've said several times recently in this thread...each user should model (characterize) your sensor's DFPN and figure out what the impact is themselves. Further...how bright your skies are is not in and of itself the determining factor in whether DFPN will affect the data or not. How deeply you are exposing the background sky is really what matters there...and, you can underexpose under bright skies just as much as you can underexpose under dark skies. Its easier to underexpose under dark skies, but its not impossible to underexpose under bright skies. In fact, with certain setups...big aperture setups, I would actually say it is easy to underexpose under any skies, bright or dark, with CMOS sensors at higher gains/HCG modes, due to the rate of stellar saturation. Stellar saturation increases with both reduction in f-ratio AND increase in aperture, whereas extended object saturation is only affected by f-ratio. So stellar saturation accelerates as you change both...short big aperture scopes (i.e. RASA, hyperstar, and similar) are usually the WORST scopes to pair with high gain CMOS sensors as the stars will rip through the limited FWC in short order, well before background sky has reached a reasonable signal level.

 

There have been many threads on these forums over the years on this very topic covering the issues with star clipping and poor quality signal when using higher gain cameras or simply cameras with smaller FWC on big, fast scopes. It is a well documented problem. One of the main reasons why the object signal looks like crap in such situations? Its extremely shallow, and the DFPN is, in RELATIVE terms, significant as a result.

 

So I would say that both subjectively and objectively, it DEPENDS. tongue2.gif  

Sorry Jon but you are generalizing again. You did not answer my very specific question...
 



#336 Jon Rista

Jon Rista

    ISS

  • *****
  • Posts: 25,370
  • Joined: 10 Jan 2014
  • Loc: Colorado

Posted 28 December 2020 - 01:24 PM

You're implying math at all cost, ie, math is the bottom line.  Your visual demonstrations are not on the 533/2600/6200 sensors and come from demonstrations where dark calibration is absolutely necessary and more importantly they come from situations where not dark calibrating absolutely impacts the final master lights right?
 
If my 40 dark calibrated sub was enough, why did the discussion turn into walking noise and i quote "if i dark calibrated and dithered more i wouldn't have walking noise" - in which case my final master lights have no walking noise either.
 
I'm curious why the argument feels like entrapment rather than discussion. Everything is framed as if no one can do anything other than agree with maths.
 
Even if we agree with the math - at which point is it enough?
 
IE, would i have been fine with 5 darks? 10 darks? 20 darks? Is 100 darks more mathematically superior?
 
And to what ends when any of those calibrations has no bearing on my final master light that I can see? 
 
The only difference i can see in dark calibration was in my rejection subs and those rejection subs made me ask more questions rather than see answers since the rejection of the dark calibrated subs looks exactly like what a non dark calibrated rejection did, only "softer" and if its "softer" does that mean not enough darks were applied?

Math is not an implication. tongue2.gif It's just math. The math speaks for itself...and my math wasn't based on any particular sensor, just on the read noise at lower gains that is fairly common to ANY of those sensors, or many others.
 
As for how much is enough...do you simply not understand what I've said in my previous posts? I've already explained to you what is enough. In very clear, explicit, and exact terms. What do you not understand about my prior posts? Are you not able to run the math yourself for 5, 10, 20 and 100 darks, with any amount of read noise, any amount of object and sky signal, and determine, WITH THE EXPLICIT MATH, what is or is not enough? I've stated clearly that a 0.04e- impact from the remnant read noise in a 40-frame master dark, with 3.5e- read noise, is UTTERLY MOOT, and when you add the object and sky signals, LESS than UTTERLY MOOT. How does that not compute??? How are you not able to extrapolate from that, and apply just the concept to ANY other sensor, read noise, object signal???
 
gaah.gif


  • psandelle and bobzeq25 like this

#337 sharkmelley

sharkmelley

    Fly Me to the Moon

  • *****
  • Posts: 6,113
  • Joined: 19 Feb 2013
  • Loc: UK

Posted 28 December 2020 - 01:24 PM

ASI533MC Pro and ASI2600MC Pro Dark Frames taken from the ZWO website.

 

https://astronomy-im...no-amp-glow.jpg

 

https://astronomy-im...no-amp-glow.png

 

The white pixels do not seem to correlate frame to frame in my experience. 

That's what I'm very interested to see, e.g. by looking at 3 or 4 consecutive darks to demonstrate that there's no frame-to-frame correlation between the white pixels.
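That check can be scripted directly. A minimal sketch; the file names and the sigma threshold are placeholders, and loading the shared darks as FITS via astropy is an assumption about their format.

# Do the hot ("white") pixels land in the same places in consecutive darks?
import numpy as np
from astropy.io import fits

def hot_pixel_mask(path, sigma=10.0):
    """Flag hot pixels as robust outliers above the frame median."""
    data = fits.getdata(path).astype(np.float64)
    med = np.median(data)
    mad = np.median(np.abs(data - med))
    return data > med + sigma * 1.4826 * mad

mask1 = hot_pixel_mask("dark_001.fits")   # consecutive darks, placeholder names
mask2 = hot_pixel_mask("dark_002.fits")
repeat_fraction = np.logical_and(mask1, mask2).sum() / max(mask1.sum(), 1)
print(f"hot pixels in frame 1: {mask1.sum()}, "
      f"fraction repeating in frame 2: {repeat_fraction:.2f}")

# A high repeat fraction means a correlated (fixed) pattern that dark subtraction
# removes; a fraction near zero means uncorrelated outliers that rejection and
# dithering handle.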

 

Mark


Edited by sharkmelley, 28 December 2020 - 01:32 PM.


#338 sharkmelley

sharkmelley

    Fly Me to the Moon

  • *****
  • Posts: 6,113
  • Joined: 19 Feb 2013
  • Loc: UK

Posted 28 December 2020 - 01:24 PM

BTW, if you want something interesting (but totally off topic) to investigate, why does the FFT of the ASI2600MC PRO master dark look like this:

 

ASI2600MCPRO.png

 

Mark

 

Good question - how did you generate this? I'm curious.

 

AND.. that looks a lot more like the pattern I see "smashed all over the place" in my rejection subs.

 

Also, does anyone have this from other cameras?  And how would dark calibration even help here? 

I generated it from the master dark you generously made available earlier in this thread by taking a 2048x2048 crop of the centre and applying the PixInsight FourierTransform process.

 

A Fourier transform of a long exposure dark normally contains unstructured random noise.  I have no idea of the cause of what we're seeing here but I'm very intrigued.
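For anyone who wants to repeat the experiment outside PixInsight, a rough numpy equivalent of the step described above (the file name is a placeholder):

# Centre crop of the master dark and its Fourier magnitude.
import numpy as np
from astropy.io import fits

dark = fits.getdata("master_dark_asi2600.fits").astype(np.float64)
h, w = dark.shape
crop = dark[h//2 - 1024:h//2 + 1024, w//2 - 1024:w//2 + 1024]  # 2048x2048 centre crop

spectrum = np.fft.fftshift(np.fft.fft2(crop - crop.mean()))
magnitude = np.log1p(np.abs(spectrum))   # log-scaled magnitude for display

# Purely unstructured random noise gives a flat, featureless magnitude image;
# ridges, spikes, or grids indicate periodic structure (banding, pattern noise).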

 

Mark 



#339 bobzeq25

bobzeq25

    ISS

  • *****
  • Posts: 31,275
  • Joined: 27 Oct 2014

Posted 28 December 2020 - 01:28 PM

There are a couple of catch-22s going on here.

 

The obvious one is that, if you don't do the calibration frames well, they'll hurt your images.  If you do them well they'll help.  The difficulty involved in doing them well varies by camera.  It's quite hard to do darks really well with an uncooled DSLR.

 

The unobvious one: if you're new at this, doing the calibration frames may seem unnecessary, since your images are not good enough to show the differences. But if you don't do any calibration frames (i.e., "lights only"), you're unlikely to get good enough at data acquisition and processing to _really_ see what the calibration frames do. Not to mention not bothering to look.

 

I'm not sure there's a cure for the second.  But, there is a general rule, though some will avoid it.

 

It's always best to assume that the best and most experienced imagers do what they do for good reasons. Betting that they make silly mistakes is unlikely to be a winning bet.

 

At this point I've said my piece, and will exit this particular field of battle.  <smile>


Edited by bobzeq25, 28 December 2020 - 02:40 PM.

  • psandelle and endless-sky like this

#340 Xentex

Xentex

    Vostok 1

  • -----
  • Posts: 158
  • Joined: 10 Nov 2015
  • Loc: Philadelphia Suburbs

Posted 28 December 2020 - 01:32 PM

Seems to me there are some factual things almost everyone can agree on here, some opinion things that people are never going to agree on, and a bunch of semantics that people still feel like arguing about.

 

Things we can agree on:

- if you have a perfect optical train

- coupled with a perfect sensor

- then your calibration frames will contain no useful information

- so you can skip the whole darks, bias, flats thing.

 

- no camera or optical train is perfect

- but stuff is getting better

- to the point where some people don't feel the benefits of (step X) are worth the effort for them

 

Agree to disagree on:

- it's good enough for me, so it's good enough

- it's not good enough for me, so it's not good enough


  • psandelle likes this

#341 Jon Rista

Jon Rista

    ISS

  • *****
  • Posts: 25,370
  • Joined: 10 Jan 2014
  • Loc: Colorado

Posted 28 December 2020 - 01:33 PM

Sorry Jon but you are generalizing again. You did not answer my very specific question...
 

You didn't read my whole post if you think it doesn't apply as specifically to the specific sensor you specified under those specific skies (even though it concurrently applies to any other setup). 

 

My generalization applies, period. IT DEPENDS. I don't know what scope you are using. I don't know what scopes other people WITH THAT SENSOR, and THOSE SKIES, might be using. IT DEPENDS!

 

You seem to be trying to trap me into answering a particular way. Sorry. What I said applies directly to your specific scenario. Now, if you give me an even more explicit scenario, I might be able to give you a more explicit answer...but I won't. At this point it's all just become wordplay and traps, and that is completely useless to the general topic of the thread or other readers who might be wondering whether to calibrate or not. Which, BTW, is the explicit topic of the thread...not whether to calibrate with biases rather than darks, but whether to even bother taking dark, bias and flat frames AT ALL. My answer to THAT question is a RESOUNDING YES! Take ALL the frames, and DO CALIBRATE. ALWAYS. 


  • psandelle, bobzeq25 and idclimber like this

#342 sn2006gy

sn2006gy

    Vendor - Rockchuck Summit Observatory

  • ****-
  • Vendors
  • Posts: 1,542
  • Joined: 04 May 2020
  • Loc: Austin, TX

Posted 28 December 2020 - 01:35 PM

Math is not an implication. tongue2.gif It's just math. The math speaks for itself...and my math wasn't based on any particular sensor, just on the read noise at lower gains that is fairly common to ANY of those sensors, or many others.
 
As for how much is enough...do you simply not understand what I've said in my previous posts? I've already explained to you what is enough. In very clear, explicit, and exact terms. What do you not understand about my prior posts? Are you not able to run the math yourself for 5, 10, 20 and 100 darks, with any amount of read noise, any amount of object and sky signal, and determine, WITH THE EXPLICIT MATH, what is or is not enough? I've stated clearly that a 0.04e- impact from the remnant read noise in a 40-frame master dark, with 3.5e- read noise, is UTTERLY MOOT, and when you add the object and sky signals, LESS than UTTERLY MOOT. How does that not compute??? How are you not able to extrapolate from that, and apply just the concept to ANY other sensor, read noise, object signal???
 
gaah.gif

Full circle argument here Jon.

 

We're back to dark current and read noise instead of FPN and walking noise.

 

And no one is talking about the data I shared to try and focus the discussion. I'd just like an answer on whether 40 darks pass muster, not a reboot of the entire debate.

 

(And no... I guess I don't really care for an answer - I'm not seeking approval with these questions, if that's what people are thinking, and from the style of debate I can see it will be a non-answer that makes me question our sanity rather than meeting in the middle based on real-world experience and the data provided.)

 

Seems to me there's some factual things almost everyone can agree on here, some opinion things that people are never going to agree on, and a bunch of semantics that people still feel like arguing about.

 

Things we can agree on:

- if you have a perfect optical train

- coupled with a perfect sensor

- then your calibration frames will contain no useful information

- so you can skip the whole darks, bias, flats thing.

 

- no camera or optical train is perfect

- but stuff is getting better

- to the point where some people don't feel the benefits of (step X) are worth the effort for them

 

Agree to disagree on:

- it's good enough for me, so it's good enough

- it's not good enough for me, so it's not good enough

 

Additionally, no calibration is perfect... well, I guess nothing is perfect.


Edited by sn2006gy, 28 December 2020 - 01:44 PM.


#343 sn2006gy

sn2006gy

    Vendor - Rockchuck Summit Observatory

  • ****-
  • Vendors
  • Posts: 1,542
  • Joined: 04 May 2020
  • Loc: Austin, TX

Posted 28 December 2020 - 01:47 PM

I generated it from the master dark you generously made available earlier in this thread by taking a 2048x2048 crop of the centre and applying the PixInsight FourierTransform process.

 

A Fourier transform of a long exposure dark normally contains unstructured random noise.  I have no idea of the cause of what we're seeing here but I'm very intrigued.

 

Mark 

Is it a symptom of OSC?

 

Otherwise, I'm curious what the 6200 darks look like... I'm not sure if I still have any darks from my 533 to compare.



#344 Astrojedi

Astrojedi

    Fly Me to the Moon

  • *****
  • Posts: 5,804
  • Joined: 27 May 2015

Posted 28 December 2020 - 01:48 PM

You didn't read my whole post if you think it doesn't apply as specifically to the specific sensor you specified under those specific skies (even though it concurrently applies to any other setup). 

 

My generalization applies, period. IT DEPENDS. I don't know what scope you are using. I don't know what scopes other people WITH THAT SENSOR, and THOSE SKIES, might be using. IT DEPENDS!

 

You seem to be trying to trap me into answering a particular way. Sorry. What I said applies directly to your specific scenario. Now, if you give me an even more explicit scenario, I might be able to give you a more explicit answer...but I won't. At this point its all just become wordplay and traps, and that is completely useless to the general topic of the thread or other readers who might be wondering whether to calibrate or not. Which, BTW, is the explicit topic of the thread...not whether to calibrate with biases rather than darks, but whether to even bother taking dark, bias and flat frames AT ALL. My answer to THAT question is a RESOUNDING YES! Take ALL the frames, and DO CALIBRATE. ALWAYS. 

I am not trying to trap you. That is what I am debating, but you keep answering a question no one is debating here. You are constantly speaking in generalizations and trying to reinforce the validity of imaging theory, which no one is debating in the first place.

 

If you read my earlier posts, I recommend calibration for best results, but I am also not dogmatic about it. From what I am seeing, dark calibration is not making much of a difference to the end result for the 2600MC from my light-polluted skies (or even from my club's SQM ~21.4 'dark site', for that matter).

 

And before you twist my argument again... even if I decide not to use darks, I would still use flats (with bias subtracted), so I am not arguing against all calibration, which would be silly, to say the least. 


Edited by Astrojedi, 28 December 2020 - 02:00 PM.


#345 Jon Rista

Jon Rista

    ISS

  • *****
  • Posts: 25,370
  • Joined: 10 Jan 2014
  • Loc: Colorado

Posted 28 December 2020 - 01:49 PM

Full circle argument here Jon.
 
We're back to dark current and read noise instead of FPN and walking noise

So you don't seem to understand the basics of master dark frames, and what it means to "add noise with dark calibration." 
 
You asked how many dark frames is enough? THAT question only has to do with the remaining read noise in the master dark. Well, I guess to put it a more accurate way, it only has to do with the remaining RANDOM noise in the master dark...read noise is usually the primary random term, but there may also be random banding, etc. Random pattern is rarer, and if you have some, the answer might be different...it depends a lot on each specific sensor and exactly what kind of random patterns it may exhibit, and that requires characterizing each specific sensor. I cannot give a definitive answer there, but it's a rarer issue.
 
For the question about how many dark frames are enough to stack into a master, that has to do with read noise. Averaging has the effect of reducing read noise. Technically speaking, it's the same general formula as the SNR of a stack of frames:
 
((Sobj + Ssky) * Csubs)/SQRT(Csubs * (Sobj + Ssky + Sdark + Nread^2))
 
For dark frames, the signal of interest is in fact the dark signal itself, so you would reduce that to:

 

(Sdark * Cframes)/SQRT(Cframes * (Sdark + Nread^2))
 
However with averaging, in the end we divide the results by the number of subs. So instead of both signal and noise increasing, signal stays the same, and noise diminishes. The SOLE PURPOSE of stacking dark frames, is to average down the RANDOM NOISE TERMS to the point where they become moot relative to the noise in each light frame. 
 
THAT is why I was talking about read noise. In order to fully characterize the DFPN, the dark FIXED pattern noise, you have to average down the RANDOM noises to the point where they don't have any meaningful impact to the total noise in the end. Well, when your other noise terms are 5, 8, 10, 15 electrons or more, half an electron worth of remnant read noise in a 40 frame master dark IS, INDEED, MOOT. No meaningful impact of any kind. 
 
Do you understand that now? 
 
If so...then, once you have a CLEAN, WELL CHARACTERIZED master dark, you can calibrate hundreds of light frames with it, without concern that it will be adding noise to your image. It will be on the order of hundredths of an electron, being so drowned in the other noise terms it simply does not matter. So...how many dark frames are enough? You have the mathematical tools to determine this for yourself for any sensor, any amount of read noise, dark current noise, and even object+sky signal. You should even be able to extrapolate, fairly easily, without actually running the numbers yourself...that, well...that's an exercise for the reader. 
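The "hundredths of an electron" impact described above (the ~0.04e- figure cited earlier in the thread) can be reproduced directly from the stated assumptions: 40 darks, 3.5e- read noise, and a read-noise-limited light frame; any object or sky signal only shrinks the number further. A minimal sketch:

# Remnant random noise left in an N-frame averaged master dark, and the net
# increase when it is added in quadrature to a single calibrated sub.
import math

def master_dark_impact(n_darks, read_noise_e, per_sub_noise_e):
    remnant = read_noise_e / math.sqrt(n_darks)            # averaging reduces random noise
    total = math.sqrt(per_sub_noise_e ** 2 + remnant ** 2) # added in quadrature at calibration
    return remnant, total - per_sub_noise_e

for n in (5, 10, 20, 40, 100):
    remnant, added = master_dark_impact(n, read_noise_e=3.5, per_sub_noise_e=3.5)
    print(f"{n:3d} darks: remnant {remnant:.2f} e-, net increase {added:.3f} e- per sub")

With 40 darks this gives roughly 0.55e- of remnant noise and about a 0.04e- net increase per sub; even 10 darks only adds a couple of tenths of an electron under these assumptions.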


Edited by Jon Rista, 28 December 2020 - 02:00 PM.

  • bobzeq25 likes this

#346 sn2006gy

sn2006gy

    Vendor - Rockchuck Summit Observatory

  • ****-
  • Vendors
  • Posts: 1,542
  • Joined: 04 May 2020
  • Loc: Austin, TX

Posted 28 December 2020 - 01:56 PM

That's what I'm very interested to see e.g. by looking at 3 or 4 consecutive darks to demonstrate there's no frame-to-frame correlation between the white pixels.

 

Mark

Uploaded a bunch of consecutive darks to OneDrive that were used to generate the ones linked in this thread.

 

https://1drv.ms/u/s!...FInjvQ?e=2tepM6



#347 Jon Rista

Jon Rista

    ISS

  • *****
  • Posts: 25,370
  • Joined: 10 Jan 2014
  • Loc: Colorado

Posted 28 December 2020 - 01:59 PM

I am not trying to trap you. That is what I am debating but you keep answering a question no one is debating here. You are constantly speaking in generalizations and trying to reinforce the validity of imaging theory which no one is debating in the first place.

 

If you read my earlier posts I recommend calibration for best results but I am also not dogmatic about it. From what I am seeing dark calibration is not making much of difference to the end result for the 2600MC from my light polluted skies (or even from my club’s SQM ~21.4 ‘dark site’ for that matter).

This entire thread is a debate about the original post... The original post clearly asked the question, should people even bother taking any frame other than light frames... 

 

That IS the debate...

 

You keep hyper-specializing the question to your one specific personal use case... I honestly cannot answer that. You see what you see, but I have no idea what you look for, or how you interpret what you see in your images. I am also not really sure if you are replacing dark calibration with bias calibration, or not calibrating at all. (If you are doing the latter, not calibrating, then I can certainly make certain assumptions, but then I'd be assuming, and that's all.) I don't know what scopes you use. Etc., etc. I cannot debate something like that. So I don't. I debate what I do know, and I try to provide answers that will be useful to the readers of the thread. 


  • psandelle and bobzeq25 like this

#348 Astrojedi

Astrojedi

    Fly Me to the Moon

  • *****
  • Posts: 5,804
  • Joined: 27 May 2015

Posted 28 December 2020 - 02:05 PM

This entire thread is a debate about the original post... The original post clearly asked the question, should people even bother taking any frame other than light frames... 

 

That IS the debate...

 

You keep hyper-specializing the question to your one specific personal use case... I honestly cannot answer that. You see what you see, but I have no idea what you look for, or how you interpret what you see in your images. I am also not real sure if you are replacing dark calibration with bias calibration, or not calibrating at all. (If you are doing the latter, not calibrating, then I can certainly make certain assumptions, but then I'd be assuming, and that's all). I don't know what scopes you use. Etc. etc. I cannot debate something like that. So I don't. I debate what I do know, and I try to provide answers that will be useful to the readers of the thread. 

See my modified post above... somehow I knew you were going to twist my argument again. 

 

I am not ‘hyper-specializing’. This thread is about the 533 and 2600 sensors. The discussion is about dark frames, as it would be silly to suggest the camera can somehow ‘fix’ the optical system and eliminate the need for flats.

 

And most imagers have typical skies like mine, not the hypothetical perfect dark skies you keep assuming.


Edited by Astrojedi, 28 December 2020 - 02:06 PM.


#349 sn2006gy

sn2006gy

    Vendor - Rockchuck Summit Observatory

  • ****-
  • Vendors
  • Posts: 1,542
  • Joined: 04 May 2020
  • Loc: Austin, TX

Posted 28 December 2020 - 02:11 PM

So you don't seem to understand the basics of master dark frames, and what it means to "add noise with dark calibration." 
 
You asked how many dark frames is enough? THAT question, only has to do with the remaining read noise in the master dark. Well, I guess to put it a more accurate way, only has to do with the remaining RANDOM noise in the master dark...read noise is usually the primary random term, but there may also be random banding, etc. Random pattern is rarer, and if you have some, the answer might be different...depends a lot on each specific sensor and exactly what kind of random patterns they may exhibit, and that requires characterizing each specific sensor. I cannot give a definitive answer there, but its a rarer issue.
 
For the question about how many dark frames are enough to stack into a master, that has to do with read noise. Averaging has the effect of reducing read noise. Technically speaking, its the same general formula for the SNR of a stack of frames:
 
((Sobj + Ssky) * Csubs)/SQRT(Csubs * (Sobj + Ssky + Sdark + Nread^2))
 
For dark frames, the signal of interest is in fact the dark signal itself, so you would reduce that to:

 

(Sdark * Cframes)/SQRT(Cframes * (Sdark + Nread^2))
 
However with averaging, in the end we divide the results by the number of subs. So instead of both signal and noise increasing, signal stays the same, and noise diminishes. The SOLE PURPOSE of stacking dark frames, is to average down the RANDOM NOISE TERMS to the point where they become moot relative to the noise in each light frame. 
 
THAT is why I was talking about read noise. In order to fully characterize the DFPN, the dark FIXED pattern noise, you have to average down the RANDOM noises to the point where they don't have any meaningful impact to the total noise in the end. Well, when your other noise terms are 5, 8, 10, 15 electrons or more, half an electron worth of remnant read noise in a 40 frame master dark IS, INDEED, MOOT. No meaningful impact of any kind. 
 
Do you understand that now? 
 
If so...then, once you have a CLEAN, WELL CHARACTERIZED master dark, you can calibrate hundreds of light frames with it, without concern that it will be adding noise to your image. It will be on the order of hundredths of an electron, which will be so drowned in the other noise terms it simply does not matter. So...how many dark frames are enough? You have the mathematical tools to determine this for yourself for any sensor, any amount of read noise, dark current noise, and even object+sky signal. You should even be able to extrapolate, fairly easily, without actually running the numbers yourself...that, well...that's an exercise for the reader. 

 

When I didn't understand this, I re-read your blog until I did. So I thank you for such a great blog. 

 

Just so it's clear, I'm not that concerned with adding noise because of calibration... that's mostly a side effect of debating the nuances of SNR and calibration noise, as I can't see ANY additional noise in my dark-calibrated lights, just as I don't see any different noise in my non-dark-calibrated lights... it's neither here nor there as far as I can visually tell, but I agree it's "technically in there", for lack of better words.

 

Please keep in mind, I look up to what you have done, Jon. In my head, debating this is like having time with a professor who knows why things were done the way they were done and teaches that very well... think of me as a student working on his own experiment.

 

In this experiment, if the data can speak for itself, why would you try to convince your student he's wrong just because, with prior iterations of different tech, it was absolutely necessary and its necessity painfully obvious in every way, visually and mathematically?



#350 Jon Rista

Jon Rista

    ISS

  • *****
  • Posts: 25,370
  • Joined: 10 Jan 2014
  • Loc: Colorado

Posted 28 December 2020 - 02:12 PM

See my modified post above... somehow I knew you were going to twist my argument again. 

 

I am not ‘hyper-specializing’. This thread is about the 533 and 2600 sensors. The discussion is about dark frames as it would be silly to suggest the camera can somehow ‘fix’ the optical system and eliminate the need for flats.

 

And most imagers have typical skies like mine not some hypothetical perfect dark skies you keep assuming.

My answers apply to the 533 and 2600. I have also not assumed any specific context...one of my previous replies explicitly covered light polluted skies and situations where DFPN can be a big problem.

 

You keep twisting my arguments to make them seem irrelevant, when they apply to the two cameras you've mentioned (the ones the OP said this MAY only apply to), and to a thread where the OP also explicitly asked whether we should take calibration frames at all.

 

You've narrowed the context. You keep narrowing the context. My answers apply to the 533 and 2600 as much as to any other camera. My generalizations are not invalid simply because they are generalizations...they STILL APPLY, even to those two cameras. 

 

Finally, "most imagers" as a collective use a very wide variety of equipment. The 533 may be OSC; however, you can still use multi-band narrow bandpass LP filters with it, which changes the whole "typical skies like yours" argument. And of course there will be a mono version of the 2600, and there are mono versions of other sensors, for which narrowband imaging under "typical skies" will likely be a, if not the, common use case, etc.

 

You keep narrowing the context. Stop that. My answers apply to these sensors as much as to any other. 


  • psandelle likes this

