
narrow band heart nebula processing

16 replies to this topic

#1 kg7

kg7

    Vostok 1

  • *****
  • topic starter
  • Posts: 135
  • Joined: 22 Oct 2022
  • Loc: Washington State

Posted 05 December 2024 - 07:35 PM

Santa came early and delivered my f/2 version of the l-extreme narrow band filter, so I spent a night under B3 skies trying it out.  This is my first time using a narrow band filter, and I've heard that people do some strange pixel math to extract hubble-palette type colors from OSC cameras but I'm not sure what I'm missing.  I didn't do any extractions at all, just played with the color balances a little in gimp, and the results seem pretty comparable?  Can I do better with a different process?

 

120 minutes of 10 minutes subs

C8 hyperstar and ASI2600mc

l-extreme f/2 filter

processed with free software only: siril, graxpert, gimp, seti astro suite

 

Looking at it with a critical eye, I think I've overexposed some of the brighter parts and probably pushed the sharpening too far.  I'm also still really struggling with processing the stars in a way that I like.  Any other suggestions for improvement?

 

gallery_428272_24706_4327913.jpg


Edited by kg7, 05 December 2024 - 09:19 PM.

  • Sean McDaniel, gdi, zveck and 4 others like this

#2 idclimber

idclimber

    Cosmos

  • *****
  • Posts: 7,761
  • Joined: 08 Apr 2016
  • Loc: McCall Idaho

Posted 05 December 2024 - 08:22 PM

600" at f/2 sounds long. Did you back off the gain to 0 or lower?


  • dswtan and Jim Waters like this

#3 kg7

kg7

    Vostok 1

  • *****
  • topic starter
  • Posts: 135
  • Joined: 22 Oct 2022
  • Loc: Washington State

Posted 05 December 2024 - 09:26 PM

600" at f/2 sounds long. Did you back off the gain to 0 or lower?

I think it's less long when using a narrow band filter with a 7nm band pass.

 

I use sharpcap's smart-histogram feature, which takes a series of test exposures at various lengths and gains and then recommends capture settings that match your chosen goals (usually maximizing dynamic range).  In this case it wanted to go longer than 10 minutes based on the very dark skies, but I had 10 minutes set as my maximum allowable exposure length.  The gain was actually pretty high, about 300, and that put the histogram in the left half of the range.

 

I did two other targets after this one and sharpcap recommended 10-minute exposures for both of those as well (though with slightly lower gain settings).  It normally recommends captures of ~30s when I'm shooting without filters from dark skies, and more like 4s when shooting unfiltered from my B7/8 home.  I did have a handful of saturated stars in this image, but most of them were fine.



#4 psienide

psienide

    Viking 1

  • -----
  • Posts: 681
  • Joined: 06 Feb 2023
  • Loc: Frisco, TX

Posted 05 December 2024 - 09:40 PM

I think it looks good considering the process you described. Extracting the channels will give you more control over the results. If it's B3 data, I would be giving it best effort considering your home situation.
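For what it's worth, here is a rough sketch of the kind of channel extraction people mean, in Python with numpy/astropy (the filenames are hypothetical, and it assumes the stacked OSC image is saved as a 3-plane RGB FITS). With a dual-band filter like the L-eXtreme, Ha falls almost entirely in the red channel and OIII in green and blue, so the extraction itself is trivial; the interesting part is how you recombine the two into a palette.

```python
# Hedged sketch of Ha/OIII extraction from an OSC dual-band stack.
# Filenames are hypothetical; assumes a debayered RGB stack stored as a
# 3-plane FITS with channels first (adjust indexing if yours is H x W x 3).
import numpy as np
from astropy.io import fits

rgb = fits.getdata("heart_stack_rgb.fits").astype(np.float64)
r, g, b = rgb[0], rgb[1], rgb[2]

ha = r                   # Ha (656 nm) lands in the red channel
oiii = 0.5 * (g + b)     # OIII (501 nm) is split between green and blue

def norm(x):
    # Simple percentile normalization to 0..1 before recombining.
    lo, hi = np.percentile(x, [0.1, 99.9])
    return np.clip((x - lo) / (hi - lo), 0.0, 1.0)

ha_n, oiii_n = norm(ha), norm(oiii)

# HOO palette: Ha -> red, OIII -> green and blue.
hoo = np.stack([ha_n, oiii_n, oiii_n])

# A faux-SHO look: synthesize the green channel from a Ha/OIII blend
# (the 0.6/0.4 weights are arbitrary starting points, not a standard).
sho_ish = np.stack([ha_n, 0.6 * ha_n + 0.4 * oiii_n, oiii_n])

fits.writeto("heart_hoo.fits", hoo.astype(np.float32), overwrite=True)
fits.writeto("heart_sho_ish.fits", sho_ish.astype(np.float32), overwrite=True)
```

Siril's own pixel math can do the same thing; the point is just that once Ha and OIII exist as separate images, you control the color mapping instead of relying on the camera's white balance.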


  • zveck likes this

#5 idclimber

idclimber

    Cosmos

  • *****
  • Posts: 7,761
  • Joined: 08 Apr 2016
  • Loc: McCall Idaho

Posted 05 December 2024 - 10:03 PM

I think it's less long when using a narrow band filter with a 7nm band pass.

 

I use sharpcap's smart-histogram feature, which takes a series of test exposures at various lengths and gains and then recommends capture settings that match your chosen goals (usually maximizing dynamic range).  In this case it wanted to go longer than 10 minutes based on the very dark skies, but I had 10 minutes set as my maximum allowable exposure length.  The gain was actually pretty high, about 300, and that put the histogram in the left half of the range.

 

I did two other targets after this one and sharpcap recommended 10-minute exposures for both of those as well (though with slightly lower gain settings).  It normally recommends captures of ~30s when I'm shooting without filters from dark skies, and more like 4s when shooting unfiltered from my B7/8 home.  I did have a handful of saturated stars in this image, but most of them were fine.

Using gain 300 is a mistake. There is no advantage.

 

I am happy to inspect a sub exposure, or show you how to do so using PixInsight. The main thing we look at is how close the brighter areas are to max ADU. 

 

FYI my skies are darker (Bortle 2) and I shoot 300 or 600" with an 8nm filter at f/6. I typically use gain 100 but have also used gain 0. 


Edited by idclimber, 05 December 2024 - 11:08 PM.


#6 fewayne

fewayne

    Gemini

  • *****
  • Posts: 3,067
  • Joined: 10 Sep 2017
  • Loc: Madison, WI, USA

Posted 05 December 2024 - 10:59 PM

Interesting that SharpCap would recommend those long exposures. But yeah, gain that high on a ZWO camera (others use different scales in the software) gives up a ton of dynamic range for no real, erm, gain. 600" at f/2 should be ample to crush read noise. In B5 I'm using 600" at 173 gain with a 7nm filter at f/8.



#7 kg7

kg7

    Vostok 1

  • *****
  • topic starter
  • Posts: 135
  • Joined: 22 Oct 2022
  • Loc: Washington State

Posted 06 December 2024 - 01:11 AM

 

I am happy to inspect a sub exposure, or show you how to do so using PixInsight. The main thing we look at is how close the brighter areas are to max ADU. 

I'm not a PI user but I've become fairly adept with Siril.  What values should I be comparing to?  How do I determine what max ADU should be? 

 

I can mouseover my images in Siril and get a readout for x,y, and a value that I'm assuming is pixel intensity just because they're bigger numbers on brighter parts of the image.  They top out at around 65k, but that number doesn't really mean anything to me yet.

 

From Siril's stat tool on my first sub, not that I know what these numbers are supposed to be:

mean: 6523.9

median: 4788.0

sigma 5806.2

avgDev: 2929.6

MAD: 1436

sqrt(BWMV): 2391.0

min: 0.0

max: 65535

 

And here's that sub's histogram from sharpcap.

 

The 2600mc spec sheet suggests that at 300 gain full well depth is only 11k, if I'm reading it correctly.  In which case, the 65k number confuses me.

 

So I will take your advice and use shorter 1-2 minute subexposures next time I try this filter.  In the meantime, I still have these 2 hours of data with the longer subs and they did generate a picture.  What's the impact of running the subs for too long?  I'm generally pretty happy with my first attempt at using an NB filter, but if I can do even better with shorter subs I'll be happy with that too.



#8 kg7

kg7

    Vostok 1

  • *****
  • topic starter
  • Posts: 135
  • Joined: 22 Oct 2022
  • Loc: Washington State

Posted 06 December 2024 - 01:29 AM

Interesting that SharpCap would recommend those long exposures. But yeah, gain that high on a ZWO camera (others use different scales in the software) gives up a ton of dynamic range for no real, erm, gain. 600" at f/2 should be ample to crush read noise. In B5 I'm using 600" at 173 gain with a 7nm filter at f/8.

I suspect that it's because sharpcap is analyzing the darkest part of the FOV for noise ratios, and not considering that the bright areas might saturate at the settings that are optimal for faint object detection.  Seems like it would be an easy software fix.  Or I could just learn how to use my gear correctly.



#9 EvilGarfield

EvilGarfield

    Lift Off

  • -----
  • Posts: 8
  • Joined: 08 Nov 2024

Posted 06 December 2024 - 01:30 AM

Gain does not influence the highest possible pixel value displayed in the image. It determines how much light one needs to fill the well (pixel) and reach saturation.
You can find an explanation here: https://clarkvision.com/articles/iso/
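To put rough numbers on that (the full-well figures below are rounded assumptions for an ASI2600-class sensor, with the 11k value taken from the spec-sheet reading above, not official values):

```python
# Why the displayed maximum stays at 65,535 at any gain: gain changes how many
# electrons it takes to saturate a pixel, not the top of the output scale.
ADU_MAX = 65535  # 16-bit output range, fixed regardless of gain setting

# Approximate full-well capacity in electrons at a few gain settings
# (rounded assumptions; 11k at gain 300 is the spec-sheet reading above).
full_well_by_gain = {0: 50000, 100: 20000, 300: 11000}

for gain, full_well_e in sorted(full_well_by_gain.items()):
    e_per_adu = full_well_e / ADU_MAX   # conversion factor shrinks as gain rises
    mid_scale_e = 32768 * e_per_adu     # electrons held by a mid-scale pixel
    print(f"gain {gain:>3}: saturates at ~{full_well_e:,} e-, reported as {ADU_MAX} ADU; "
          f"a pixel reading 32768 ADU holds only ~{mid_scale_e:,.0f} e-")
```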

#10 rj144

rj144

    Fly Me to the Moon

  • -----
  • Posts: 6,006
  • Joined: 31 Oct 2020

Posted 06 December 2024 - 02:03 AM

I'm not a PI user but I've become fairly adept with Siril.  What values should I be comparing to?  How do I determine what max ADU should be? 

 

I can mouseover my images in Siril and get a readout for x,y, and a value that I'm assuming is pixel intensity just because they're bigger numbers on brighter parts of the image.  They top out at around 65k, but that number doesn't really mean anything to me yet.

 

From Siril's stat tool on my first sub, not that I know what these numbers are supposed to be:

mean: 6523.9

median: 4788.0

sigma 5806.2

avgDev: 2929.6

MAD: 1436

sqrt(BWMV): 2391.0

min: 0.0

max: 65535

 

And here's that sub's histogram from sharpcap.

 

The 2600mc spec sheet suggests that at 300 gain full well depth is only 11k, if I'm reading it correctly.  In which case, the 65k number confuses me.

 

So I will take your advice and use shorter 1-2 minute subexposures next time I try this filter.  In the meantime, I still have these 2 hours of data with the longer subs and they did generate a picture.  What's the impact of running the subs for too long?  I'm generally pretty happy with my first attempt at using an NB filter, but if I can do even better with shorter subs I'll be happy with that too.

65,535 is just the top of the 16-bit scale (2^16 - 1); it is the output range, not the well depth.



#11 idclimber

idclimber

    Cosmos

  • *****
  • Posts: 7,761
  • Joined: 08 Apr 2016
  • Loc: McCall Idaho

Posted 06 December 2024 - 01:20 PM

I'm not a PI user but I've become fairly adept with Siril.  What values should I be comparing to?  How do I determine what max ADU should be? 

 

I can mouseover my images in Siril and get a readout for x,y, and a value that I'm assuming is pixel intensity just because they're bigger numbers on brighter parts of the image.  They top out at around 65k, but that number doesn't really mean anything to me yet.

 

From Siril's stat tool on my first sub, not that I know what these numbers are supposed to be:

mean: 6523.9

median: 4788.0

sigma 5806.2

avgDev: 2929.6

MAD: 1436

sqrt(BWMV): 2391.0

min: 0.0

max: 65535

 

And here's that sub's histogram from sharpcap.

 

The 2600mc spec sheet suggests that at 300 gain full well depth is only 11k, if I'm reading it correctly.  In which case, the 65k number confuses me.

 

So I will take your advice and use shorter 1-2 minute subexposures next time I try this filter.  In the meantime, I still have these 2 hours of data with the longer subs and they did generate a picture.  What's the impact of running the subs for too long?  I'm generally pretty happy with my first attempt at using an NB filter, but if I can do even better with shorter subs I'll be happy with that too.

I will first give you the short answers: set the gain at either 0 or 100 on this camera. If you can set the offset, pick 30 to 50. For temperature, select one that is lower than your expected low temperatures during your sessions, but do not go too low or the cooler runs at over about 70-80% and you lose thermal regulation. Lower temps are not better. 

 

Next let's get into the weeds. 

 

You have a 16-bit sensor, and each pixel can record a value from 0 to 65,535. In the stats report you posted, you have maxed pixels and a sub mean of around 6,500 ADU. Let's first compare that to your bias or dark. 

 

First it is helpful to note what the offset is set at. Open a bias or dark in Siril and note the mean. If it is about 503, then the offset on the camera is 50; if it is about 703, the offset was set at 70. The actual signal recorded by the camera in a dark or bias is in the low single digits of ADU. The offset is simply a value added so the data stays away from zero. Zero is bad, and so is max ADU, which is 65,535.

 

An offset of 50 on this camera adds 10 times that amount (about 500 ADU) to the value before it is saved on your computer. 

 

I encourage you to follow along in Siril and do this with some of your lights and your calibration frames. Note the mean values in ADU. You can inspect single pixels if you zoom in. Notice the variability on the bias/dark: how many ADU is that variability? Now look at a master dark or bias and notice the reduction in that variability. Stacking averages the values, so the variability goes down. 
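If you would rather script that comparison than mouse around, here is a small sketch of the same checks (Python with astropy; the filenames are hypothetical, and offset 50 giving a ~500 ADU bias is just the example from above):

```python
# Sketch of the bias/offset/variability checks described above.
import numpy as np
from astropy.io import fits

single_bias = fits.getdata("bias_0001.fits").astype(np.float64)
master_bias = fits.getdata("master_bias.fits").astype(np.float64)
light_sub   = fits.getdata("heart_sub_0001.fits").astype(np.float64)

# The bias mean reveals the offset: ~503 ADU implies the camera offset is 50.
print(f"single bias: mean {single_bias.mean():7.1f}  std {single_bias.std():6.1f}")
# The master bias should show the same mean but much lower variability,
# because stacking averages the read noise down.
print(f"master bias: mean {master_bias.mean():7.1f}  std {master_bias.std():6.1f}")

# What matters for exposure is how far the light's background sits above that.
background_above_bias = np.median(light_sub) - master_bias.mean()
print(f"light sub background above bias: ~{background_above_bias:.0f} ADU")
```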

 

We have two competing goals with an exposure. On the low end, we want to expose long enough that the background of our image is above the read noise. If the bias is 500 ADU, we need it above that; how much, I will get into in a subsequent post. 

 

On the other end are clipped high pixels. Once pixels are at that max we lose all color information. This is fine for the centers of very bright stars and can't be avoided. What we don't want is detail in the brighter areas of the image near that max. Blown-out white looks bad, so we monitor how high those areas are, or how many clipped pixels we have in the image. 

 

In the next post I will go into both goals in more detail. 


Edited by idclimber, 06 December 2024 - 01:22 PM.


#12 idclimber

idclimber

    Cosmos

  • *****
  • Posts: 7,761
  • Joined: 08 Apr 2016
  • Loc: McCall Idaho

Posted 06 December 2024 - 01:35 PM

First I want to address the high end. It is closer to your primary issue of overexposure. 

 

My first suggestion is to look at your subframe or integration in Siril and note the value of the brighter areas of the nebula you mentioned you were having trouble with. How close in ADU are those areas to the max of 65,535? Are any of those areas at the max?

 

Here is the easy way to look for clipped areas in an image. Look at the sub in linear format, unstretched. It should look black apart from a handful of brighter stars. Any detail that shows in linear view will be very nearly overexposed when stretched, and will lack color, contrast and detail. 

 

Another common method is to count the number of maxed-out pixels and keep it below about 500 to 1000. As far as I know, only NINA and Voyager give you that statistic at image capture; otherwise we have to resort to a pixel math expression, which is helpful, but not handy at capture time. 
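For the record, the after-the-fact count is only a few lines of script as well (hypothetical filename; Siril's statistics or a PixelMath expression would give the same number):

```python
# Count clipped (near-max) pixels in a raw sub.
import numpy as np
from astropy.io import fits

sub = fits.getdata("heart_sub_0001.fits")
clip_level = 65500   # treat anything this close to 65,535 as saturated

n_clipped = int(np.count_nonzero(sub >= clip_level))
print(f"{n_clipped} clipped pixels ({100.0 * n_clipped / sub.size:.3f}% of the frame)")
# Rule of thumb from this post: keep this below roughly 500 to 1000 pixels.
```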

 

Here is a link to my IC 1805 (Heart Nebula) image taken for the monthly challenge back in 2022. This was with my SVX102 refractor and my ASI2600mm camera. Note my exposures for it were only 300", and this was with a gain of 100 and a focal ratio of f/5. 

 

 


 

 


Edited by idclimber, 06 December 2024 - 01:40 PM.


#13 idclimber

idclimber

    Cosmos

  • *****
  • Posts: 7,761
  • Joined: 08 Apr 2016
  • Loc: McCall Idaho

Posted 06 December 2024 - 02:08 PM

Now I will address the low end of exposures. I strongly recommend you watch the following presentation by Dr Robin Glover. It is considered one of the primary sources around here on exposures with CMOS cameras. The basic conclusion is we do not need long exposures with these modern cameras. 

 

https://www.youtube....93UvP358&t=796s

 

Next I direct you to the August monthly challenge processing thread: https://www.cloudyni...8208129=&page=5 This is probably the dimmest target ever selected for this challenge. Most imagers struggled with this target; many gave up. It took dark skies, a mono camera and a ton of exposures to show this detail. Here is a link to my finished image. 

 


 

Now let's compare that to a raw OIII sub exposure that shows the source of the blue part of this nebula. This is stretched about as far as I could to show the detail. It is flipped 180 deg from the finished image, but the bright star is the same as the one in the center of the squid above. The offset on my camera was 30, so the background levels are just above that. The mean of this sub is 364 ADU, and the brighter pixels in the faint blue area are around 400 ADU. 

 

Screenshot 2024-12-06 at 11.59.02 AM.jpg

 

To conclude, my 600" sub exposure has levels of only 50 to about 100 ADU above the background levels of my matching Bias/Dark frame. 

 



#14 idclimber

idclimber

    Cosmos

  • *****
  • Posts: 7,761
  • Joined: 08 Apr 2016
  • Loc: McCall Idaho

Posted 06 December 2024 - 02:22 PM

I will attempt to quantify the above. If you dig around this forum you will find a number of posts that discuss "swamping the read noise". You will also find a formula for that, expressed as follows:

 

We want to expose long enough to raise the background by somewhere between 3x and 10x the read noise squared. We can look up the read noise on this camera; it does depend on gain, but at gain 0 or 100 it is in the low single digits of electrons. The result is around 50 ADU to about 150. We do need to account for the offset, which is easily done by looking at the mean of a bias or dark. 

 

If our Bias has a mean around 500, our target is 50 to 150 ADU above that. Once you achieve the higher end of that range you reach a point of diminishing returns related to increasing SNR. 

 

There is unfortunately a practical problem with this approach. Note my 600" exposure above is only about 50 ADU above the bias level. Even if I shot 1200 seconds it would not raise the background much. In Bortle 2 skies when the moon is down, there is very little sky background to raise it. 

 

An imager in an urban area with a Hyperstar scope typically needs only 15 or 30 seconds to raise the background that much. This creates its own issue, in that stacking 1,000 subs can take hours if not days. 

 

The common solution is to use the lower gain of 0 with a Hyperstar scope at f/2, then monitor clipping instead and not worry about the low end. 


  • fewayne likes this

#15 fewayne

fewayne

    Gemini

  • *****
  • Posts: 3,067
  • Joined: 10 Sep 2017
  • Loc: Madison, WI, USA

Posted 06 December 2024 - 02:35 PM

65,535 is just the top of the 16-bit scale (2^16 - 1); it is the output range, not the well depth.

Although TBC this is just what the software maps your camera's output to. For example, a 12-bit ADC can only produce values from 0-4095, but each value gets multiplied by 16 to fill the full 0-65535 range. So the actual values you might see go 0, 16, 32, etc., when the ADC produces 0, 1, and 2.
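One quick way to see which case your data falls in is to look at the spacing between distinct raw pixel values (a small sketch with a hypothetical filename):

```python
# If a 12-bit ADC is being scaled into the 16-bit container, raw values will
# all be multiples of 16; a true 16-bit output (as on the ASI2600) can step by 1.
import numpy as np
from astropy.io import fits

sub = fits.getdata("sub_0001.fits").astype(np.int64)
steps = np.diff(np.unique(sub))
print("smallest step between distinct pixel values:", int(steps.min()))
```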


Edited by fewayne, 06 December 2024 - 02:38 PM.


#16 kg7

kg7

    Vostok 1

  • *****
  • topic starter
  • Posts: 135
  • Joined: 22 Oct 2022
  • Loc: Washington State

Posted 06 December 2024 - 07:24 PM

This has all been helpful, and I clearly have some more reading/youtubing to do.

 

We can look up the read noise on this camera; it does depend on gain, but at gain 0 or 100 it is in the low single digits of electrons. The result is around 50 ADU to about 150. We do need to account for the offset, which is easily done by looking at the mean of a bias or dark. 

This particular sentence doesn't quite make sense yet.  If the read noise of the 2600 at 100 gain is only 1.5, and we want 3x-10x this value squared, isn't that more like 7 to 23 ADU above background?  Where did the 50-150 come from?  Were you just rounding, or am I misunderstanding how the math works?

 

I can certainly solve this problem experimentally by analyzing subs of various lengths for a given scope setup.  But in general it sounds like I should target having the nebulosity regions to be somewhere under 1000 ADU, without clipping too many star pixels at 65k ADU.  Do I need ALL of my nebulosity to be that low, or just the faintest parts I'm trying to capture?

 

And I'll go rewatch the Robin Glover video, which I have definitely seen before.  Unfortunately, my take-home message from my first viewing was "Robin Glover definitely knows more about this topic than I do, so I can probably trust his software's recommendation for exposure and gain settings."  In this case, that appears to have been a bad decision.  I'm sure he'd say I'm just using it wrong.  I suspect that sharpcap's 600s+ recommendation was due to the practical problem you described, trying to raise background levels when shooting from a dark sky site with no moon.  

 

My Heart Nebula subs definitely have some ADU values at or near max in the brightest part of the starless image.  For the purpose of reprocessing this particular data set, what can I do to improve the image?  Would it be a waste to add another two hours of 60s subs and stack them all together to bring down the average for the saturated pixels?


Edited by kg7, 06 December 2024 - 07:29 PM.


#17 idclimber

idclimber

    Cosmos

  • *****
  • Posts: 7,761
  • Joined: 08 Apr 2016
  • Loc: McCall Idaho

Posted 06 December 2024 - 08:41 PM

This has all been helpful, and I clearly have some more reading/youtubing to do.

 

This particular sentence doesn't quite make sense yet.  If the read noise of the 2600 at 100 gain is only 1.5, and we want 3x-10x this value squared, isn't that more like 7 to 23 ADU above background?  Where did the 50-150 come from?  Were you just rounding, or am I misunderstanding how the math works?

 

I can certainly solve this problem experimentally by analyzing subs of various lengths for a given scope setup.  But in general it sounds like I should target having the nebulosity regions to be somewhere under 1000 ADU, without clipping too many star pixels at 65k ADU.  Do I need ALL of my nebulosity to be that low, or just the faintest parts I'm trying to capture?

 

And I'll go rewatch the Robin Glover video, which I have definitely seen before.  Unfortunately, my take-home message from my first viewing was "Robin Glover definitely knows more about this topic than I do, so I can probably trust his software's recommendation for exposure and gain settings."  In this case, that appears to have been a bad decision.  I'm sure he'd say I'm just using it wrong.  I suspect that sharpcap's 600s+ recommendation was due to the practical problem you described, trying to raise background levels when shooting from a dark sky site with no moon.  

 

My Heart Nebula subs definitely have some ADU values at or near max in the brightest part of the starless image.  For the purpose of reprocessing this particular data set, what can I do to improve the image?  Would it be a waste to add another two hours of 60s subs and stack them all together to bring down the average for the saturated pixels?

Yes, I am doing this from memory, and those numbers are rounded. Somewhere on my hard drive is the spreadsheet I used to calculate those values, and the results have been verified by others here. As I recall, the actual formula also has to account for e-/ADU, which is not one. I got the formula from an old thread and post by Jon Rista. 
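Roughly, here is how the e-/ADU factor bridges the gap between the 7-23 figure and the 50-150 ballpark, using assumed round numbers for an ASI2600 at gain 100 (read noise ~1.5 e-, ~0.25 e-/ADU; neither is an exact spec value):

```python
# The swamping target is stated in electrons; converting to ADU requires the
# e-/ADU factor, which is well below 1 at gain 100. Numbers are assumptions.
read_noise_e = 1.5    # approx. read noise at gain 100, electrons
e_per_adu    = 0.25   # approx. conversion factor at gain 100
offset_adu   = 500    # bias mean for a camera offset of 50

for k in (3, 10):
    target_e   = k * read_noise_e ** 2   # required sky signal, in electrons
    target_adu = target_e / e_per_adu    # the same signal expressed in ADU
    print(f"{k}x: ~{target_e:4.1f} e-  ->  ~{target_adu:3.0f} ADU above offset "
          f"(background mean around {offset_adu + target_adu:.0f} ADU)")
```

With these particular assumptions that works out to roughly 25-90 ADU above the offset, the same order as the 50-150 quoted from memory; the exact figure shifts with the read noise and e-/ADU you plug in.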

 

Yes, the issue is that those exposure targets are not reasonable for very dark skies and can lead to overexposure. I look at the mean and the background levels when analyzing. 

 

Keep in mind this is not a cliff, where lower than 3x gets you nothing. You simply lose some SNR. The Glover video demonstrates that. 

 

1000 ADU is still above that minimum unless you have an unusually high offset. The default ZWO offset on the ASIAir is typically 50, which results in a bias/dark around 500. With that offset, a mean of 600 is plenty. 

 

Also keep in mind this does not tell you the ideal exposure; there is a range of nearly equivalent values. It is certainly possible that you could image at 30, 60 and 120" and get the same results, assuming the total integration time is the same. 

 

The difference is storage space. Longer subs give you more efficient storage since there are fewer files. This is another reason many of us monitor the high end instead. The last two nights have been really clear, and I was finally able to image the California Nebula, which has always eluded me. I guessed and set up my system for 600" subs and monitored the first one to come in. As I recall the number of clipped pixels was around 400, which is well under my tolerance, so I let the session run without change. 

 

But based on that value I would have been very comfortable extending the exposure to 1200", not that I needed to. My background was low, with a mean of only 340 on my OIII filter; it was a bit higher on the Ha. Very close to the 3x target I have in my head. 

 

So why not increase? Because I only had two forecast nights and the second was questionable. If I image two filters (Ha and OIII) over about 10 hrs, I would not have enough subs for good rejection. I also know from experience that even if I did double the exposure, that mean ADU would not rise much.

 

As to what you do with this data: learn from it and move on. There are methods to combine it with new data, but honestly that is an advanced skill. One good night with better exposures and you will have all the data you need to create a very nice image. If you struggle with that part, I and others here will be more than happy to assist. I have done this countless times over the last couple of years.



