
Artifacts of noise reduction?


#1 acrh2

acrh2

    Ranger 4

  • -----
  • topic starter
  • Posts: 374
  • Joined: 16 Mar 2021

Posted 27 July 2021 - 12:45 PM

Hi.

 

So I was messing around while the moon was out, not really hoping for anything spectacular. Plus the air quality was really bad with all of the wildfires.

 

I shot the Crescent nebula with my Apex Mak @ 1400mm f/11. L-eNhance filter. Bortle 6 sky.

I used an ASI533MC Pro at 200 gain (unity gain is 100), about 120 x 2 min of integration, guided with 0.6" RMS error.

 

I think there was just too much noise from light pollution, the Moon, and bad seeing. So when I used noise reduction in StarTools, it introduced this clumpy grain effect in the darkest parts around the nebula. Clearly there's supposed to be red hydrogen gas all around that area, but I think the noise reduction made it unnaturally clumpy because it was already drowning in all of the noise. You can see it better if you up the gamma in the image or on the monitor.

51339334063_b720130413.jpg

https://live.staticf...37b6fb4bd_o.jpg

 

Is there a way to smooth out this clumpy crap?

Also, should I continue to add more integration time to this, or is it a lost cause?

 

Thanks.

 

 


  • Mike in Rancho likes this

#2 idclimber

idclimber

    Surveyor 1

  • *****
  • Posts: 1,838
  • Joined: 08 Apr 2016
  • Loc: McCall Idaho

Posted 27 July 2021 - 01:00 PM

A lot depends on what program you are using to reduce noise. In PixInsight I use a luminance mask to protect the brighter areas. I am, however, working on just the lum channel when doing this with LRGB images. I believe some OSC imagers will create a synthetic lum channel to work on.



#3 rj144

rj144

    Viking 1

  • -----
  • Posts: 827
  • Joined: 31 Oct 2020

Posted 27 July 2021 - 01:29 PM

What pixel size did you use in Startools?  Make it smaller.

Did you use Superstructure?

Do you use any other programs to process post Startools?  A quick way to get rid of it now is to use another photo editor and select the nebula, do an inverse selection, and desaturate the red a lot.  


Edited by rj144, 27 July 2021 - 01:32 PM.

  • GoldSpider likes this

#4 Oyaji

Oyaji

    Ranger 4

  • *****
  • Posts: 347
  • Joined: 05 Sep 2018
  • Loc: Central Illinois, USA

Posted 27 July 2021 - 01:30 PM

Why don't you download a free trial copy of Astro Pixel Processor and try it on for size, starting from scratch with your subs? It just might fix your problem--its light pollution removal is particularly good.

I could never get the hang of Startools processing, although I do use it sometimes (after APP) to fix issues that APP doesn't seem to fix very well, such as tightening stars and taming highlights.

Edited by Oyaji, 27 July 2021 - 01:36 PM.


#5 pedxing

pedxing

    Vanguard

  • *****
  • Posts: 2,483
  • Joined: 03 Nov 2009
  • Loc: SE Alaska

Posted 27 July 2021 - 01:31 PM

+1 for using masks when doing noise reduction.

 

Different areas need different amounts of reduction; a mask is a way to achieve that.

 

Additionally, you can use a partial mask over the whole image to moderate the noise reduction intensity. This gives you more control over some of the more aggressive noise reduction algorithms.

 

Not sure how you do that with StarTools, but I've found these approaches to be very effective in PI.
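[Editor's note: a rough sketch of the masking idea above in Python with NumPy/SciPy; the synthetic image and the Gaussian blur standing in for a real denoiser are assumptions for illustration, not any particular program's implementation.]

```python
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(3)

# A noisy synthetic "image" whose brightness ramps from dim to bright.
image = rng.poisson(np.linspace(20, 200, 256 * 256).reshape(256, 256)).astype(float)

# Stand-in for a real denoiser: a simple Gaussian blur.
denoised = gaussian_filter(image, sigma=2)

# Luminance mask: bright pixels keep the original (detail protected),
# dim pixels lean on the denoised version. Scaling the mask down
# (e.g. by 0.5) gives the "partial mask" variant that moderates the
# noise reduction strength everywhere.
mask = image / image.max()
result = mask * image + (1 - mask) * denoised
```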



#6 rj144

rj144

    Viking 1

  • -----
  • Posts: 827
  • Joined: 31 Oct 2020

Posted 27 July 2021 - 01:41 PM

Here is a VERY quick fix:

 

51339334063-c37b6fb4bd-o.jpg

 

Just selected everything but the nebula and turned down the saturation on the color in those blobs.

 

If processing from scratch, I don't let StarTools do a lot of global noise reduction. After StarTools, I use Affinity and Starnet (for nebulae) to isolate the background and stars from the nebula and process each differently. That helps a lot. Usually the noise is in the background, and less in the nebula.


Edited by rj144, 27 July 2021 - 01:42 PM.

  • brlasy1 likes this

#7 Mike in Rancho

Mike in Rancho

    Apollo

  • -----
  • Posts: 1,411
  • Joined: 15 Oct 2020
  • Loc: Alta Loma, CA

Posted 27 July 2021 - 03:45 PM

There are ways to squash it down or out, and yes selective manipulation can be done even in ST.  I like to avoid it, but sometimes, break glass in case of emergency, right?

 

I deal with it a fair amount too, as I often image at f/9.  And my skies are brighter, but you had smoke.  More integration helps to fill in the holes and, ultimately, smooth out fainter nebulous areas.

 

Unless something is drastically wrong, nothing wrong with adding more time.  Are you getting sufficient dithers, random and not spiral?

 

If you post the stack we can take a stab at things.

 

There are ways of trying to keep that in check, however:

- Don't over-Wipe if you don't have to.
- Don't overstretch beyond what this level of data can handle. Sure, the target will brighten up, but so will all the junk. Reduce the dynamic range allocated to stuff in the shadows.
- Bin the image down to the scale the data can support, thereby also recovering some SNR. The 533 probably won't be as oversampled as a DSLR, but you can still probably recover some.
- Don't overdo HDR, Sharpening, or Deconvolution if it is brightening or creating further separation and blotchiness in those fainter areas.
- Try different presets and settings in SS, and in the final denoise, again with a mind to preventing clumpiness. You can also retain a certain level of grain, which sounds a little counterintuitive, but may result in a better overall image.
- Finally, you can take the image into FilmDev, where adjustments to gamma and skyglow (among others) can help reel in any left-side gap in the histogram, or make the histogram more vertical; the right balance should allow retention of your target while minimizing the unsightly areas.
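[Editor's note: the binning point can be sanity-checked numerically. A hypothetical NumPy sketch, assuming uncorrelated Gaussian noise on a flat background:]

```python
import numpy as np

rng = np.random.default_rng(4)

# Pure Gaussian noise around a flat background of 100 ADU.
image = rng.normal(100.0, 10.0, (512, 512))

# 2x2 software bin by averaging: each output pixel is the mean of a
# 2x2 block, so uncorrelated noise drops by sqrt(4) = 2x.
binned = image.reshape(256, 2, 256, 2).mean(axis=(1, 3))

print(round(image.std() / binned.std(), 1))  # close to 2.0
```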

 

All that said, if you are viewing on a monitor with brightness or gamma jacked to the moon (also often, mobile device displays), an awful lot of images will show junk in the background, even though they look fine and dandy on normal or calibrated monitors.  I've actually started making second versions of things for my phone's screen.


  • Ivo Jager, rj144 and acrh2 like this

#8 Ivo Jager

Ivo Jager

    Vendor ( Star Tools )

  • *****
  • Vendors
  • Posts: 522
  • Joined: 19 Mar 2011
  • Loc: Melbourne, Australia

Posted 27 July 2021 - 09:54 PM


A few things.

 

#1

 

On how (shot) noise comes about and how to deal with it;

Shot noise (ideally the only source of noise you should be concerned about) is caused by the randomness of a photon arriving and successfully being converted into an electron count. It is an inextricable part of your signal; signal and its noise component come part and parcel. It is important to note that shot noise is not added (e.g. there is no fixed level per pixel); rather, it depends on the incoming signal (e.g. if no signal comes in, no shot noise exists either).
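[Editor's note: the point that shot noise scales with the signal can be seen in a few lines of NumPy, using the Poisson sampler; this is an illustration, not StarTools code.]

```python
import numpy as np

rng = np.random.default_rng(0)

# Shot noise is Poisson-distributed: for a mean photon count N, the
# noise sigma is sqrt(N). No signal means no shot noise; more signal
# means more absolute noise but a better SNR (N / sqrt(N) = sqrt(N)).
for mean_flux in (0, 100, 10_000):
    samples = rng.poisson(mean_flux, size=200_000)
    print(mean_flux, round(samples.std(), 1), round(np.sqrt(mean_flux), 1))
```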

 

If you apply any operation to your signal, then the same operation is applied to the noise component as well. Stretch a pixel? The noise becomes stretched along with it. Perform local stretching (local dynamic range optimisation) or, worse, advanced stuff like deconvolution, and the noise component mutates in a completely irrecoverable, irreversible way.
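[Editor's note: the "operations transform the noise too" point, sketched in NumPy with a simple linear stretch standing in for real stretching; nonlinear or local stretches would distort the noise unevenly instead.]

```python
import numpy as np

rng = np.random.default_rng(1)

# Faint signal with shot noise, then a 4x linear "stretch".
noisy = rng.poisson(50, 100_000).astype(float)   # sigma ~ sqrt(50)
stretched = noisy * 4.0

# The noise component is scaled by exactly the same factor as the
# signal.
print(round(stretched.std() / noisy.std(), 6))   # exactly 4.0
```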

 

Why is the latter important? It means that using masks, selective processing or background "isolation" to deal with noise are kludges at best, bordering on the nonsensical (from a sound signal processing point of view) at worst. Some sort of blur filter throttled by a luminance mask (the mask mimicking the applied global stretch but little else) is indeed probably the best you can do in a traditional application like PI, PS, The GIMP, etc.

 

StarTools, OTOH, tracks the entire noise component re-shaping history across your image until the very end. At the end of your workflow, it knows precisely, per-pixel, how prevalent the noise is visually. Note that I say "visually", as this is based on the assumption that the image is viewed on a well-calibrated monitor.

 

Not using the noise reduction or tracked history to its fullest potential is essentially akin to not using 75% of StarTools' code and years of R&D. Similarly, using StarTools with data that has already been meddled with (color balanced, gradient removal, any sort of processing) will already put you on the back foot; noise component tracking needs to start at the earliest point in time; the time when the signal and its noise component are as unadulterated as possible. You should not need any noise reduction after StarTools. If you do (or think you do), I would love to have a look at your dataset, as that would constitute an anomaly worth investigating.

 

#2

 

Shot noise exists at all scales, its prevalence is just attenuated the larger the noise grain becomes. StarTools uses a wavelet-based noise reduction routine, meaning it allows you to throttle noise grain prevalence on different scales (e.g. different sizes of noise grain).

 

Attenuate just the small scale noise, and you are left with only the big stuff. This is what is happening in your current image. Attenuate the big stuff also, and you should be left with a clean image.
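[Editor's note: the multi-scale idea can be sketched with a toy difference-of-Gaussians decomposition in Python; StarTools' actual wavelet routine is of course more sophisticated, and the scales and attenuation factors here are arbitrary assumptions.]

```python
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(2)

# Pure-noise test image: Poisson shot noise around a mean of 100.
image = rng.poisson(100, (128, 128)).astype(float)

# Split into detail layers at increasing scales (fine grain first),
# plus a smooth residual.
sigmas = [1, 2, 4, 8]
layers, current = [], image
for s in sigmas:
    blurred = gaussian_filter(current, s)
    layers.append(current - blurred)   # detail + noise at this scale
    current = blurred

# Attenuate every scale, including the large ones; untouched big
# scales are exactly the "clumps" that remain visible.
keep = [0.2, 0.4, 0.6, 0.8]
denoised = current + sum(k * l for k, l in zip(keep, layers))

print(image.std() > denoised.std())
```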

 

There is a reason it says this in the Unified De-noise module;

 

Selection_643.jpg

 

If you see remaining large-scale grain ("clumps"), the Grain Size (and, in the subsequent screen, "Grain Dispersion") is set too low. You can also bump up the largest scale (Scale 5) to suppress more large-scale grain as a last resort.  Give the latest 1.8 alpha a try, which comes with enhancements to let you deal with large scale grain even better.

 

#3

 

The human eye is drawn to coherence and patterns. Leaving noise grain removal unbalanced across different scales will draw attention to the scale that shows grain the most. As Mike says, as a last resort, try reintroducing a little bit of grain (the "Equalized Grain" parameter) to draw the eye away from one particular scale.

 

Any issues, do let me know. I hope this helps, but please feel free to share the dataset with me if you think it might help!


Edited by Ivo Jager, 27 July 2021 - 09:55 PM.

  • happylimpet, Mike in Rancho and acrh2 like this

#9 acrh2

acrh2

    Ranger 4

  • -----
  • topic starter
  • Posts: 374
  • Joined: 16 Mar 2021

Posted 28 July 2021 - 09:53 AM


Thank you for a detailed answer.

Long story short, I think there may be more to these "artifacts" from noise reduction than I initially thought.

 

When I compare my image in this thread to my own image at f/4.8 (f/11, 200 gain, 2 min subs vs. f/4.8, 100 gain, 4 min subs), the latter being 4 times brighter,

https://live.staticf...8e8da6691_o.jpg

 

Or to a NASA APOD image

https://apod.nasa.go...dFinalimage.jpg

 

I can see the same clumps of gas. Less in f/4.8 and much less in the APOD image, but the seeds of clumps are there.

Perhaps that region of space has gas density variations.

I think the fact that the original image is very noisy exacerbates this clumpiness during noise reduction.

 

Still, since you are already here, I would love to pick your brain on Startools processing.

Maybe you can give me some pointers?

-------------------------------------

 

I think what I did was the following.

 

1) Photometric color calibration in Siril. Then Startools.

2) 2x2 Bin followed by Crop.

3) No Wipe (hence the Siril calibration.)

 

I am sorry, I just don't have a very good understanding of how Wipe works, or how it can even work on nebula photos, where there is signal from interstellar gas everywhere. I am a reasonably intelligent person (I am a PhD researcher), and I've read the manuals and guides, and I still have no intuitive understanding of what constitutes a good Wipe, or whether I wipe out some image detail when I use it.

</end rant>

 

4) AutoDev, with a small POI centered on the brightest 10% of the nebula. That part of the nebula gets blown out and loses detail if I choose a different POI.

5) Sharp followed by Shrink. Default settings.

6)  Then noise reduction. I think I used grain size of 10, brightness detail loss of 65 and scale correlation of 40.

7) StarNet++, and then I used Gimp to tweak the image brightness, colors, saturation etc. 

 

Do you see any immediate issues?

 

BTW, here's a link to the data set.

Open on an empty stomach.

https://drive.google...iew?usp=sharing

 

Thanks.



#10 GoldSpider

GoldSpider

    Apollo

  • *****
  • Posts: 1,213
  • Joined: 20 Apr 2015
  • Loc: Pennsylvania

Posted 28 July 2021 - 10:49 AM

What pixel size did you use in Startools?  Make it smaller.

Did you use Superstructure?

Do you use any other programs to process post Startools?  A quick way to get rid of it now is to use another photo editor and select the nebula, do an inverse selection, and desaturate the red a lot.  

This.  I only use 2 or 1.5 as my grain size in the noise reduction step for exactly that reason.  Any higher and it looks "clumpy" to me.



#11 Ivo Jager

Ivo Jager

    Vendor ( Star Tools )

  • *****
  • Vendors
  • Posts: 522
  • Joined: 19 Mar 2011
  • Loc: Melbourne, Australia

Posted 28 July 2021 - 08:43 PM

Thank you for sharing your dataset. It is very helpful.

The clumpiness is a real feature of your dataset. The unevenness is there from the very beginning. An initial AutoDev (straight after loading) shows this;

 

crescent_ad.jpg

 

I think the source of your large "clumpiness" is the quality of your flats and not dithering properly - I can see a fair bit of unevenness in your background, including some clearly identifiable dust donuts.

 

As you progress (and create better quality datasets), you should come to appreciate the ability of programs to retain such faint features rather than destroy them. A good noise reduction routine will reduce the noise in/around these "features" but will not remove the features themselves. For now, the only advice I have for unceremoniously "hiding" features that are not real is to simply stretch the background less (try the Shadow Linearity parameter in AutoDev).

 

In short, the problem you are dealing with is not noise; it is poor calibration.

 

In addition (but unrelated), small-scale noise grain is multi-pixel in size, showing up as strings/worms and little clumps. The latter also impacts how well noise reduction and detail enhancement routines will be able to function.

 


I think what I did was the following.

 

1) Photometric color calibration in Siril. Then Startools.

2) 2x2 Bin followed by Crop.

3) No Wipe (hence the Siril calibration.)

Never do any sort of processing in any application prior to opening your dataset in StarTools. For example, don't do color calibration in Siril. Color balancing distorts noise levels and makes noise bleed into other channels. (and PCC on L-Enhance filtered datasets doesn't make much sense to begin with!)

 

Wipe is mandatory. It should never be skipped. See here for what a recommended workflow looks like and what is (M)andatory and (S)uggested.

 

 

I am sorry, I just don't have a very good understanding of how Wipe works, or how it can even work on nebula photos, where there is signal from interstellar gas everywhere. I am a reasonably intelligent person (I am a PhD researcher), and I've read the manuals and guides, and I still have no intuitive understanding of what constitutes a good Wipe, or whether I wipe out some image detail when I use it.

Hmmm... The documentation actually delves into this specifically (see "Sample revocation" and "Design philosophy and limitations")?

It details how Wipe was designed specifically to avoid the tendency of most other gradient removal tools to destroy faint nebulosity. Most/all images don't have an "empty" background; there is always faint signal there. Ergo, where other algorithms force you to put samples on that not-quite-empty background, Wipe does not. It uses robust, non-subjective, algorithmic analysis and reconstruction instead.

 

Wipe creates gradient models based on undulation frequency exclusion; it makes sure detail that undulates faster than a specific frequency is not considered for the background model, while minima in slower undulating detail may be considered.
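[Editor's note: a crude stand-in for the undulation-frequency idea, not Wipe's actual algorithm: a strong low-pass filter keeps only slowly varying components, which can then serve as a background model. The synthetic gradient and detail below are assumptions for illustration.]

```python
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(5)

# Slow undulation (a light-pollution-style gradient) plus fast
# undulation (detail/noise changing pixel to pixel).
y, x = np.mgrid[0:256, 0:256]
gradient = 0.3 * x / 255 + 0.2 * y / 255
detail = 0.1 * rng.random((256, 256))
image = gradient + detail

# A heavy blur passes only the slow frequencies: a background model.
background = gaussian_filter(image, sigma=32)
flattened = image - background

# Large-scale drift is mostly gone; the fast-undulating detail
# survives the subtraction.
print(image.mean(axis=1).std(), flattened.mean(axis=1).std())
```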
 

4) AutoDev, with a small POI centered on the brightest 10% of the nebula. That part of the nebula gets blown out and loses detail if I choose a different POI.
5) Sharp followed by Shrink. Default settings.
6)  Then noise reduction. I think I used grain size of 10, brightness detail loss of 65 and scale correlation of 40.

7) StarNet++, and then I used Gimp to tweak the image brightness, colors, saturation etc.
Do you see any immediate issues?

Don't use other applications after StarTools if you don't have to. Learning how to do everything within StarTools will allow Tracking to keep... track of what you're doing and how it affected noise. If you're new to astrophotographical signal processing, it also helps make sure the commutative property is respected in the signal flow (for example, most color manipulations after stretching make no sense mathematically or photographically).

 

Avoid Starnet in particular in your workflow (unless you wish to present starless images of course); it just introduces artifacts and taints the documentary value of your image (it is usually easy to tell when it was used).

 

Your stars show some aberrations where cores are overexposed, causing "rings". This can happen when the stacked frames were taken through a significantly varying atmosphere (a tell-tale sign is stars having large "halos" from the start). A hazy atmosphere can certainly explain this.

 

Open on an empty stomach.

All data is worthy! As long as it is properly calibrated. But shot noise should be your only problem. If you go deeper and wish to show the faintest of nebulosity, you need to be able to trust the data you acquired and stacked. Right now, neither humans nor algorithms can trust this data to the point where the only uncertainty in the signal is the shot noise.

 

All in all, (I think) all the pitfalls mentioned above are highlighted here;

https://www.startool...g-dos-and-donts

Try addressing these to the best of your abilities.

 

Respecting best practices (fortunately the dataset seems to be just stacked and not meddled with), you actually have enough signal to, for example, use deconvolution and restore detail;

 

Selection_645sbs.jpg

 

Hope this helps!


Edited by Ivo Jager, 28 July 2021 - 08:46 PM.

  • happylimpet, 42itous1, Mike in Rancho and 1 other like this

#12 acrh2

acrh2

    Ranger 4

  • -----
  • topic starter
  • Posts: 374
  • Joined: 16 Mar 2021

Posted 29 July 2021 - 05:07 PM


Thank you for a very detailed response. 

Here are my thoughts on this issue.

Thank you for sharing your dataset. It is very helpful.

The clumpiness is a real feature of your dataset. The unevenness is there from the very beginning. An initial AutoDev (straight after loading) shows this;

I think the source of your large "clumpiness" is the quality of your flats and not dithering properly - I can see a fair bit of unevenness in your background, including some clearly identifiable dust donuts.

 

Guilty as charged on the flats. I thought that I could get away without flats because my sensor is small, and because I clean the optics often to get rid of the dust. Clearly not enough this time. Would you recommend using a Galaxy Tab S2 tablet as a flat panel vs. not using flats at all? It's got a nice AMOLED screen, and I can get a sheet of white plastic to attenuate the light intensity and distribution.

 

I have no idea what you mean by "not dithering properly." I set it to dither every 2 frames for 2 min subs and every frame for 4 min subs. The dithering step is up to 25 imaging pixels, the pattern is random. What else can I do?

 

As you progress (and create better quality datasets), you should come to appreciate the ability of programs to retain such faint features rather than destroy them. A good noise reduction routine will reduce the noise in/around these "features" but will not remove the features themselves. For now, the only advice I have for unceremoniously "hiding" features that are not real is to simply stretch the background less (try the Shadow Linearity parameter in AutoDev).

 

I'll try that. Thanks.

 

In short, the problem you are dealing with is not noise; it is poor calibration.

In addition (but unrelated), small-scale noise grain is multi-pixel in size, showing up as strings/worms and little clumps. The latter also impacts how well noise reduction and detail enhancement routines will be able to function.

Never do any sort of processing in any application prior to opening your dataset in StarTools. For example, don't do color calibration in Siril. Color balancing distorts noise levels and makes noise bleed into other channels. (and PCC on L-Enhance filtered datasets doesn't make much sense to begin with!)

Wipe is mandatory. It should never be skipped. See here for what a recommended workflow looks like and what is (M)andatory and (S)uggested.

 

Got it.

 

Hmmm... The documentation actually delves into this specifically (see "Sample revocation" and "Design philosophy and limitations")?

It details how Wipe was designed specifically to avoid the tendency of most other gradient removal tools to destroy faint nebulosity. Most/all images don't have an "empty" background; there is always faint signal there. Ergo, where other algorithms force you to put samples on that not-quite-empty background, Wipe does not. It uses robust, non-subjective, algorithmic analysis and reconstruction instead.

 

Wipe creates gradient models based on undulation frequency exclusion; it makes sure detail that undulates faster than a specific frequency is not considered for the background model, while minima in slower undulating detail may be considered.

 

This is where we are going to have to disagree - the manual. 

Here's a quote from it: "Wipe discerns gradient from real detail by estimating undulation frequency. In a nut shell, real detail tends to change rapidly from pixel to pixel, whereas gradients do not. The 'Aggressiveness' specifies the undulation threshold, whereby higher 'Aggressiveness' settings latch on to ever faster undulating gradients. "

 

With all due respect, this is mumbo-jumbo. Aggressiveness seems to be one of the most important settings in Wipe, and it produces WILDLY different results with even the slightest modification, yet there's absolutely no way that I found to determine if any of those outcomes are better than the other. And the excerpt from the manual doesn't help at all. It would be a great help if you could actually make a video of how to use Wipe properly. 

Also, these very technical terms and phrases, which mean absolutely nothing to someone who has never seen anything outside of Microsoft Paint, appear often in the manual, in other modules as well. And there are almost no videos on YouTube that actually explain the modules. I've seen all of the videos on StarTools, and I've read the manual and many guides, and I am still no closer to understanding how some of the sliders affect the image, or whether one outcome is better than another.

 

Don't use other applications after StarTools if you don't have to. Learning how to do everything within StarTools allows Tracking to keep... track of what you're doing and how it affected noise. If you're new to astrophotographical signal processing, it also helps make sure the commutative property is respected in the signal flow (for example, most color manipulations after stretching make no sense mathematically or photographically).

Avoid Starnet in particular in your workflow (unless you wish to present starless images of course); it just introduces artifacts and taints the documentary value of your image (it is usually easy to tell when it was used).

 

I would love to master Startools. I've spent countless hours playing with same datasets, and I still feel like I know nothing. Teach me, sensei!

 

The stellar profiles of your stars show some aberrations in stars with over-exposed cores, causing "rings". This can be explained by stacking frames during which the atmosphere varied significantly (a tell-tale sign is usually when stars have large "halos" from the start). A hazy atmosphere can certainly explain this.

All data is worthy! As long as it is properly calibrated. But shot noise should be your only problem. If you go deeper and wish to show the faintest of nebulosity, you need to be able to trust the data you acquired and stacked. Right now, neither humans nor algorithms can trust this data to the point where the only uncertainty in the signal is the shot noise.

 

Yeah, on top of not having flats, these were the worst possible conditions, and the target wasn't the greatest either. So I wasn't expecting much to begin with. The fact that I got something is, well, something. Would love to improve it though.

 

 

All in all, (I think) all the pitfalls mentioned above are highlighted here:

https://www.startool...g-dos-and-donts

Try addressing these to the best of your abilities.

Respecting best practices (fortunately the dataset seems to be just stacked and not meddled with), you actually have enough signal to, for example, use deconvolution and restore detail.

 

Hope this helps!

 

 

Thank you, sir. I will take your advice.



#13 Ivo Jager

Ivo Jager

    Vendor ( Star Tools )

  • *****
  • Vendors
  • Posts: 522
  • Joined: 19 Mar 2011
  • Loc: Melbourne, Australia

Posted 29 July 2021 - 11:30 PM

Guilty as charged on the flats. I thought that I could get away without flats because my sensor is small, and because I clean the optics often to get rid of the dust. Clearly not enough this time. Would you recommend using a Galaxy Tab S2 tablet as a flat panel vs. not using flats at all? It's got a nice AMOLED screen, and I can get a sheet of white plastic to attenuate the light intensity and distribution.

Flats are really not optional unfortunately... The tablet might work, though the "white T-shirt" method is tried and tested as well.
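To see why division (rather than subtraction) is what flats buy you: vignetting and dust shadows attenuate light multiplicatively, so dividing by a normalized master flat undoes them exactly. A minimal synthetic sketch of the principle (darks and bias omitted for brevity; not any stacker's actual implementation):

```python
import numpy as np

# Synthetic sketch of flat-field calibration (darks/bias omitted).
# Vignetting is multiplicative, so it is removed by division.
true_sky = np.full((100, 100), 1000.0)          # uniform sky signal
yy, xx = np.mgrid[0:100, 0:100]
vignette = 1.0 - 0.3 * ((xx - 50)**2 + (yy - 50)**2) / 50**2  # light falloff

light = true_sky * vignette                      # what the sensor records
flat = 20000.0 * vignette                        # flat sees the same falloff

master_flat = flat / flat.mean()                 # normalize to unity mean
calibrated = light / master_flat                 # corners match center again
```

In practice you would build the master flat from many dark/bias-subtracted flat frames, but the multiplicative logic is the same, which is why no amount of background subtraction in post-processing can substitute for it.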

 

 

I have no idea what you mean by "not dithering properly." I set it to dither every 2 frames for 2 min subs and every frame for 4 min subs. The dithering step is up to 25 imaging pixels, the pattern is random. What else can I do?

 

Hmmm... That's really strange. As you can see, the noise is not confined to a single pixel but is clearly correlated with neighboring pixels (stringy/wormy appearance).


 

This should not be occurring with sufficiently dithered data.
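To illustrate why dithering matters here (a toy 1-D sketch of my own, not a model of any particular camera): fixed-pattern noise that lands on the same pixels in every sub survives averaging untouched, while random sub-to-sub offsets from dithering spread it around so it averages down like random noise.

```python
import numpy as np

# Toy 1-D sketch: fixed-pattern noise with and without dithering.
rng = np.random.default_rng(7)
pattern = rng.normal(0, 5, size=256)      # fixed sensor pattern noise

n_subs = 100
# No dithering: the pattern aligns in every sub and survives the average.
no_dither = np.mean([pattern for _ in range(n_subs)], axis=0)
# Dithering: each sub lands at a random offset (np.roll stands in for
# dither + registration), so the pattern averages down by ~sqrt(n_subs).
dithered = np.mean([np.roll(pattern, int(rng.integers(0, 256)))
                    for _ in range(n_subs)], axis=0)

print(no_dither.std(), dithered.std())    # pattern survives vs. averages down
```

Residual correlated (stringy/wormy) structure after a properly dithered stack usually points at the dither not actually being applied, or at registration/rejection issues rather than the camera.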

 

 

 

This is where we are going to have to disagree - the manual.

Here's a quote from it: "Wipe discerns gradient from real detail by estimating undulation frequency. In a nut shell, real detail tends to change rapidly from pixel to pixel, whereas gradients do not. The 'Aggressiveness' specifies the undulation threshold, whereby higher 'Aggressiveness' settings latch on to ever faster undulating gradients. "

With all due respect, this is mumbo-jumbo.

 

 

This is a (very) high level allusion to the concept of wavelet decomposition, and how Wipe separates detail at different scales (large scale = gradient, small scale = celestial detail).
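For anyone curious, the scale-separation idea can be sketched in a few lines. This uses a heavy Gaussian blur as a crude stand-in for "keep only the large scales"; it is emphatically NOT Wipe's actual algorithm, just the underlying intuition:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

# Crude scale-separation sketch (NOT Wipe's actual algorithm):
# a heavy blur keeps only slowly undulating structure (the gradient),
# analogous to keeping only the large wavelet scales.
h, w = 64, 64
yy, xx = np.mgrid[0:h, 0:w]
frame = 0.5 * xx / w                      # slow gradient across the frame
frame[30:34, 30:34] += 1.0                # fast-undulating "real" detail

background_model = gaussian_filter(frame, sigma=16)
flattened = frame - background_model      # gradient gone, detail kept
```

The "Aggressiveness" threshold in this analogy would govern how fast something may undulate before it is excluded from the background model.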

 

 

 

Aggressiveness seems to be one of the most important settings in Wipe, and it produces WILDLY different results with even the slightest modification,

 

 

Hmmm... that sounds like a bug...

The Aggressiveness parameter should work gradually and reliably.

 

 

yet there's absolutely no way that I found to determine if any of those outcomes are better than the other.

 

Perhaps the diagnostics stretch is throwing you off? It will relentlessly keep showing you remaining "issues" with your image and will stretch your data into oblivion to show them.

 

Does this post on the forums help?

 

 

Also, these very technical terms and phrases, which mean absolutely nothing to someone who has never seen anything outside of Microsoft Paint, appear in the manual often, in other modules as well.

 

 

Unfortunately, at some stage, technical terms and phrases are a necessity. The manual tries to explain how parameters in StarTools relate to common concepts in signal and image processing. To truly understand what a parameter governs, more background reading will be necessary. This is a bit outside the scope of a manual for software that implements these concepts into a coherent whole. You are unfortunately expected to bring some knowledge along to understand what a module does and why. I try my hardest not to force this onto a user by allowing you to get by even if "gamma", "wavelet" or "gradient undulation" means nothing to you.

 

Of course, if it is still unclear, even with the requisite background knowledge, what a parameter does or how a module works in broad lines, then I'm definitely failing.

 

Look, I feel your pain - I really do. I hate how tricky/difficult this stuff is for newbies. The learning curve is incredibly steep. It has led me to put up stuff like the "starting with a good dataset" section. It's really just general advice for achieving clean stacks, ISO settings, even guides for other software that has nothing to do with StarTools per se.

 

I would love to master Startools. I've spent countless hours playing with same datasets, and I still feel like I know nothing. Teach me, sensei!

 

At the end of the day, StarTools is just a tool. It just does what the operator tells it to. It cannot teach you what you need to know for effective acquisition and image processing. It only facilitates the process.

 

Endeavor to understand how a photon travels all the way to a pixel in a finished image; what happens to it along the way, and why the various best practices and algorithms exist to make that happen.

 

Once you understand the journey a photon takes (photon->electron->data number->pixel-on-a-screen), it becomes easy, even "zen", to process images; you will know exactly which next step to take and why. If using StarTools for that, you will typically find you can leave most parameters alone, but first you must understand why these parameters exist in the first place.
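As a concrete example of the statistics behind that journey (my illustration, not from the thread): photon arrival is a Poisson process, so shot noise equals the square root of the signal, and SNR grows only with the square root of total integration. That is why "just add more subs" works, but slowly:

```python
import numpy as np

# Sketch: photon arrival is Poisson, so SNR ~ sqrt(total photons).
# Each 10x increase in integration buys only ~3.2x (sqrt(10)) in SNR.
rng = np.random.default_rng(42)
flux = 5.0                                # photons per sub (faint nebulosity)

for n_subs in (10, 100, 1000):
    subs = rng.poisson(flux, size=(n_subs, 10000))
    stacked = subs.mean(axis=0)           # per-pixel average of the stack
    snr = stacked.mean() / stacked.std()  # expected: sqrt(flux * n_subs)
    print(f"{n_subs:5d} subs -> SNR ~ {snr:.1f}")
```

Once this relationship is internalized, decisions like "is 4 more hours worth it?" become arithmetic rather than guesswork.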

 

Sometimes a good culture shock can help. You could, for example, try PixInsight. It is a fantastic way to be forced to understand what you are doing and why for every single module/script in a traditional input->output->input->output application.

 

Once you understand what is happening to signal, getting your head around (and appreciating) StarTools' modules and engine innovations will be so much easier and revelatory.


  • Mike in Rancho likes this

#14 Mike in Rancho

Mike in Rancho

    Apollo

  • -----
  • Posts: 1,411
  • Joined: 15 Oct 2020
  • Loc: Alta Loma, CA

Posted 30 July 2021 - 02:00 AM

It takes a while, acrh2.  I think I'm 9 months into ST, and still coming to grips with a lot of stuff.  And there's more I haven't even tried yet.  And even though it processes in a bit of a different manner, you still need to learn all the language of image and photon processing, same as PI or PS users.

 

In the beginning I was just moving sliders around, and with the defaults and a few nudges you can often come up with a fair result (depending on just how jacked the data is, of course).  But with more time spent you build experience and start to see how the data is likely to react.

 

That said, as you can see, I also chimed in on that Wipe thread Ivo linked, and have been starting to use it less aggressively.  Fix the gradients and leftover calibration problems, and aim for an even field in areas of no actual detail, rather than trying to use Wipe for noise removal.  Then stretch with AutoDev to the point the data can handle it, and don't stretch out and emphasize your background noise.

 

A lot of times I'll have the PDF of the manual open at the same time I am processing, to help understand what various controls are actually doing.  It's a pretty helpful manual, and is organized and bookmarked fairly well.

 

A few things on this particular data - 4 hours isn't necessarily a whole lot at f/11, particularly for the more faint areas surrounding the Crescent.  I think I took 3 hours on it, at my f/9 and brighter sky, and also ended up thin on the outer and fainter detail.  So more time may help.

 

Maybe also restack the data using ST guidelines.  No photometric whatever.  I don't even know how that would work when the data is just OIII and Ha lines anyway, but my guess is it will mangle the color channel balance.  And that means that you really can't properly utilize the middle file opening option and process through in a synthetic luminance, the way one should.  Maybe we could take a look at a new stack then.  Also consider tightening the kappa rejection if maybe too much stray junk was getting through into the final stack?
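On the kappa rejection point: kappa-sigma clipping iteratively discards pixel values more than kappa standard deviations from the stack mean before averaging, so a lower (tighter) kappa throws out more stray junk like satellite trails and hot pixels. A generic sketch for a single pixel's stack (not any particular stacker's implementation):

```python
import numpy as np

# Generic kappa-sigma rejection for one pixel stack (illustrative only).
# Tightening kappa rejects more outliers before the final average.
def kappa_sigma_mean(values, kappa=2.5, iterations=3):
    values = np.asarray(values, dtype=float)
    for _ in range(iterations):
        mu, sigma = values.mean(), values.std()
        keep = np.abs(values - mu) <= kappa * sigma
        if keep.all():
            break
        values = values[keep]
    return values.mean()

# A pixel stack with one satellite/plane hit at 5000:
stack = [102, 98, 101, 99, 100, 5000, 97, 103]
print(kappa_sigma_mean(stack))   # the 5000 outlier is rejected
```

The trade-off: set kappa too tight and you start clipping legitimate faint signal, so it is worth comparing stacks at a couple of settings.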

 

Another possibility, once in ST, is to bin down even more, say the 25 or 35 option (or custom bin %), and perhaps forego deconvolution because of the scale.  Unfortunately 3K x 3K pixels only has so much headroom for doing that.  But it might also help with any stringiness or patchiness.
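For reference, software binning is just block-averaging: a 2x2 bin averages four pixels, halving the per-pixel noise (sqrt(4)) at the cost of half the linear resolution, which is why it can tame patchiness but limits how much deconvolution makes sense afterwards. A quick sketch:

```python
import numpy as np

# Sketch of 2x2 software binning: each output pixel is the mean of a
# 2x2 block, trading resolution for ~2x lower per-pixel noise.
def bin2x2(img):
    h, w = img.shape
    return img[:h//2*2, :w//2*2].reshape(h//2, 2, w//2, 2).mean(axis=(1, 3))

rng = np.random.default_rng(1)
noisy = 100 + rng.normal(0, 8, size=(512, 512))   # flat field + noise

binned = bin2x2(noisy)
print(noisy.std(), binned.std())   # noise drops roughly from 8 to 4
```

StarTools' fractional bin percentages generalize this idea, but the resolution-for-SNR trade is the same.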


Edited by Mike in Rancho, 30 July 2021 - 02:02 AM.

  • Ivo Jager likes this

