Thank you for sharing your dataset. It is very helpful.
The clumpiness is a real feature of your dataset; the unevenness is there from the very beginning. An initial AutoDev (straight after loading) shows this:
I think the source of your large "clumpiness" is the quality of your flats and improper dithering - I can see a fair bit of unevenness in your background, including some clearly identifiable dust donuts.
As you progress (and create better quality datasets) you will start to appreciate the ability of programs to retain such faint features, rather than destroy them. A good noise reduction routine will reduce the noise in and around these "features" but will not remove the features themselves. For now, the only advice I have for unceremoniously "hiding" features that are not real is to simply stretch the background less (try the Shadow Linearity parameter in AutoDev).
In short, the problem you are dealing with is not noise - it is poor calibration.
In addition (but unrelated), your small-scale noise grain is multiple pixels in size, showing up as strings/worms and little clumps. This also impacts how well noise reduction and detail enhancement routines are able to function.
I think what I did was the following.
1) Photometric color calibration in Siril. Then StarTools.
2) 2x2 Bin followed by Crop.
3) No Wipe (hence the Siril calibration).
Never do any sort of processing in any application prior to opening your dataset in StarTools. For example, don't do color calibration in Siril. Color balancing distorts noise levels and makes noise bleed into other channels. (and PCC on L-Enhance filtered datasets doesn't make much sense to begin with!)
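To illustrate the noise-distortion point (with made-up numbers, not anything specific to Siril or StarTools): multiplying a channel by a color-balance gain multiplies its noise standard deviation by the same factor, so the per-channel noise statistics established at stacking time no longer hold.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a flat patch of sky in one color channel: constant signal + noise.
signal = 100.0
noise_sigma = 5.0
channel = signal + rng.normal(0.0, noise_sigma, size=100_000)

# A color-balance step multiplies the channel by a gain (illustrative value).
gain = 1.8
balanced = channel * gain

# The noise standard deviation scales by the same gain, so the per-channel
# noise profile a later noise reduction stage expects is no longer accurate.
print(round(channel.std(), 1))   # ~5.0
print(round(balanced.std(), 1))  # ~9.0
```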
Wipe is mandatory. It should never be skipped. See here for what a recommended workflow looks like and what is (M)andatory and (S)uggested.
I am sorry, I just don't have a very good understanding of how Wipe works, or of how it can even work on nebula photos, where there is signal from interstellar gas everywhere. I am a reasonably intelligent person (I am a PhD researcher), and I've read the manuals and guides, and I still have no intuitive understanding of what constitutes a good Wipe, or of whether I wipe out some image detail when I use it.
Hmmm... the documentation actually delves into this specifically (see "Sample revocation" and "Design philosophy and limitations").
It details how Wipe was designed specifically to avoid the tendency of most other gradient removal tools to destroy faint nebulosity. Most/all images don't have an "empty" background; there is always faint signal there. Ergo, where other algorithms force you to put samples on that not-quite-empty background, Wipe does not. It uses robust, non-subjective, algorithmic analysis and reconstruction instead.
Wipe creates gradient models based on undulation frequency exclusion; it makes sure detail that undulates faster than a specific frequency is not considered for the background model, while minima in slower-undulating detail may be considered.
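As a rough sketch of that idea (my own toy illustration, not StarTools' actual algorithm; the image values and block size are arbitrary): build a background model only from local minima taken over windows larger than the fastest detail you want to protect, so small-scale features cannot enter the model.

```python
import numpy as np

# Toy image: a slow gradient (to be modeled and removed) plus a small,
# fast-undulating "nebulosity" feature (to be preserved).
yy, xx = np.mgrid[0:256, 0:256]
image = 0.002 * xx + 0.001 * yy
image[100:110, 100:110] += 0.5          # small-scale feature

# Only slow undulations may enter the background model: take minima over
# blocks larger than the fastest detail we want to protect (32 px here),
# then expand back to full size (nearest-neighbour; a real tool would
# interpolate the model smoothly).
block = 32
mins = image.reshape(256 // block, block, -1, block).min(axis=(1, 3))
background = np.repeat(np.repeat(mins, block, axis=0), block, axis=1)

wiped = image - background
print(wiped[105, 105] > 0.4)              # True: the small feature survives
print(np.ptp(wiped[0]) < np.ptp(image[0]))  # True: the gradient is mostly gone
```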
4) AutoDev, with a small POI centered on the brightest 10% of the nebula. That part of the nebula gets blown out and loses detail if I choose a different POI.
5) Sharp followed by Shrink. Default settings.
6) Then noise reduction. I think I used a grain size of 10, a brightness detail loss of 65 and a scale correlation of 40.
7) StarNet++, and then I used Gimp to tweak the image brightness, colors, saturation etc.
Do you see any immediate issues?
Don't use other applications after StarTools if you don't have to. Learning how to do everything within StarTools will allow Tracking to keep... track of what you're doing and how it affected noise. If you're new to astrophotographical signal processing, it also helps make sure the commutative property is respected in the signal flow (for example, most color manipulations after stretching make no sense mathematically or photographically).
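A tiny example of that commutativity point (the stretch curve and numbers here are made up, not StarTools' actual stretch): applying a color-balance gain after a nonlinear stretch is not the same operation as applying it before, so post-stretch color manipulation no longer means what you think it means.

```python
import numpy as np

def stretch(x):
    # Any nonlinear stretch will do; an asinh-style curve is used here.
    return np.arcsinh(10.0 * x)

pixel = np.array([0.02, 0.2, 0.5])   # made-up linear R, G, B values
gain = 1.5                           # made-up color-balance gain

before = stretch(gain * pixel)   # balance in the linear domain
after = gain * stretch(pixel)    # "balance" after stretching
print(np.allclose(before, after))  # False: the operations don't commute
```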
Avoid StarNet++ in particular in your workflow (unless you wish to present starless images, of course); it just introduces artifacts and taints the documentary value of your image (it is usually easy to tell when it was used).
The stellar profiles of your stars show some aberrations in stars with over-exposed cores, causing "rings". This can be explained by stacking frames during which the atmosphere varied significantly (a tell-tale sign is usually stars that have large "halos" from the start). A hazy atmosphere can certainly explain this.
Open on an empty stomach.
All data is worthy, as long as it is properly calibrated! But shot noise should be your only problem. If you go deeper and wish to show the faintest of nebulosity, you need to be able to trust the data you acquired and stacked. Right now, neither humans nor algorithms can trust this data to the point where the only uncertainty in the signal is the shot noise.
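For intuition on why "shot noise as your only problem" is such a good place to be (illustrative numbers, simulated with Poisson statistics): shot noise on a mean count N has standard deviation sqrt(N), so SNR = sqrt(N), and stacking k frames improves it by sqrt(k) - a predictable, well-behaved uncertainty, unlike calibration errors.

```python
import numpy as np

rng = np.random.default_rng(1)
mean_per_frame = 100.0   # illustrative mean photon count per pixel per frame

snrs = []
for frames in (1, 4, 16):
    # Sum of Poisson counts over `frames` stacked frames, for many pixels.
    counts = rng.poisson(mean_per_frame, size=(frames, 100_000)).sum(axis=0)
    snrs.append(counts.mean() / counts.std())

print([round(s, 1) for s in snrs])  # roughly [10.0, 20.0, 40.0]: SNR ~ sqrt(N)
```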
All in all, (I think) all the pitfalls mentioned above are highlighted here:
Try addressing these to the best of your abilities.
Respecting best practices (fortunately the dataset seems to be just stacked and not meddled with), you actually have enough signal to, for example, use deconvolution and restore detail:
Hope this helps!
Edited by Ivo Jager, 28 July 2021 - 08:46 PM.