So Frank, I agree with you in principle about changing pixels from their original values - "...Any change in a pixel value by a routine that wants to alter the image into something it thinks should be there is by definition an artifact..." But your statement seems an extreme statement of the principle. For example, we stack sub-frames to achieve a noise-reduced result, so each pixel is, as such, modified by its treatment (median, average, etc.). What about pixel rejection of aircraft or satellite streaks? For that matter, what about a simple histogram stretch? There are likely many other examples of such routines - all of them routines many of us in astro imaging use regularly.
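To make the stacking point concrete, here is a minimal NumPy sketch (entirely hypothetical data and values) showing that both mean and median stacking change every pixel from its single-exposure value, while still reducing noise:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated stack of 10 registered sub-frames: a constant "true" signal
# plus Gaussian read noise (purely illustrative numbers).
true_signal = 100.0
subs = true_signal + rng.normal(0.0, 5.0, size=(10, 64, 64))

# Average stack: noise drops roughly as 1/sqrt(N) for Gaussian noise.
avg_stack = subs.mean(axis=0)

# Median stack: statistically a bit less efficient, but robust to an
# outlier such as a satellite streak landing in a single sub-frame.
med_stack = np.median(subs, axis=0)

# Both results sit well below the 5.0 per-frame noise level.
print(avg_stack.std(), med_stack.std())
```

Every output pixel here differs from the corresponding pixel of any one sub-frame, which is the sense in which stacking "modifies" pixels.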
Where do we draw the line? Many of the routines we use regularly for post-processing can be misused by the imager to the point where they create "serious" artifacts, yet we generally don't reject those routines just because someone misused them. Perhaps you object to the development of deep-learning neural network routines? These routines could conceivably lead the inexperienced imager to accept essentially false results as fact.
So again, I agree with you that I would definitely not want to see these routines mainstreamed and misused on a large scale - misuse that produces object features that are not there, with some claiming to find "new" nonexistent details on mostly static objects, claims that could easily be refuted by publicly available professional data and images.
Thanks for expressing needed caution about potentially overzealous noise reduction and/or sharpening routines.
In astronomical imaging there is a phase of pre-processing that is deterministic and driven by noise models and statistics. And it is almost all happening at the pixel level. Rejecting satellite trails or cosmic rays is part of that noise model, where you reject outliers as expected events in imaging with known causes. And that process amounts to rejecting data that you deem not to be valid - as opposed to selectively changing values based on what you think they should be.
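The outlier-rejection idea above can be sketched as simple sigma clipping across the stack (a hypothetical NumPy-only version; real pipelines use more careful statistics, e.g. Astropy's sigma_clip). The key point it illustrates: flagged values are discarded, never replaced with invented ones:

```python
import numpy as np

rng = np.random.default_rng(1)

# 10 registered sub-frames of a flat patch of sky with Gaussian noise.
subs = 100.0 + rng.normal(0.0, 5.0, size=(10, 32, 32))

# Inject a "satellite trail": one sub-frame gets a bright row.
subs[3, 16, :] += 500.0

# Sigma clipping: flag samples far from the per-pixel median, then
# average only the surviving samples. This rejects data deemed invalid
# under the noise model - it does not synthesize replacement values.
med = np.median(subs, axis=0)
sigma = subs.std(axis=0)
mask = np.abs(subs - med) < 3.0 * sigma          # True = keep
clipped_mean = (subs * mask).sum(axis=0) / mask.sum(axis=0)

plain_mean = subs.mean(axis=0)
# The trail survives a plain average but is rejected by clipping.
print(plain_mean[16].max(), clipped_mean[16].max())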
Once you have processed and stacked the exposures you can apply linear and even nonlinear global operations - in order to make the data visible. This has to be done in order for the image to be viewable - and since all operations are global there is no chance to bias or steer the result toward something you want to see - that isn't actually there.
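A global nonlinear stretch in this sense might look like the following (a hypothetical asinh stretch: one monotonic function applied identically to every pixel, so pixel ordering is preserved and no local structure can be introduced):

```python
import numpy as np

def asinh_stretch(img, black=0.0, soft=50.0):
    """Global nonlinear stretch: the same monotonic function is applied
    to every pixel, independent of its neighbors, so the operation
    cannot steer the result toward something that isn't there."""
    x = np.clip(img - black, 0.0, None)
    y = np.arcsinh(x / soft)
    return y / y.max()          # normalize to [0, 1] for display

rng = np.random.default_rng(2)
linear = rng.gamma(2.0, 200.0, size=(16, 16))   # fake linear stacked data
display = asinh_stretch(linear)
```

Because the mapping is monotonic and global, a pixel that was brighter than its neighbor before the stretch is still brighter after it - only the overall tone curve changes.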
So by 'artifact' I mean any change at that point that is local and based on what is going on in the vicinity of a pixel - whether it is an artist wanting to make the image look prettier there, or an algorithm using what it has "learned" to fix up a patch of pixels so they look different. There is no noise model involved, it isn't rejecting invalid data, it isn't acting based on what is happening in a single column of pixels, and it is making a local change based on what it has 'learned' from other images.
The above summarizes the rules for image submission in top journals - and a key aspect is that local, selective changes have not been made. Even smoothing or sharpening is discouraged - but as long as it is done in a global way it may be ok. Anything with the word 'adaptive' in it probably isn't.
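For what it's worth, "global" smoothing in this sense means one fixed kernel convolved identically over the whole frame - nothing in the operation adapts to local image content. A minimal sketch with a hypothetical 3x3 box blur:

```python
import numpy as np

def global_box_blur(img):
    """Smooth with one fixed 3x3 kernel applied identically everywhere.
    'Global' in the sense above: the operation never changes its
    behavior based on local image content, unlike 'adaptive' filters."""
    padded = np.pad(img, 1, mode="edge")
    out = np.zeros_like(img, dtype=float)
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            out += padded[1 + dy : 1 + dy + img.shape[0],
                          1 + dx : 1 + dx + img.shape[1]]
    return out / 9.0

img = np.arange(25, dtype=float).reshape(5, 5)
smoothed = global_box_blur(img)
```

An "adaptive" filter, by contrast, would choose its kernel per pixel from the surrounding data - exactly the locality the journal rules are wary of.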
So I think the above summarizes a very clear line - and it isn't just my line - it is the line drawn by top journals.
If you ignore that line then I think there really is a slippery slope where an image I see that looks nice and compelling - isn't really an "image" - it is an artistic reinterpretation with no holds barred on what was done to create it. It's not so bad if all the processing is spelled out - but with this new stuff I'm concerned where it is headed.
For me, and at least some others, knowing that these additional manipulations have been applied really detracts from the impact an image has - because I don't have a sense I am looking at an actual image captured under the sky.
As I said in my post - I'm not clear where top journals will go with this stuff. It is one thing to use these methods in medical imaging in order to aid segmenting tissue, for example, but for "denoising" astronomical images so they look cleaner - I'm not sure where that will go. There are certainly papers from people working in those areas - but I don't know if top journals are allowing it.
Obviously this stuff isn't my cup of tea - but it's still interesting to me and I'd like to see where it goes. And if it does get endorsed as a valid way to "denoise" and even bring out more detail that isn't there - I'd like to hear more about it. And if people want to do this stuff and post examples that's also fine with me. But I sure wish people would post more examples *without* the extra processing and before applying local manipulations. Every image in CN was at one point in a form that met the criteria above - and in terms of seeing how well different setups and techniques capture good data - that would be very informative. And it lets the object in the sky speak for itself.