Lunar images from November 27, 2018, Apollo 11 landing site and more

44 replies to this topic

#26 MADRID SKY

    Viking 1

  • Posts: 680
  • Joined: 20 Mar 2007
  • Loc: Madrid, SPAIN

Posted 06 December 2018 - 02:59 PM

Tareq: boy, be careful with your ASI cameras... :) I envy your desert weather conditions in Arabia... you are from Arabia, aren't you?

 

Tom: thank you for the trick in AS!3! That's so kind of you. I will try it in the coming months... :) What I don't get is the huge number of frames that you stack. With the Moon I find that counter-productive. You usually stack a lot of frames to get better signal-to-noise on low-contrast targets... but the Moon should be the opposite. You use as few frames as possible to "freeze" the seeing and get a clean result. The more frames in your stack, the more artifacts you get and the worse the result. With a proper capture setup, I don't think it is wise to stack more than 5-10 frames. I really don't understand the point. I always capture at 1 ms with the Moon... and I stack only 2-5 frames!! When you say you stack 50-100, that seems a monstrous number of frames to me... What's the deal?



#27 TareqPhoto

    Skylab

  • Posts: 4265
  • Joined: 20 Feb 2017
  • Loc: Ajman - UAE

Posted 06 December 2018 - 03:26 PM

Tareq: boy, be careful with your ASI cameras... :) I envy your desert weather conditions in Arabia... you are from Arabia, aren't you?

 

Tom: thank you for the trick in AS!3! That's so kind of you. I will try it in the coming months... :) What I don't get is the huge number of frames that you stack. With the Moon I find that counter-productive. You usually stack a lot of frames to get better signal-to-noise on low-contrast targets... but the Moon should be the opposite. You use as few frames as possible to "freeze" the seeing and get a clean result. The more frames in your stack, the more artifacts you get and the worse the result. With a proper capture setup, I don't think it is wise to stack more than 5-10 frames. I really don't understand the point. I always capture at 1 ms with the Moon... and I stack only 2-5 frames!! When you say you stack 50-100, that seems a monstrous number of frames to me... What's the deal?

What do you mean by "be careful with your ASI cameras"?

 

And yes, I am from Arabia; you are welcome here :)



#28 MADRID SKY

    Viking 1

  • Posts: 680
  • Joined: 20 Mar 2007
  • Loc: Madrid, SPAIN

Posted 06 December 2018 - 03:41 PM

What do you mean by "be careful with your ASI cameras"?

 

And yes, I am from Arabia; you are welcome here :)

 

Thank you, Tareq, for your invitation. By "being careful" I meant that you should avoid killing another camera (like your dead ASI120). :) It was a simple joke on the heels of your broken ASI camera. Like... "take care of your living babies" (do not kill them).

 

God bless you! :)

Sam



#29 TareqPhoto

    Skylab

  • Posts: 4265
  • Joined: 20 Feb 2017
  • Loc: Ajman - UAE

Posted 06 December 2018 - 04:23 PM

Thank you, Tareq, for your invitation. By "being careful" I meant that you should avoid killing another camera (like your dead ASI120). :) It was a simple joke on the heels of your broken ASI camera. Like... "take care of your living babies" (do not kill them).

 

God bless you! :)

Sam

Honestly speaking, that accident with my ASI120 was the best thing that could have happened. It forced me to buy another ASI camera, which gave me amazing results, and I won with one of them. I already bought another one but haven't used it at all yet, maybe soon, and I am planning to add a third [fourth if I count the ASI120]. I can't stop :lol: :D

 

My invitation is always open; you are welcome here to do the Moon and planets too. You will enjoy it :waytogo:



#30 MADRID SKY

    Viking 1

  • Posts: 680
  • Joined: 20 Mar 2007
  • Loc: Madrid, SPAIN

Posted 06 December 2018 - 06:55 PM

Honestly speaking, that accident with my ASI120 was the best thing that could have happened. It forced me to buy another ASI camera, which gave me amazing results, and I won with one of them. I already bought another one but haven't used it at all yet, maybe soon, and I am planning to add a third [fourth if I count the ASI120]. I can't stop :lol: :D

 

My invitation is always open; you are welcome here to do the Moon and planets too. You will enjoy it :waytogo:

 

You have the right vision regarding your broken device! Thank you for sharing it. Yes, "bad things" are usually simple life lessons that take us a step forward toward "good things"... and in turn "good things" also carry simple life lessons that want us to grow even more. So both are "good"... IF we learn the lesson. What lesson? That neither "bad things" nor what we call "good things" last forever. They always point or press toward higher realities. If they were eternal, they would not be "lessons to learn"; we would call them by another name altogether. There is only one thing that is eternal, the thing that some of us wait for: an "eternal good thing" called "The Reign of God". Namely, the Christ of God ruling over all, in which there is reconciliation and the standing of a good conscience before our Creator. Only in Him is there power and salvation for the peoples of this shaken earth... you and me included. And He is really at the door...

 

Thank you for your lovely invitation, sir. I take it whole-heartedly. 

 

Take care,

Sam


Edited by MADRID SKY, 06 December 2018 - 06:58 PM.


#31 TareqPhoto

    Skylab

  • Posts: 4265
  • Joined: 20 Feb 2017
  • Loc: Ajman - UAE

Posted 06 December 2018 - 07:13 PM

You have the right vision regarding your broken device! Thank you for sharing it. Yes, "bad things" are usually simple life lessons that take us a step forward toward "good things"... and in turn "good things" also carry simple life lessons that want us to grow even more. So both are "good"... IF we learn the lesson. What lesson? That neither "bad things" nor what we call "good things" last forever. They always point or press toward higher realities. If they were eternal, they would not be "lessons to learn"; we would call them by another name altogether. There is only one thing that is eternal, the thing that some of us wait for: an "eternal good thing" called "The Reign of God". Namely, the Christ of God ruling over all, in which there is reconciliation and the standing of a good conscience before our Creator. Only in Him is there power and salvation for the peoples of this shaken earth... you and me included. And He is really at the door...

 

Thank you for your lovely invitation, sir. I take it whole-heartedly. 

 

Take care,

Sam

I won't bring religious or faith topics in here, but I do believe in something called fate or destiny or whatever you call it. As part of our religion we believe that anything that happens in this life, anything, is written and happens for a reason from GOD, so we don't fear or panic, since nothing has changed and we can't do anything about it; even the free choices we make are also written, whether we will make them good or bad. We even have a saying, which I am not sure how to translate into English: "A bad thing or sign might turn out to be good in the end, after all." So I just keep going whatever happens, good or bad, and whether I learned a lesson from this accident or not, I won't stop at it forever. Some people become very troubled and torture themselves over such mistakes and accidents; I don't, I keep going. It happened to my mount and I went crazy, but I didn't give up; I fixed the mount and it is back to work, maybe even better than before.

 

The Moon is a nice target that I am working on. I just got solar filter film, so now or soon I will start to do the Sun as well with my ASI cameras, but I feel I want to try my DSLR or Sony mirrorless camera instead in case I want a full disc of the Sun.

 

You're welcome, Sam.

 

Take care and clear skies,

Tareq


  • MADRID SKY likes this

#32 Tom Glenn

    Surveyor 1

  • topic starter
  • Posts: 1652
  • Joined: 07 Feb 2018
  • Loc: San Diego, CA

Posted 07 December 2018 - 03:13 AM

Tom: thank you for the trick in AS!3! That's so kind of you. I will try it in the coming months... :) What I don't get is the huge number of frames that you stack. With the Moon I find that counter-productive. You usually stack a lot of frames to get better signal-to-noise on low-contrast targets... but the Moon should be the opposite. You use as few frames as possible to "freeze" the seeing and get a clean result. The more frames in your stack, the more artifacts you get and the worse the result. With a proper capture setup, I don't think it is wise to stack more than 5-10 frames. I really don't understand the point. I always capture at 1 ms with the Moon... and I stack only 2-5 frames!! When you say you stack 50-100, that seems a monstrous number of frames to me... What's the deal?

Sam, I could probably write for pages on this topic, based on personal experience as well as what I have read from others regarding their experience.  In the interest of time, however, I will try to keep this rather short.  Stacking frames accomplishes several things.  First, it increases SNR.  Second, the depth of the stack influences the amount of deconvolution you can subject the image to without it breaking down.  Third, the deeper the stack, the more bit depth it will have.  On the Moon this is important because it allows you to deal with the huge dynamic range issues inherent to capture with large sensors without having too much trouble recovering detail from the shadows.  However, the question of how many frames to stack is not a simple one, because the seeing conditions superimpose limitations on all these benefits of stacking.  You don't want to stack too many poor quality frames.  However, you can often benefit from stacking slightly deeper than you think, because of the noise reduction and the ability to apply stronger deconvolution.  For instance, your example of stacking only 2-5 frames strikes me as a tiny number of frames, while your example of 50-100 frames (which you described as a monstrosity!) is to me a rather low number, but one that is workable.  
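
As an aside (this is not part of Tom's actual workflow), the square-root behaviour behind these SNR gains is easy to see in a minimal numpy sketch that stacks synthetic noisy frames of a low-contrast craterlet:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "true" lunar patch: flat terrain plus one low-contrast craterlet.
true_signal = np.full((64, 64), 100.0)
true_signal[30:34, 30:34] += 5.0

def snr_of_stack(n_frames, read_noise=10.0):
    """Average n noisy frames and report craterlet contrast vs residual noise."""
    frames = true_signal + rng.normal(0.0, read_noise, size=(n_frames, 64, 64))
    stack = frames.mean(axis=0)
    residual_noise = (stack - true_signal).std()
    return 5.0 / residual_noise

for n in (1, 5, 10, 100, 1000):
    print(f"{n:5d} frames stacked -> SNR ~ {snr_of_stack(n):.1f}")
# Noise drops roughly as 1/sqrt(N): 100 frames gives about 10x the SNR of one frame,
# which is why deeper stacks tolerate much stronger deconvolution.
```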

 

One other item from your post that caught my attention is that you said you are capturing Moon images with shutter speeds of 1ms. I don't see the benefit of that unless you are imaging at 1000 frames per second (which nobody is doing on the Moon). There is a lot of misconception about "freezing the turbulence", but the fact is that if you have strong turbulence, you will never be able to freeze it. A 10ms exposure is plenty fast for the Moon, and the only reason to go faster is if your frame rate is over 100 fps. When I used my ASI224MC on the Moon, it was capable of 150fps. In those cases, I used 6ms exposures, because that was the slowest shutter speed that allowed the full 150fps (actually 6.7ms). This way you don't have to use unnecessarily high gain. My ASI183 is only capable of 19fps at full frame. In this case you actually are limited by the movement of the atmosphere, because you don't want a shutter speed as slow as 1/19s, but I frequently image at 15ms shutter speeds, which allows for low to moderate gain. A 1ms exposure would necessitate very high gain and compromise quality to a large degree. You would have to stack even more frames to reduce the noise.
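
For anyone following the arithmetic here: the only reason to shorten the exposure below the frame interval is to chase a frame rate the sensor cannot deliver anyway. A trivial sketch, using the frame rates quoted above:

```python
def exposure_floor_ms(max_fps):
    """Exposure below which the sensor's frame rate, not the shutter, is the limit."""
    return 1000.0 / max_fps

# Frame rates quoted in the post: ASI224MC ~150 fps, ASI183 full frame ~19 fps.
for camera, fps in [("ASI224MC", 150), ("ASI183 full frame", 19)]:
    print(f"{camera}: {fps} fps -> exposures shorter than ~{exposure_floor_ms(fps):.1f} ms "
          "add no extra frames, they only force higher gain")
```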

 

I have created a series of cropped images, taken from data captured under excellent seeing conditions that also formed the basis of my lunar mosaic that I posted about a few days ago.  Keep in mind that the seeing was excellent, and here is the quality score graph from AS!3 (this was also posted earlier in this thread, but I am repeating here for convenience).  The green line represents the 500 frame mark out of 3000 total frames.  I can tell you that there is negligible visible difference between any of the frames in this region.  So from my perspective, there is really not much reason not to stack at least this many frames in this situation, for all of the reasons I outlined above regarding SNR, deconvolution potential, and bit depth.

 

quality_graph_good.jpg

 

 I used PIPP to create a small cropped video around Clavius, and then processed this video and created stacks of 1, 5, 10, 100, and 1000 frames.  I will now present those below, although each will have to be in a separate post due to file size.  I will add a brief comment about each image.  I chose Clavius because it is a popular imaging target that many people are familiar with, and it also has many small craterlets on its floor that can be used to gauge resolution and noise.  I have attempted to process each image in a similar manner, applying a slight amount of deconvolution, which varies depending upon the stack depth.  In all cases you will have to click to open full size.  
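
The select-and-stack idea itself is simple; here is a rough Python sketch of it (a stand-in for what PIPP and AutoStakkert!3 do internally, not their actual algorithms; frame alignment and alignment points are omitted):

```python
import numpy as np
from scipy.ndimage import laplace

def stack_best_frames(frames, n_keep):
    """frames: float array of shape (n_frames, h, w). Average the sharpest n_keep."""
    # Variance of the Laplacian is a simple sharpness proxy, standing in for
    # AutoStakkert's quality score (alignment is assumed to be done already).
    scores = np.array([laplace(f).var() for f in frames])
    best = np.argsort(scores)[::-1][:n_keep]
    return frames[best].mean(axis=0)

# Usage sketch (the loader and filenames are hypothetical):
# frames = load_cropped_video("clavius_crop.ser")
# for n in (1, 5, 10, 100, 1000):
#     save_image(f"Clavius_{n}_frames.tif", stack_best_frames(frames, n))
```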

 

Single frame (unstacked):

 

Clavius_single_frame.jpg

 

The quality of this single frame is actually quite good, but the noise is noticeable and limits the fine detail.  This single frame was essentially unable to accommodate any deconvolution without breaking down.  


Edited by Tom Glenn, 07 December 2018 - 03:35 AM.

  • MADRID SKY and dan777 like this

#33 Tom Glenn

    Surveyor 1

  • topic starter
  • Posts: 1652
  • Joined: 07 Feb 2018
  • Loc: San Diego, CA

Posted 07 December 2018 - 03:15 AM

5 frames stacked: 

 

Clavius_5_frames.jpg

 

This stack of 5 frames is a significant improvement over the single frame, although it is still very noisy. 



#34 Tom Glenn

    Surveyor 1

  • topic starter
  • Posts: 1652
  • Joined: 07 Feb 2018
  • Loc: San Diego, CA

Posted 07 December 2018 - 03:17 AM

10 frames stacked:

 

Clavius_10_frames.jpg

 

This stack of 10 frames is another significant improvement over the 5 frame stack, with slightly more detail visible, although still rather noisy.  



#35 Tom Glenn

    Surveyor 1

  • topic starter
  • Posts: 1652
  • Joined: 07 Feb 2018
  • Loc: San Diego, CA

Posted 07 December 2018 - 03:19 AM

100 frames stacked:

 

Clavius_100_frames.jpg

 

This stack of 100 frames is a significant improvement over the 10 frame stack.  Noise is much reduced, and more details are visible.  


Edited by Tom Glenn, 07 December 2018 - 03:20 AM.


#36 Tom Glenn

    Surveyor 1

  • topic starter
  • Posts: 1652
  • Joined: 07 Feb 2018
  • Loc: San Diego, CA

Posted 07 December 2018 - 03:26 AM

At this point there is not much difference between stacks of more than 100 frames as far as resolution is concerned. I am presenting the stack of 1000 frames below, although the differences are subtle. You would probably have to load them into a stack and flip back and forth to notice the differences. The 1000 frame stack has slightly sharper detail, and is capable of being sharpened even further (more than the 100 frame stack), but in general any further sharpening than this starts to fall into the over-processing category that I don't care for.

 

1000 frames stacked:

 

Clavius_1000_frames.jpg

 

During this lunar phase, Clavius is not in deep shadow, so this demonstration really only applies to SNR, resolution, and deconvolution. If I have more time later, I will do another example with a different subset of this data that is hidden in shadow, to observe how stack size (and the resulting bit depth) affects the ability to stretch the histogram, which is another consideration, especially when using a larger sensor.
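
The bit-depth point is worth a small illustration: averaging many quantized frames produces intermediate values that no single frame contains, which is what makes shadow regions stretchable. A synthetic sketch (not real lunar data):

```python
import numpy as np

rng = np.random.default_rng(1)

# Shadowed terrain whose true brightness varies by only a few counts above black.
true_shadow = np.linspace(2.0, 6.0, 1000)

def stacked_shadow(n_frames, read_noise=2.0):
    """Average n frames that were each quantized to 8-bit integer counts."""
    frames = np.round(true_shadow + rng.normal(0.0, read_noise, (n_frames, 1000)))
    return np.clip(frames, 0, 255).mean(axis=0)

for n in (1, 10, 100, 1000):
    levels = np.unique(stacked_shadow(n)).size
    print(f"{n:5d} frames -> {levels} distinct shadow levels available for stretching")
# A single frame leaves only a few coarse levels near black; deep stacks recover many
# intermediate values, so the histogram can be stretched hard without posterising.
```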


Edited by Tom Glenn, 07 December 2018 - 03:28 AM.

  • Dartguy likes this

#37 Tom Glenn

    Surveyor 1

  • topic starter
  • Posts: 1652
  • Joined: 07 Feb 2018
  • Loc: San Diego, CA

Posted 07 December 2018 - 03:43 AM

I should also add that I did not apply any denoise or Gaussian blur to the above images. A Gaussian blur in Photoshop is a useful tool (I prefer it to denoise), and can be used here to make the very shallow stacks quite visually pleasing despite the original noise. Here is the 10 frame stack with a 0.5 pixel Gaussian blur applied. It is actually quite amazing what can be achieved with only 10 frames stacked, although the fine detail here is noticeably less than in the 100 and 1000 frame stacks.

 

Clavius_10_frames_GaussianBlur.jpg
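
The equivalent operation outside Photoshop is a one-liner; the sketch below uses scipy on a synthetic noisy stand-in for the sharpened stack, treating the 0.5 px radius as the Gaussian sigma (roughly how Photoshop's control behaves):

```python
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(4)
# Stand-in for the deconvolved 10-frame stack: a flat 16-bit patch with residual grain.
sharpened_stack = rng.normal(30000.0, 800.0, size=(512, 512))

# A very slight blur (~0.5 px) tames the grain without visibly softening real detail.
smoothed = gaussian_filter(sharpened_stack, sigma=0.5)

print(f"noise before: {sharpened_stack.std():.0f}  after: {smoothed.std():.0f}")
```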


  • dan777 likes this

#38 MADRID SKY

    Viking 1

  • Posts: 680
  • Joined: 20 Mar 2007
  • Loc: Madrid, SPAIN

Posted 07 December 2018 - 11:21 AM

Tareq: well spoken! :)

 

Tom: Thank you for all these images and your explanation. I knew almost all the material you shared here (through testing), but I thank you anyway for this work and this information. NOW that I have seen one "raw single frame" of yours, I understand my problem with the Moon. As a side note, I am capturing the sensor's full 16-bit dynamic range at around a 1 ms shutter (and zero or very low gain), and this fills the histogram of my ASI174 camera nicely with my scope and setup. My initial idea (with the Moon only, of course) was to stack fewer frames, using only as few of the "best" frames as possible, in order to have the cleanest final picture. I haven't had the chance to test the "large stack" approach (>100). I will have to try when good seeing permits and see if a larger stack really pays off in terms of final resolution. Each setup has its nuances.

Well, back to the initial thought... having a look at your raw image, I now know the reason why I only stacked 3-5 frames in my FIRST lunar capture (the only night I could enjoy so far)... SIMPLE: the seeing was SO BAD (much worse than I thought, as I don't have much experience judging seeing due to a lack of observing nights) that those were the only frames that could be used!! :) There were more "sharp" frames (similar to the ones I stacked), but the seeing was so bad that when I stacked more than those 3-5 sharp frames, "wobbling" made artifacts apparent here and there and the overall S/N was not improved at all... I lost detail and resolution as I put more frames in my stack (I tested several stacks and AP sizes). There was so much "difference" from frame to frame that AS!3 would go crazy in its magic and the final stack would NOT be sharper than, or improve on, those single sharp frames. I still have to wait for conditions that permit the kind of regular workflow you talk about (stacking at least 100 frames with a reasonable improvement over a shallower stack). But first I have to solve my issues.

 

Right now I cannot use the scope because I am solving serious vibration issues... basically I am working on silent blocks (custom-made anti-vibration pads cut from raw Sorbothane sheets) to get stability when my scope is raised on its hydraulic platform. I have a very strange observatory (my scope moves up and down through the ceiling like the paralytic that Jesus healed in a crowded house!). I think that when I solve these issues, things will get easier for me when capturing. Up to now, when a small 1-3 km/h breeze came... the scope wobbled like crazy. I am also changing my computer hard disks... I had a SAS array, but my board doesn't like them, so I have to simplify and go for an SSD array through normal SATA ports... (sorry to tell you my pains and sorrows... ha!). I hope to be back soon. I also recently bought a 1.25" Baader UV filter to add to my 1.25" filter wheel and I want to test it too... boy, those are expensive filters... 200€ each. Surely I hope to contribute some nice pictures in the future. Thank you again for sharing your little secrets... :)


Edited by MADRID SKY, 07 December 2018 - 12:37 PM.


#39 jeffry7

    Vostok 1

  • Posts: 146
  • Joined: 07 Dec 2017

Posted 07 December 2018 - 11:26 PM

Hi Tom,

 

Beautiful pics, and thank you for sharing your expertise with us.

 

You have mentioned several post-processing steps like sharpening, deconvolution, Gaussian blur, and denoise. Can you talk about why and how you use these techniques and what software you use to apply them?

 

Thanks!


  • agmoonsolns likes this

#40 jeffry7

    Vostok 1

  • Posts: 146
  • Joined: 07 Dec 2017

Posted 07 December 2018 - 11:32 PM

Hi Tom,

 

Another question I forgot to ask. Have you considered measuring the FWHM value for some star on a given night to keep a record of conditions for your captures? Does anyone do that, or is it just too OCD?


  • agmoonsolns likes this

#41 agmoonsolns

    Apollo

  • Posts: 1183
  • Joined: 17 Sep 2018
  • Loc: Puget Sound

Posted 07 December 2018 - 11:35 PM

Yes, please! I would love to get more into photographing the moon, but feel overwhelmed because I don't know where to start. Whatever you would care to share would be so very much appreciated. I have been learning so much from this thread, thank you!

 

Hi Tom,

 

Beautiful pics, and thank you for sharing your expertise with us.

 

You have mentioned several post-processing steps like sharpening, deconvolution, Gaussian blur, and denoise. Can you talk about why and how you use these techniques and what software you use to apply them?

 

Thanks!



#42 TareqPhoto

    Skylab

  • Posts: 4265
  • Joined: 20 Feb 2017
  • Loc: Ajman - UAE

Posted 07 December 2018 - 11:56 PM

It is really not that difficult. If you have good seeing, a good scope [with a proper mount of course], and a planetary camera with, say, a good filter, then you have everything. I wasn't happy with my Moon photography in the past no matter what gear I used, but this year it changed; now photographing the Moon from my yard is really a piece of cake, and I am sure the OP also has all the requirements to do so. It is not only the software; that comes later, as long as you have the right equipment first to record videos with enough frames.



#43 Tom Glenn

    Surveyor 1

  • topic starter
  • Posts: 1652
  • Joined: 07 Feb 2018
  • Loc: San Diego, CA

Posted 08 December 2018 - 02:14 AM

Hi Tom,

 

Beautiful pics, and thank you for sharing your expertise with us.

 

You have mentioned several post-processing steps like sharpening, deconvolution, Gaussian blur, and denoise. Can you talk about why and how you use these techniques and what software you use to apply them?

 

Thanks!

Hi Jeffry,

 

Thanks for your comments.  Below, I will expand on a few of my thoughts regarding image processing, with specific attention to the techniques you asked about.  This will be a general discussion about the principles, but when I have time later I will add a few examples of screen shots to describe what I do to an image to make it more practical.  Many of these terms have mathematical definitions, and although you don't actually need to know much about the details in order to use the tools effectively on your images, I find that it is quite interesting and informative to know a bit about the background of the techniques that are used.  The Wikipedia pages for these terms do a pretty good job of summarizing, and in fact you can skip most of the mathematical details and just try to get an overview (usually you can just read the first few sentences and that is sufficient for a basic understanding).  Some terms that relate to image processing, as well as many other areas of data analysis, include the following (all are clickable links):

 

Convolution

Deconvolution

Wavelet

Gaussian function

Normal distribution

Gaussian blur

Image Noise

 

Convolution is a mathematical operation in which one function modifies another function to arrive at a third. Why is this relevant to imaging? Because when light passes through Earth's atmosphere, it undergoes a convolution operation before it reaches your camera sensor. So the data we collect with our cameras has been distorted (convolved) by the atmosphere. If we had some way of knowing what the mathematical operation was that distorted the data, we could undo it. This is deconvolution. In really good seeing conditions, the atmospheric distortion can be approximated with a Gaussian function. This is simply a mathematical function with a certain shape that is often used to describe data that adopts a normal distribution (which is shaped like a bell curve). If you apply the appropriate Gaussian deconvolution to data that has been convolved with a Gaussian function, you can recover the original signal (before atmospheric distortion). The important concept here, though, is that the distortion to the data must be capable of being accurately mathematically modeled, and this is only possible if the seeing conditions are good, because in those cases the convolution was Gaussian. In bad seeing, the convolution is pure chaos and cannot be accurately modeled. In a nutshell, this is the theory behind image processing. As an aside, adaptive optics platforms (which are used by professional observatories) attempt to cancel the atmospheric convolution effects before the light even reaches the camera sensor by using deformable mirrors, but these also only work in good seeing conditions, because the distortions can only be accurately mathematically modeled under good conditions.
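
A one-dimensional toy example makes the point concrete: a Gaussian blur really is a convolution, which is why, in good seeing, the distortion can be modelled and inverted. This sketch is purely illustrative:

```python
import numpy as np
from scipy.ndimage import gaussian_filter1d

# The "scene" is a sharp step (think of a crater rim); the atmosphere smears it.
scene = np.zeros(200)
scene[100:] = 1.0

sigma = 3.0                                   # seeing blur width, in pixels
x = np.arange(-15, 16)
kernel = np.exp(-x**2 / (2 * sigma**2))
kernel /= kernel.sum()                        # explicit Gaussian convolution kernel

blurred_by_convolution = np.convolve(scene, kernel, mode="same")
blurred_by_filter = gaussian_filter1d(scene, sigma)

# Away from the array edges the two are numerically the same operation.
print(np.allclose(blurred_by_convolution[20:-20], blurred_by_filter[20:-20], atol=1e-3))
```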

 

Recently, with my Moon images I have been using AstraImage for deconvolution. Wavelet sharpening is another form of deconvolution. Wavelets are another way of modeling mathematical distortions to a set of data, and programs like Registax (or any other that uses wavelets) will sharpen an image, and are frequently used in the imaging community to try and undo the effects of atmospheric seeing. My experience is that deconvolution and wavelets can yield very similar results, but everyone has preferences on which program to use and what modifications to make. It takes some time to figure out your individual workflow. With lunar images, the modifications are generally very simple. As seen in my previous post above, my image of Clavius looks pretty good even with only one single frame. If you don't have good seeing conditions, you will not be able to gain much from deconvolution. I have been using Lucy-Richardson deconvolution in AstraImage, but wavelet sharpening in Registax works very well, and the only reason I'm not using it on my lunar images now is that it is much slower with my larger files from the ASI183 and it has annoyed me! Most deconvolution programs will require you to enter two variables, one corresponding to the overall strength of the operation, and the other defining the mathematical function. The latter is usually referred to as a "point size" or "blur kernel size" and typically specifies the radius (in pixels) over which you want to apply the deconvolution equation. It is fun to play around with these variables on your images, and you can quickly see how each affects the outcome.
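
For readers without AstraImage or Registax, scikit-image ships a Lucy-Richardson implementation that exposes the same two knobs described above: the assumed blur kernel (PSF) and the strength (number of iterations). A self-contained toy sketch, not the AstraImage workflow itself:

```python
import numpy as np
from scipy.signal import fftconvolve
from skimage import restoration

rng = np.random.default_rng(3)

# Toy scene: dim background, one bright point, one small crater-like patch.
truth = np.full((96, 96), 0.1)
truth[48, 48] = 1.0
truth[20:24, 60:64] = 0.6

# Gaussian PSF; its width plays the role of the "point size" / "blur kernel size".
y, x = np.mgrid[-7:8, -7:8]
psf = np.exp(-(x**2 + y**2) / (2 * 2.0**2))
psf /= psf.sum()

blurred = fftconvolve(truth, psf, mode="same") + rng.normal(0.0, 1e-3, truth.shape)

restored = restoration.richardson_lucy(
    np.clip(blurred, 0, 1),   # image scaled to roughly [0, 1]
    psf,                      # the assumed blur kernel
    num_iter=30,              # more iterations ~ stronger sharpening, until
)                             # noise and ringing start to break through
```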

 

Gaussian blur is actually a form of convolution, in which you are taking your image and applying a Gaussian function to it to create a blur. Why would you want to apply a convolution operation to your data after you went to the trouble of sharpening it through deconvolution? The answer is that oftentimes, any form of sharpening (whether deconvolution, wavelets, or other) leads to an image that doesn't look natural. When first starting out in imaging, it is very common to over-sharpen your images and not realize you are doing so. With the Moon, you can look at images taken by other experienced imagers (or images from orbiting spacecraft) and compare to your own image. Classic signs of over-sharpening include the enlargement of small details beyond their actual size, or an unnatural-looking contrast in the final image. An analogy from planetary imaging is with the planet Saturn, in which classic signs of over-sharpening include a hugely expanded Cassini division that appears much larger and darker than it really is, as well as spurious ring divisions that don't actually exist. A similar thing happens with small rilles and craters on the Moon if over-processed. Edges and contrast get exaggerated. If the amount of over-sharpening is minor, then adding a slight Gaussian blur can smooth out the image and make it look more natural, although if it has been severely over-sharpened, a Gaussian blur will not correct this and you have to go back to the beginning and redo the deconvolution in a less aggressive manner.

 

Denoising in general simply refers to any mechanism for reducing noise in an image.  Gaussian blur is one form, but there are others.  The reason I like Gaussian blur is that it applies a very aesthetic blur.  Other forms of denoising can also be useful, but you have to be careful because they can sometimes lead to a very unnatural looking result.  All denoising algorithms try and determine what is real signal versus noise, and only smooth out the noise, but often the result is not very natural looking, and you would have been better off keeping the original noise.    

 

I use AstraImage or Registax for the sharpening of my lunar images, and then I use Photoshop for the final editing of tonal variation and the application of any functions such as Gaussian blur.  When I have more time later this weekend, I will add another post here to expand on those details a bit.  Tonal variation is very important on the Moon, with common mistakes resulting in an image that has significant regions that have been clipped to pure white, or conversely, is overall extremely dark with unnatural contrast.  As long as the original data was not overexposed, these are relatively easy to fix.  
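
One part of that final check is easy to automate: measuring how much of the finished image sits at pure white or pure black. A small sketch on a synthetic 16-bit array (load your own exported image with whatever I/O library you prefer):

```python
import numpy as np

def clipping_report(img, white_level=65535):
    """Fraction of pixels clipped to pure white or crushed to pure black."""
    clipped_white = np.count_nonzero(img >= white_level) / img.size
    crushed_black = np.count_nonzero(img <= 0) / img.size
    return clipped_white, crushed_black

# Stand-in for a 16-bit image after curves/levels adjustments.
rng = np.random.default_rng(5)
final_image = rng.integers(0, 65536, size=(1024, 1024), dtype=np.uint16)

white_frac, black_frac = clipping_report(final_image)
print(f"clipped to white: {white_frac:.3%}  crushed to black: {black_frac:.3%}")
# More than a tiny fraction at either end suggests the tonal curve (or the
# original exposure) was pushed too far.
```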


  • dan777, aeroman4907, jeffry7 and 1 other like this

#44 agmoonsolns

    Apollo

  • Posts: 1183
  • Joined: 17 Sep 2018
  • Loc: Puget Sound

Posted 08 December 2018 - 02:17 AM

Terrific information, thank you!



#45 Tom Glenn

    Surveyor 1

  • topic starter
  • Posts: 1652
  • Joined: 07 Feb 2018
  • Loc: San Diego, CA

Posted 08 December 2018 - 02:26 AM

Hi Tom,

 

Another question I forgot to ask. Have you considered measuring the FWHM value for some star on a given night to keep a record of conditions for your captures? Does anyone do that, or is it just too OCD?

I have never considered this, because I don't want to spend any extra time doing something that would not be particularly helpful to my imaging. I'm sure some people do this, though. I can get a sense of the seeing just by looking at the diffraction pattern of a star during collimation, as well as looking at the live image of the object I'm targeting. If it doesn't meet a certain subjective standard, I don't image. Measured FWHM values are somewhat difficult to correlate to expected planetary/lunar performance, because the measurement is taken from much longer exposures. Also, the star would have to be at the same altitude, and in the same general sky location as your target, for the data to mean anything. If your stars are very tight in long exposures, this is a sign of good seeing, but the reverse situation (bloated stars) does not necessarily mean that lunar/planetary imaging will be bad. There are many different types of "bad" seeing. If the turbulence is pure chaos, then it is pointless to attempt. But there are situations in which there is moderate turbulence that happens to be fairly laminar, and although the image moves around a bit, you get a few split seconds here and there that are clear. In fact, the images I posted at the very beginning of this thread fall into this category. The seeing was not great, but good enough to get some decent images.
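
For anyone who does want to log seeing this way, the measurement itself is straightforward: fit a Gaussian to a star's brightness profile and convert sigma to FWHM (FWHM = 2*sqrt(2*ln 2) * sigma, about 2.355 sigma). A sketch with a synthetic profile and an assumed image scale:

```python
import numpy as np
from scipy.optimize import curve_fit

def gaussian(x, amp, x0, sigma, offset):
    return amp * np.exp(-(x - x0) ** 2 / (2 * sigma ** 2)) + offset

def fwhm_arcsec(profile, pixel_scale_arcsec):
    """Fit a 1-D Gaussian to a star's brightness profile and return FWHM in arcsec."""
    x = np.arange(profile.size)
    p0 = [profile.max() - profile.min(), float(np.argmax(profile)), 2.0, profile.min()]
    (amp, x0, sigma, offset), _ = curve_fit(gaussian, x, profile, p0=p0)
    return 2.3548 * abs(sigma) * pixel_scale_arcsec

# Synthetic star profile; 0.3"/pixel is an assumed image scale, not a real setup.
star = gaussian(np.arange(41), amp=1000.0, x0=20.0, sigma=4.0, offset=50.0)
print(f"FWHM ~ {fwhm_arcsec(star, 0.3):.2f} arcsec")
```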


Edited by Tom Glenn, 08 December 2018 - 02:46 AM.


