I certainly would! I've tried using Lynkeos on my desktop, but haven't had much luck so I've been using AS!3 and Registax on my laptop. Not that I've had much better luck with that...
Absolutely. I'm using Windows, but will see if I can adapt.
Ok, just remember you guys agreed to it. Just realize that every planetary imager will tell you everything in the workflow is wrong from top to bottom. So be it. I like the images it produces.
The flow starts with good seeing and good altitude on the planet. If it's below 30°, it's going to be quite difficult to get good results. Ironically, zenith isn't a panacea either—the sweet spot seems to be between 40° and 60°. I can only assume this is because of the rapid increase in the volume of air you're shooting through as you descend from 40° toward the horizon, and the fast increase in the tangential velocity of the air mass overhead as you climb from 60° toward zenith. Chromatic dispersion by the atmosphere is also a problem below 40°. The jet stream velocity is a big factor, too—I rarely even set up for planetary if it's above 50 m/s. And I've started setting up well away from my house—the metal roof gives off a tall thermal plume that takes hours to settle after sunset.
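For a feel for how fast the air piles up at low altitude, here's a quick sketch. It uses the plane-parallel approximation (my simplification—it ignores refraction and Earth's curvature, so the real numbers near the horizon are somewhat smaller):

```python
import math

def air_mass(altitude_deg):
    """Plane-parallel approximation: air mass = 1/sin(altitude).
    Reasonable above ~20 deg; real models add curvature/refraction terms."""
    return 1.0 / math.sin(math.radians(altitude_deg))

for alt in (90, 60, 40, 30):
    print(f"{alt:2d} deg: {air_mass(alt):.2f} air masses")
```

By 30° you're already shooting through twice the air you'd have at zenith, and it only gets worse from there.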
I already showed you my setup. With the 12.5mm eyepiece, I estimate I'm shooting at ~f/58. Yes, I meant to write f/58. I actually have no idea what the focal ratio is—I just try to get Jupiter across as many of the 1024×680 pixels of the 5× zoom as I can. And don't go quoting Nyquist on me—we had an intense debate on resolution vs. detection over in the planetary forum and concluded that an aperture can easily detect features to at least 10× Rayleigh—so there's your reference spatial information rate! I'm sampling somewhere around 4.4× Rayleigh, so I'm nowhere even close to the Nyquist criterion for that. And at Ts = 1/30 sec, Jupiter has the SNR for f/58. Don't believe it? Look at the image. Truth be told, I shim the eyepiece with rubber bands, like this:
But this is just to make Jupiter a little smaller so I can keep it on the sensor during poor seeing and gusty ground winds that conspire to drive it off the sensor crop—a frequent occurrence in the outback.
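As an aside, you can sanity-check the 4.4× figure yourself, because the aperture cancels out of the math: the ratio of the sensor's Nyquist frequency to the Rayleigh cutoff depends only on wavelength, focal ratio, and pixel pitch. A rough sketch (λ ≈ 540 nm and the 600D's ~4.3 µm pixel pitch are my assumed values):

```python
def nyquist_to_rayleigh(focal_ratio, pixel_pitch_m, wavelength_m=540e-9):
    """Ratio of sensor Nyquist frequency to the Rayleigh cutoff.
    Rayleigh angle = 1.22*lambda/D; pixel angle = p/(N*D), so the
    aperture D cancels out: ratio = 1.22*lambda*N/(2*p)."""
    return 1.22 * wavelength_m * focal_ratio / (2 * pixel_pitch_m)

print(f"{nyquist_to_rayleigh(58, 4.3e-6):.1f}x Rayleigh")  # ~4.4
```

Note the scope's aperture never enters it—only the focal ratio and the pixels.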
The capture and processing workflow is all simple after that:
I included the version numbers for the software I use. AstroDSLR and Lynkeos are deliberately older versions, kept simply for their speed (their subsequent upgrades were catastrophic).
Capture in AstroDSLR is fairly straightforward:
(Click for full size.)
Note that I'm using AWB vice Daylight. This is because we're getting an 8-bit JPEG off the LiveView and not a 14-bit RAW. When I capture with Daylight, the blues and reds end up quite compressed in the stack. Planetary targets are in full sun, and AWB makes better (or at least more complete) use of all three histograms. I use the ISO12800 setting because it gives me the greatest control over the actual gain of the sensor. Yes, we're really only controlling ISO—the capture settings are simulated because the actual Ts (at least for LiveView off the 600D/T3i) is fixed at 1/30 sec. For the optimal gain, I just adjust the Ts setting until the histogram max is just under 50,000. For this capture, Ts was 1/200 sec. By simple math, that works out to a gain of ~ISO1920; with the eyepiece unshimmed, I'm usually at 1/160 sec, or ISO2400. Make sure you click the Zoom button to get the 5× zoom. And click the preview downscaling button—with upscaling, capture peaks at ~8 fps; downscaling takes it up to ~9.5. With all that set, just do a 200 sec recording. Since I display the time with seconds in my menu bar, I just wait 3 minutes and 20 seconds after clicking the record button, then click "Save".
Once you've got the .mp4 saved, fire up Lynkeos 2.10. Drop the .mp4 on the window, and it will parse out all the frames. A curious note about Lynkeos: it will only load key frames. AstroDSLR seems to store every frame as a key frame, but the in-camera video does not. So if I want to use all the frames from an in-camera video at 30 fps, I actually have to rip the .mov into frames with a ripper. This approach is much simpler, and I get much better results. So here's the .mp4 loaded and parsed:
For alignment with Lynkeos, I find the alignment frame size needs to be about 50% bigger than the target's largest extent. It can be bigger—up to 300 pixels for a 5-pixel star—but it seems to be most accurate at 50%. In this case, that was 700 pixels. If your Jupiter drifted outside of that, start with a bigger box for a first pass, and then do it again with a 150% box.
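If you'd rather not eyeball the 50%, the rule reduces to a one-liner (the 467 px extent here is back-solved from my 700 px box, not something I measured):

```python
def alignment_box(target_extent_px):
    """Lynkeos alignment box ~50% larger than the target's largest extent."""
    return int(target_extent_px * 1.5)

print(alignment_box(467))  # 700
```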
A note about speed. It can be mesmerizing to watch the images fly by as they update during alignment. But if you want speed—as in the full stack aligned in seconds—turn this feature off in the preferences:
Once aligned, we move on to downselection based on quality.
I haven't figured out where the default settings come from, but the low setting of 0.08 doesn't correspond to anything I've ever shot. Just click the preview check box and step the low frequency down until you find where you get the most interpixel detail in the Fourier transform view of the image. For mine, this was at 0.01. Then put the sampling box on the greatest extent you can fit on the disk—in this case, 300 pixels. Note that if you scroll down through the images, the Fourier image is out of alignment. This is because Lynkeos does the Fourier transform on the original frame, not the aligned one. So just make sure you're using your "Reference" alignment image (typically the first one, unless you selected a different one during alignment). Unclick the "Preview" check box and click "Analyze". Seconds later, you'll have useful quality metrics:
Once you've got your metrics, you just find the cutoff value that gives you ~1024 frames. Another amusing oddity about Lynkeos: it can't stack exactly 1024 frames. It will tell you it can, but when you get to stacking, it just won't finish. Like a soft divide-by-zero somewhere. So I target the cutoff that gives me the lowest frame count above 1024, which in this case was 1030. I promise you, six frames won't make a difference in the stack. You can use the slider, but I just do this manually: I step the value by 10, then by 5, then by 1, then by 0.1, and then by 0.01 until I find it. Once you've downselected, you're ready to stack:
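For the curious, the manual hunt reduces to a one-liner if you had the quality values in a list: the largest cutoff that passes at least 1025 frames is just the quality of the 1025th-best frame (1025 because of the 1024-frame choke). A sketch with made-up quality values:

```python
def pick_cutoff(qualities, min_frames=1025):
    """Largest cutoff that still passes min_frames frames:
    the quality value of the min_frames-th best frame."""
    return sorted(qualities, reverse=True)[min_frames - 1]

qualities = [0.01 + 0.0005 * i for i in range(1900)]  # fake metrics
cutoff = pick_cutoff(qualities)
print(cutoff, sum(q >= cutoff for q in qualities))
```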
Stacking is as easy as it sounds. But if you have several images that you want to batch process (like I did the ones in the first post), you probably want to use a common size for all of them. In this case, 720×600 framed it nicely.
If you want to save your work, now is the time to do so. Once you stack the image, Lynkeos will include the stacked frame in the project file if you save it. Oddly enough, you can't actually get back to the stacked image—indeed, the stack is lost the moment you go to another pane, so you need to restack anyway. So if you want to save several MB of space in the settings file, this is the point to save it. (It makes a much bigger difference for RAW stacks of the deep sky.)
Click Stack, and seconds later you'll have a stacked image:
Save this as a 16-bit TIFF, and we're ready for the third step in the workflow…
Assuming there's still interest after all that…
Edited by BQ Octantis, 26 August 2020 - 04:13 AM.