I've been getting a lot more mileage out of my Jupiter captures through my 7-in Mak-Cass this year compared to prior years. I now have a standard capture workflow that gives me good results (with the latest piece of the puzzle being capture with auto white balance, of all things!), plus a very efficient processing workflow from last year (which I affectionately call "filmstrip" processing) that allows for bulk processing of my stacks.
The capture workflow is simple: 5× zoomed LiveView capture at 8-9 fps (with AstroDSLR 1.3). For Jupiter, I magnify the disk until it takes up as much of the sensor as possible while I can still keep it in the FOV by chasing it with the mount hand controller. I can now do this from my grass lawn with concrete blocks under the tripod feet; they let the system damp out the motion from wind gusts far better than the feet resting in grass or dirt. I do 200-second LiveView captures, output as .mp4 files. A 200-second capture creates a wee bit of elongation on Io in a stack aligned on Jupiter, but otherwise allows enough frames to meaningfully downselect to good-seeing frames in my stacker (Lynkeos 2.10). I use the same alignment and quality-assessment parameters for all the stacks and downselect to the best >1024 frames (strangely, Lynkeos hangs with exactly 1024 frames), which gives me decent results without derotation at 50-70% scaling in moderate-to-good seeing.
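As a back-of-the-envelope check on those capture lengths, here's a quick calculation of how many frames a 200 s run yields and how far Io drifts during it. The orbital and distance figures are my own assumed values (Io's orbital radius and period, Jupiter near opposition), not numbers from the capture software:

```python
import math

FPS = 8.5                  # mid-range of the 8-9 fps LiveView capture rate
CAPTURE_S = 200.0          # capture length used in the workflow above

frames = FPS * CAPTURE_S   # frames available per capture

# Assumed values for Io's orbital motion:
IO_ORBIT_KM = 421_700.0          # Io's orbital radius
IO_PERIOD_S = 1.769 * 86400      # Io's orbital period (~1.77 days)
JUPITER_DIST_KM = 4.2 * 1.496e8  # Earth-Jupiter distance near opposition

io_speed_kms = 2 * math.pi * IO_ORBIT_KM / IO_PERIOD_S   # ~17.3 km/s
drift_km = io_speed_kms * CAPTURE_S
drift_arcsec = drift_km / JUPITER_DIST_KM * 206_265      # small-angle approx

print(f"{frames:.0f} frames, Io drift up to {drift_arcsec:.1f} arcsec")
```

With Io's apparent diameter only a bit over an arcsecond, a drift on that order is why the 200 s stacks show a wee bit of elongation on the moon while Jupiter's disk itself is unaffected.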
Once I've stacked, in Photoshop I just use File → Scripts → Load Files into Stack… to import all the stacks and arrange them into a "filmstrip". Though not strictly necessary, it's quite helpful to have the same stack size across all stacks for parsing after processing. I usually do, but my shot from 6 July had Io moving away from Jupiter, so my crops for the stacks kept growing; that just requires a little extra trimming later. Over a 30-minute window (through my aperture), the channel offset (from atmospheric chromatic dispersion) is typically the same for all the stacks, and the sharpening parameters also don't change all that much. This is what allows for bulk processing. Before applying deconvolution and wavelets, I apply Smart Sharpen per channel in Photoshop, with the least sharpening at the smallest pixel size that brings the noise in the polar region to the same level for all channels. I then apply deconvolution and wavelets in Lynkeos on the collective (with a 16-bit sRGB TIFF as the exchange format between Photoshop and Lynkeos). Here's the output for parsing from that workflow at full scale in the 'bin:
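I do the filmstrip assembly in Photoshop, but the idea itself is simple enough to sketch in a few lines of Python with Pillow. The function name is mine, and the placeholder images here stand in for the real TIFF stacks exported from Lynkeos:

```python
from PIL import Image

def build_filmstrip(stacks):
    """Paste equally sized stacks side by side into one wide image."""
    w = max(s.width for s in stacks)
    h = max(s.height for s in stacks)
    strip = Image.new("RGB", (w * len(stacks), h), "black")
    for i, s in enumerate(stacks):
        strip.paste(s, (i * w, 0))  # one column per stack, left to right
    return strip

# Placeholder "stacks" standing in for the real TIFFs:
stacks = [Image.new("RGB", (300, 200), (i * 40, 20, 60)) for i in range(4)]
strip = build_filmstrip(stacks)
print(strip.size)  # (1200, 200)
```

Keeping every stack the same size, as noted above, is what makes the columns line up cleanly for parsing afterward.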
From there, it's just select, copy, and paste into a new document (with a black background) as layers in sequential order. I align the planetary disks, rotate to horizontal (with the equatorial belts as my guides), and use the animation tool to create an animation from the frames. Once I've got the animation I want, output to an animated 8-bit GIF is straightforward from Photoshop (File → Save for Web and Devices…). Output to a 24-bit animated PNG, however, requires a few more steps. I make frames from the layers (Animation window → dropdown menu → Make Frames From Layers) and then export those frames to 24-bit PNG files (File → Scripts → Export Layers to Files…). I have to reverse the layer order first (Layer → Arrange → Reverse) because Photoshop outputs from the top layer down, while my animations usually run from the bottom layer up. The PNG files are then just the input to my APNG stacker (Animatrice):
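Animatrice is what I use for the APNG step, but for anyone without it, recent Pillow (7.0 or later) can also write animated 24-bit PNGs directly. This is a minimal sketch with generated placeholder frames standing in for the exported per-layer PNG files; the frame timing is an assumption, not my actual animation settings:

```python
import io
from PIL import Image

# Placeholder frames standing in for the exported 24-bit PNG files:
frames = [Image.new("RGB", (200, 200), (0, 0, 40 * i)) for i in range(6)]

buf = io.BytesIO()
frames[0].save(
    buf, format="PNG",
    save_all=True,            # write all frames -> animated PNG (APNG)
    append_images=frames[1:],
    duration=125,             # ms per frame (assumed playback rate)
    loop=0,                   # loop forever
)

buf.seek(0)
apng = Image.open(buf)
print(apng.n_frames)  # 6
```

In practice you'd replace the placeholder list with `Image.open(...)` calls over the exported PNG files, in the already-reversed frame order.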
An astute observer will recognize that planetary rotation offers an opportunity to build synthetic three-dimensional images for a stereogram. This is particularly useful for pulling a transiting moon up above the disk to resolve it in three dimensions!
(Click for cross-eye stereo pair @ 100%)
And if you're feeling really adventurous, you can even create an animated stereogram:
Hopefully, you found this instructive.