I've become more critical about focus over the past couple of years. I'm wondering if there is a formula lurking out there that relates the focal length of a telescope, the pixel size on the chip, the guiding error (RMS), and the seeing to give a theoretical limit for FWHM or HFD. I've found things online that give me some of the pieces, but I keep wondering what to expect as a theoretical minimum FWHM (in arc seconds) from a particular imaging system.
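For what it's worth, here is a rough sketch (in Python, just to pin down the arithmetic) of the kind of relation I keep seeing suggested: the blur sources added in quadrature, with the guiding RMS converted to an FWHM-like number. The way the pixel term enters, and the idea that everything combines as independent Gaussians, are my assumptions, which is partly why I'm asking.

```python
import math

def image_scale_arcsec(pixel_um, focal_mm):
    """Plate scale in arcsec/pixel: 206.265 * pixel size (um) / focal length (mm)."""
    return 206.265 * pixel_um / focal_mm

def estimated_fwhm(seeing_fwhm, guide_rms, scale, diffraction_fwhm=0.0):
    """Rough quadrature estimate of delivered FWHM in arcsec.

    Assumes the blur sources are independent and roughly Gaussian,
    converts guiding RMS to FWHM with the ~2.355 factor, and treats
    one pixel of sampling blur as one more Gaussian term -- all of
    which are approximations on my part, not exact optics.
    """
    guide_fwhm = 2.355 * guide_rms
    return math.sqrt(seeing_fwhm**2 + guide_fwhm**2 + diffraction_fwhm**2 + scale**2)
```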
I originally thought that as long as my guiding was well under the seeing, that error term would be irrelevant. I also thought that the Nyquist sampling criterion meant the minimum FWHM I could expect was around 3 pixels no matter what the scope was doing. And finally, I thought that meant that as long as 3 times the image scale was a lot more than the seeing, my FWHM would be determined by the image scale and nothing else.
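To make that check concrete, this is the comparison I've been doing in my head, written out with a placeholder seeing value (I don't have a hard number for my site) and my current image scale (0.31"/pixel, more on that below):

```python
# Placeholder values -- the seeing figure is a guess, not a measurement.
seeing = 2.0                       # arcsec FWHM, assumed typical for my site
image_scale = 0.31                 # arcsec/pixel with my current setup

sampling_floor = 3 * image_scale   # the "~3 pixel" minimum expressed in arcsec
print(f"3 x scale = {sampling_floor:.2f} arcsec vs seeing = {seeing:.1f} arcsec")
# If that floor is well under the seeing, I expected the seeing to set the FWHM.
```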
Now I'm not sure at all that I understand how these things fit together, so if anyone has some insights, I'd love to hear them. I just put a new camera (ASI1600) on my PW 12.5 and immediately got (apparently) around 1.5 arc seconds for a star in focus. The old camera (STF8300), with larger pixels, was giving around 2.2 arc seconds with short focused exposures. This is confusing to me because the image scale in both cases is really small, 0.31"/pixel for the new camera and 0.44"/pixel for the old camera, and both (times 3) are well under the seeing where the scope is located.
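Here's the kind of comparison I've been attempting, using the quadrature sketch above. The seeing and guiding numbers are placeholders I made up; the pixel sizes are the nominal 3.8 µm (ASI1600) and 5.4 µm (STF8300), and I've used roughly 2541 mm for the PW 12.5 focal length, which reproduces the 0.31 and 0.44 scales.

```python
# Placeholder seeing and guiding values -- I don't have firm measurements for either.
seeing = 2.0      # arcsec FWHM (assumed)
guide_rms = 0.5   # arcsec RMS (assumed)

for name, pixel_um in [("ASI1600", 3.8), ("STF8300", 5.4)]:
    scale = image_scale_arcsec(pixel_um, 2541)      # ~0.31 and ~0.44 arcsec/pixel
    fwhm = estimated_fwhm(seeing, guide_rms, scale)
    print(f'{name}: scale {scale:.2f}"  predicted FWHM {fwhm:.2f}"')
# On paper the sampling term barely moves the total, which is exactly why the
# measured 1.5" vs 2.2" difference between the two cameras puzzles me.
```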