So, while clouds floated by last night, I decided to see whether the tried-and-true SNR calculations actually held true for my camera under more real-world conditions. I found an interesting discrepancy that I hope someone can help me with.
I took an old 30 s exposure and brought it up in PixInsight. It was a 30 s shot of an open cluster, so there was really no faint signal to speak of that would have a low SNR; I figured I would just calculate the noise term of the background itself, treating the skyglow as the 'signal', which would let me fill in the denominator of the SNR formula. From what I can determine from translating the formulas into common sense, the noise term in an image should be the standard deviation of the image (or of a sample of it). I took a sample of the image that was devoid of stars and contained about 3000 pixels, ran it through PixInsight's Statistics process, and it gave me a std. dev. of about 80. However, when I calculated the expected noise in the image via sqrt(skyglow + RN^2 + dark current), it came out to only about 50.
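In case it makes the comparison clearer, here is roughly what I mean in Python/NumPy form. Every number in it (sky level, read noise, dark current, the fake background patch) is a made-up placeholder, not my actual camera data:

```python
# Minimal sketch of the measured-vs-predicted noise comparison described above.
# All numeric values are placeholders for illustration only.
import numpy as np

# --- measured side: std. dev. of a star-free background sample (in ADU) ---
rng = np.random.default_rng(0)
background_sample = rng.normal(loc=500.0, scale=80.0, size=3000)  # fake 3000-pixel patch
measured_noise_adu = np.std(background_sample)

# --- predicted side: sqrt(skyglow + RN^2 + dark current), everything in ADU ---
skyglow_adu = 500.0       # mean background level attributed to skyglow
read_noise_adu = 10.0     # read noise, already converted to ADU
dark_current_adu = 5.0    # dark signal accumulated over the 30 s exposure
predicted_noise_adu = np.sqrt(skyglow_adu + read_noise_adu**2 + dark_current_adu)

print(f"measured  std. dev.: {measured_noise_adu:.1f} ADU")
print(f"predicted noise    : {predicted_noise_adu:.1f} ADU")
```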
I converted everything to ADU, so I don't think it's an electron-to-ADU conversion problem. I figure that either I've missed a noise term, or the std. dev. of the image is not the correct way to determine the noise of an image. I know that a number of people on this forum like to delve into this kind of stuff, so I figured I would throw out what I've got and see if anyone has any ideas. Also, if you calculate the SNR of an image differently, I would love to hear how you do it.
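To spell out what I mean by converting everything to ADU, here's the same sort of calculation carried out in electrons first and then converted back, with a purely hypothetical gain; again, none of these numbers are from my camera:

```python
# Same comparison, but worked out in electrons and then converted back to ADU.
# The gain and all other values below are hypothetical placeholders.
import numpy as np

gain = 1.5              # e-/ADU (hypothetical)
skyglow_adu = 500.0     # mean background in ADU (placeholder)
read_noise_e = 15.0     # read noise in electrons (placeholder)
dark_current_e = 7.5    # dark signal in electrons over 30 s (placeholder)

# work the noise terms out in electrons, where shot noise = sqrt(signal)
skyglow_e = skyglow_adu * gain
noise_e = np.sqrt(skyglow_e + read_noise_e**2 + dark_current_e)

# convert the combined noise back to ADU for comparison with the image statistics
noise_adu = noise_e / gain
print(f"predicted noise: {noise_e:.1f} e-  =  {noise_adu:.1f} ADU")
```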
Clear skies to all,
-John