
What's the future for planetary cameras?

15 replies to this topic

#1 dhammy

dhammy

    Apollo

  • -----
  • topic starter
  • Posts: 1,024
  • Joined: 20 Jul 2015
  • Loc: Puerto Rico

Posted 27 September 2020 - 12:07 PM

Thought I'd throw this out there for some Sunday fun.  

When I first got started in planetary imaging in the early 2000s I started off with the ToUcam Pro (I still have it for my imaging museum) and we've come a long way since then! I still remember being impressed with 60fps - that used to be high speed. Are we also going to look back and think - wow, I remember when we thought capture rates in the hundreds were fast?

 

So what changes do you think we'll see and what will it mean for the software we use or the way we capture images? 


  • Kiwi Paul likes this

#2 sg6

sg6

    Cosmos

  • *****
  • Posts: 9,019
  • Joined: 14 Feb 2010
  • Loc: Norfolk, UK.

Posted 27 September 2020 - 12:59 PM

I wonder if a "better" planetary camera is just considered "better" because it has a faster frame rate, not because the frames it produces are actually any better and give improved results.

 

What makes a 10" reflector better then an 8" reflector - the answer is almost 100% because it is bigger. Oddly never because the mirror is better quality. The quality aspect is never a point of consideration and very rarely mentioned. Have a read - every 8" mirror is "exactly" the same in the posts. Have to ask why do Zambuto bother? The answer is - Look through one and see the quality difference. Hardly a post says that.

 

Is "What makes a better planetary camera?" going the same way - it is faster, not that it is better. Just my camera is faster then your camera.

 

So much in astronomy seems to be mine is bigger/longer/faster than yours.

Suppose that is one way the manufacturers convince astronomers to buy the next thing they offer at a slightly higher price, however. 120fps is faster than 60fps, so buy our new, more expensive (probably unnecessary) camera. And they do.


  • Kiwi Paul likes this

#3 Kiwi Paul

Kiwi Paul

    Mariner 2

  • -----
  • Posts: 281
  • Joined: 13 Jul 2020
  • Loc: Carterton, New Zealand

Posted 27 September 2020 - 02:09 PM

I suppose in the case of telescopes, there is an assumption that the optics are of a certain quality, and so beyond that, aperture is the big variable.

In the case of cameras, (let’s just think it through as far as we can) if we are able to use ones with thousands of images per second (and I’m pretty sure they exist - currently for slow motion photography) what would that result in? Obviously a higher number of the sharpest frames for stacking, which in itself is desirable, but the rate of distortion of the planetary image itself has some limit, so the improvement of the final image will have a limit too. I would think it would converge on the best of the planetary images available during the capture, given the nature of the seeing, and of course in the end the quality of the image would still be limited by the seeing - as it is currently. No doubt the final image would be somewhat better, but there should be a limit.
Just thinking out loud...!
Cheers Paul
  • dhammy likes this

#4 martinl

martinl

    Ranger 4

  • -----
  • Posts: 356
  • Joined: 20 Oct 2006
  • Loc: Sweden

Posted 27 September 2020 - 02:29 PM

Planetary cameras are remarkably mature technologically. QE values of the latest models, like the ASI462, peak around 90% and probably average 70-80% over the visible spectrum. In terms of photon signal to noise ratio they thus already operate within 80-90% or so of the theoretical maximum. 
 

You could probably improve camera noise somewhat, and also extend the sensitivity at longer and shorter wavelengths, but for visual wavelengths I don’t think we’re going to see any massive improvements unless a completely different approach to sensor design emerges.

 

One area where cameras could still improve is data transfer rates, to allow high frame rates at larger ROIs, but this would soon require us to upgrade from USB 3 to something faster.
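To put rough numbers on it (a quick sketch - the throughput figure and frame sizes below are assumptions, not measured values):

    # Rough ceiling the transfer link puts on frame rate (illustrative
    # assumptions, not measured values).
    USB3_THROUGHPUT = 400e6          # bytes/s, practical USB 3.0 figure (assumed)

    def max_fps(width, height, bytes_per_pixel=1):
        """Frame rate limit imposed by the link alone, ignoring sensor readout."""
        return USB3_THROUGHPUT / (width * height * bytes_per_pixel)

    print(max_fps(1936, 1096))       # full frame, 8-bit: roughly 190 fps
    print(max_fps(640, 480))         # small planetary ROI: well over 1000 fps

So at small ROIs the link isn't the bottleneck, but running the full sensor at hundreds of fps would quickly saturate USB 3.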


Edited by martinl, 27 September 2020 - 02:30 PM.

  • Kiwi Paul likes this

#5 Sunspot

Sunspot

    Cosmos

  • *****
  • Posts: 9,648
  • Joined: 15 Mar 2005
  • Loc: Surprise, AZ

Posted 27 September 2020 - 04:58 PM

For me I'd like to see a camera with a high and equal spectral response from about say, 200nm to 1000nm. Set one exposure and gain for all filters.


  • dhammy likes this

#6 KpS

KpS

    Mariner 2

  • -----
  • Posts: 253
  • Joined: 07 Nov 2009
  • Loc: Prague, Czechia

Posted 27 September 2020 - 06:14 PM

It sounds unbelievable, but maybe we will see cameras with quantum efficiencies greater than 100%.


  • Sunspot and dhammy like this

#7 dhammy

dhammy

    Apollo

  • -----
  • topic starter
  • Posts: 1,024
  • Joined: 20 Jul 2015
  • Loc: Puerto Rico

Posted 27 September 2020 - 09:22 PM

For me I'd like to see a camera with a high and equal spectral response from about say, 200nm to 1000nm. Set one exposure and gain for all filters.

That would certainly be useful! 

 

 

In the case of cameras, (let’s just think it through as far as we can) if we are able to use ones with thousands of images per second (and I’m pretty sure they exist - currently for slow motion photography) what would that result in? Obviously a higher number of the sharpest frames for stacking, which in itself is desirable, but the rate of distortion of the planetary image itself has some limit, so the improvement of the final image will have a limit too.

Yeah that's interesting. Say I currently capture 18k frames in a 2 minute Jupiter recording and stack 9k (1 minute recording) to give the final image. If the capture rate of a future camera was 1000fps for Jupiter (everything else being equal) then I would need only 9 seconds of good seeing in 2 minutes to get the same result. Better chances for better quality?
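Just to show the arithmetic behind that (numbers taken from above):

    # The arithmetic behind the 9-second figure (numbers from the example above).
    current_fps   = 18_000 / 120     # 18k frames over 2 minutes = 150 fps
    frames_needed = 9_000            # frames actually stacked (about 1 minute's worth)

    future_fps = 1_000
    print(current_fps)                  # 150.0 fps today
    print(frames_needed / future_fps)   # 9.0 -> only ~9 s of good seeing needed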



#8 Kiwi Paul

Kiwi Paul

    Mariner 2

  • -----
  • Posts: 281
  • Joined: 13 Jul 2020
  • Loc: Carterton, New Zealand

Posted 27 September 2020 - 11:34 PM

Yes, that seems reasonable. But I guess you would still have to wait for the intervals of good seeing. Of course the size of the data files would be getting pretty big. One of the things I was driving at, though, was that I am supposing there would be some base level of clarity of the image, determined by the kind of degradation the seeing produces, and so this would ultimately impose a limit on the detail obtainable?
Cheers Paul

#9 BrettD

BrettD

    Explorer 1

  • -----
  • Posts: 57
  • Joined: 14 Aug 2020
  • Loc: Melbourne, Australia

Posted 28 September 2020 - 01:37 AM

Yeah that's interesting. Say I currently capture 18k frames in a 2 minute Jupiter recording and stack 9k (1 minute recording) to give the final image. If the capture rate of a future camera was 1000fps for Jupiter (everything else being equal) then I would need only 9 seconds of good seeing in 2 minutes to get the same result. Better chances for better quality?

Interesting .. depends on what you mean by "everything else being equal" ... the noise level of the new camera would have to be not equal!

 

There are only so many photons falling from the sky to be caught by our cameras.  As the exposure time decreases, at some point there won't be enough photons in each bucket to tell light from dark - even with a theoretical zero noise camera.  I think 1ms is still enough time ... if the camera was low noise enough.

 

Last night I took 2 Saturn captures back to back (both 5 mins):

1 at 46ms

1 at 7ms 

As expected, I had to stack 10,000 frames from the 7ms capture to get about the same noise level as a 1,500-frame stack from the 46ms capture.
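A minimal sketch of why the two stacks land so close together, and where a lower-noise camera would start to matter - the photon rate and read noise below are placeholder guesses, not measurements from my camera:

    # Why the two stacks come out close: total integrated exposure is nearly the
    # same (~69-70 s), so the shot noise is too. Short exposures pay a small
    # extra price because read noise is added to every frame.
    import math

    PHOTONS_PER_MS = 5.0             # assumed mean photon rate per pixel

    def stack_snr(exposure_ms, n_frames, read_noise_e):
        signal = PHOTONS_PER_MS * exposure_ms * n_frames          # total photons
        noise  = math.sqrt(signal + n_frames * read_noise_e**2)   # shot + read noise
        return signal / noise

    print(stack_snr(46, 1_500, 2.0))    # ~69 s total -> SNR ~582
    print(stack_snr(7, 10_000, 2.0))    # ~70 s total -> SNR ~560, slightly worse
                                        # because read noise hits every frame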

 

Brett


  • KpS, Herr Mario and Kiwi Paul like this

#10 BrettD

BrettD

    Explorer 1

  • -----
  • Posts: 57
  • Joined: 14 Aug 2020
  • Loc: Melbourne, Australia

Posted 28 September 2020 - 01:47 AM

Here is the comparison from my experiment.

 

Conclusion ... if it wasn't for the atmosphere ... this whole thing would be sooo much easier

 

[Attached image: fps_comparison.png]

 


  • KpS, Herr Mario and Kiwi Paul like this

#11 Bart Declercq

Bart Declercq

    Viking 1

  • *****
  • Posts: 953
  • Joined: 21 Jan 2007
  • Loc: Haaltert, Belgium

Posted 28 September 2020 - 01:55 AM

It sounds unbelievable, but maybe we will see cameras with quantum efficiencies greater than 100%.

That won't help all that much for planetary observing - the main limit we've got now is shot noise, which is actually independent of which camera is used. This is also why the notion of 1000fps cameras is a little misleading: except for the very brightest of subjects (the Sun, Venus perhaps) there's just not enough light to get images with low enough noise to be useful at those framerates - there'd be so much noise it would be impossible to identify the sharp frames. Practical limits seem to hover around 200-400fps for Mars and Mercury, 200fps for Jupiter and perhaps Saturn, and significantly lower than that for Uranus/Neptune.

 

But when you get into high-resolution deepsky work using lucky imaging, there's still a lot of scope for improvement, because there read noise starts to dominate the stack. Currently best-case read noise is around 0.7-0.8e- (which is a *huge* jump from 10 years ago) - once we get that down to 0.1e- or lower, we're essentially counting the photons. Having QE above 100% would allow you to get away with a little more read noise.

 

Once you're accurately counting photons, there's effectively no difference in signal-to-noise ratio between 1000x1s and 1x1000s exposures. So as long as you've got something in the FOV bright enough to align your frames on and estimate their sharpness, you'll get a sharper result with similar noise levels using many short exposures. Of course, since deepsky cameras tend to be multimegapixel affairs rather than the tiny ROIs we record for planets, the data volume to process quickly becomes overwhelming - but live-stacking with instant sharpness estimation could help there: say, live-stacking 1000 1s frames but only keeping the frames above a certain sharpness threshold. You'd probably still want to acquire subframes for later stacking in software, but each subframe would consist of x actual frames live-aligned and live-stacked, and autoguiding would be done on the actual image rather than with a separate guider. Of course total acquisition time would be longer than the final exposure time - you might take 2000 seconds to reach the 1000 stacked seconds threshold - but if sharpness is significantly improved by doing so, you'd get cleaner, deeper images with less exposure and less need for post-processing.

 

These techniques are already being used, but the read noise greatly limits their effectiveness.
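A minimal sketch of that read-noise penalty, comparing 1x1000s against 1000x1s - the signal rate is an assumed placeholder and sky background is ignored:

    # Read noise is what separates one long exposure from many short ones on a
    # faint deep-sky target; at ~0.1 e- the penalty nearly vanishes.
    import math

    SIGNAL_E_PER_S = 0.5             # assumed electrons/pixel/second (faint target)

    def snr(total_seconds, n_subs, read_noise_e):
        signal = SIGNAL_E_PER_S * total_seconds
        return signal / math.sqrt(signal + n_subs * read_noise_e**2)

    for rn in (3.0, 0.8, 0.1):
        print(rn, snr(1000, 1, rn), snr(1000, 1000, rn))
    # at 3.0 e- the 1000-sub stack loses badly; at 0.1 e- the two are nearly equal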


  • KpS and Kiwi Paul like this

#12 Tulloch

Tulloch

    Surveyor 1

  • *****
  • Posts: 1,966
  • Joined: 02 Mar 2019
  • Loc: Melbourne, Australia

Posted 28 September 2020 - 05:59 PM

And, a bit of noise is actually a good thing.

 

http://www.stark-lab...pthStacking.pdf

 

(Just scroll down to the appendix if you are not too interested in the maths :)).

 

Andrew


Edited by Tulloch, 28 September 2020 - 06:54 PM.


#13 BrettD

BrettD

    Explorer 1

  • -----
  • Posts: 57
  • Joined: 14 Aug 2020
  • Loc: Melbourne, Australia

Posted 28 September 2020 - 08:17 PM

And, a bit of noise is actually a good thing.

Not sure I agree.

There are numerous sources of "noise" and I don't think any noise introduced by the camera is ever a good thing.

I think the beneficial component the paper is calling "noise" is what I would call "randomness in the signal".

If a theoretical zero noise camera (ie perfect photon counting) captured this ... then everything in that paper would apply.
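A toy example of what I mean about randomness before quantisation helping the stack (made-up numbers, nothing camera-specific):

    # A brightness sitting between ADU levels only averages out correctly if
    # there is some randomness on top of it before quantisation.
    import random

    TRUE_LEVEL = 10.3                # "real" brightness, between two ADU steps
    N_FRAMES   = 10_000

    def stacked_mean(noise_sigma):
        samples = (round(TRUE_LEVEL + random.gauss(0, noise_sigma))
                   for _ in range(N_FRAMES))
        return sum(samples) / N_FRAMES

    print(stacked_mean(0.0))   # no randomness: every frame reads 10, mean = 10.0
    print(stacked_mean(1.0))   # with randomness: mean converges towards ~10.3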

 

Brett



#14 AstroDan2015

AstroDan2015

    Apollo

  • *****
  • Posts: 1,189
  • Joined: 09 Aug 2014
  • Loc: Racine, Wisconsin

Posted 28 September 2020 - 09:04 PM

The future for planetary cameras is looking up!



#15 ryanha

ryanha

    Mariner 2

  • -----
  • Posts: 218
  • Joined: 05 Aug 2020

Posted 29 September 2020 - 01:34 PM

This is not specific to planetary cameras, actually more for DSO... it would be nice if there were a way to guide with your imaging camera.

Yes, I know for planetary this is a non-issue b/c you can do auto-guiding in FireCapture and feature tracking in SharpCap (essentially guiding with your imaging camera).

 

But for DSO, wouldn't it be nice to be able to do the same?

 

I assume the reason you can't do this is b/c your DSO subs/Lights are typically long (10+ sec) relative to guiding frames and I assume also that you can't read "pre-frames" from the camera without introducing additional noise (e.g. can't read every 3 sec from a 15 sec sub to use for guiding).

 

But that would be nice, huh?  

 

--Ryan



#16 RedLionNJ

RedLionNJ

    Skylab

  • *****
  • Posts: 4,070
  • Joined: 29 Dec 2009
  • Loc: Red Lion, NJ, USA

Posted 29 September 2020 - 01:53 PM

The future of cams for planetary imaging should include the following, but it's unlikely we'll get all of them due to planetary cameras being an afterthought for industrial sensor manufacturers:

 

1. Fastest download technology possible (successor to USB?) for maximum frame rates

2. Ideal pixel sizes - in reality around 2.0 microns for our f/10 SCTs (see the quick Nyquist check below)

3. Really high-quality filters on OSC cams, maybe with Bayer patterns that aren't green-weighted

4. Incredible sensitivity with extremely low noise

 

Is that too much to ask? :)
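On point 2, a quick check of where the ~2.0 micron figure comes from - Nyquist sampling of the diffraction-limited image, with the wavelength choice (blue end) being my assumption:

    # Diffraction cutoff at the focal plane is 1/(lambda * N) cycles per unit
    # length, so Nyquist needs a pixel pitch no larger than lambda * N / 2.
    def nyquist_pixel_um(f_ratio, wavelength_nm):
        return wavelength_nm * 1e-3 * f_ratio / 2

    print(nyquist_pixel_um(10, 400))   # blue end -> 2.0 um
    print(nyquist_pixel_um(10, 550))   # green    -> about 2.75 um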



