terrestrial imaging experiment using EAA techniques

6 replies to this topic

#1 Tthrift

Tthrift

    Lift Off

  • -----
  • topic starter
  • Posts: 4
  • Joined: 20 Sep 2021

Posted 19 October 2021 - 11:43 PM

Have any of you happened to try standard EAA techniques on static terrestrial scenes?

 

I am hopeful that lucky imaging (or some sort of drizzle technique) will be effective in reducing heat and wind disturbances in static terrestrial scenes at high magnification.

 

To this end I just received a small SCT (Celestron Starsense Explorer DX 5")
and think that I want to attach a ZWO ASI485MC to the back of it.

If the experiment with this scope pans out I will later need to increase the resolving power with a larger aperture.

 

Aside:

This evening I put a zoom eyepiece in the new telescope and tried it out for its designed purpose. 
The Starsense phone app/mount did find itself once. 
Given the seeing conditions (city lights, full moon rising, broken wispy clouds reflecting light),
I think it did ok. 

I did get a pleasant view of Jupiter and 4 moons. 

It was noticeably clearer than the inexpensive 4" reflector this 5-inch SCT replaced. 

 

Back to the task:

I'm new to this. 
The optical path to the camera needs to match the OTA's backfocus amount/range, correct?
I have not found that specification on the manufacturer's site. 

 

I suppose the camera setup distance needed could be estimated from the focal distance consumed by what they sent (screw-on visual back, erecting prism/diagonal, and eyepiece). Maybe I can find the specs for those.
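The bookkeeping for this is simple addition. As a rough sketch (every length below is a placeholder, not a published Celestron or ZWO figure — and note SCTs focus by moving the primary mirror, so they tolerate a range of backfocus rather than one exact number):

```python
# Spacing arithmetic for an imaging chain, with hypothetical numbers.
# Suppose the OTA reaches focus ~105 mm behind its rear thread
# (placeholder -- check the real figure for your scope), and add up
# the optical path each component in the chain consumes.
required_backfocus_mm = 105.0   # hypothetical OTA backfocus

chain = {
    "T-adapter": 25.0,            # hypothetical lengths in mm
    "filter holder": 11.0,
    "camera flange depth": 12.5,  # assumed camera sensor-to-flange distance
}

consumed = sum(chain.values())
spacer_needed_mm = required_backfocus_mm - consumed
print(f"components consume {consumed} mm; add {spacer_needed_mm} mm of spacers")
```

With these made-up numbers the chain consumes 48.5 mm, so roughly 56.5 mm of spacers would close the gap; the same arithmetic works once the real specs are found.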

 

I assume I will need to replace the "visual back" with some sort of adapter. Then add a filter holder and the appropriately chosen spacer(s) that come with a ZWO camera.

Am I missing a component in the imaging chain?

 

Thanks for any suggestions.

See while you can,
-Terry-



#2 jerahian

jerahian

    Surveyor 1

  • *****
  • Moderators
  • Posts: 1,542
  • Joined: 02 Aug 2018
  • Loc: Maine

Posted 19 October 2021 - 11:54 PM

I could be wrong here, but it sounds like you're thinking more of high-speed video imaging used for planetary/lunar/solar imaging (i.e. lucky imaging) vs. EAA, which is live stacking images as they are captured though not at the high-speed frame rate of planetary.

 

Let me know if I'm correct and I can move your topic to the "Major & Minor Planetary Imaging" forum.

 

CS, Ara



#3 Tthrift

Tthrift

    Lift Off

  • -----
  • topic starter
  • Posts: 4
  • Joined: 20 Sep 2021

Posted 20 October 2021 - 12:22 AM

Jerahian/Ara:

 

Thanks for the consideration. 

You know the community best. 

Sorry if I chose the wrong forum.

 

I'd like the system to be live and not offline. 

Ideally viewing a result as it accumulates on a phone or tablet in the field. 

 

I read somewhere that exposure times around 10 ms might tend to stop the atmosphere in its tracks? 

That does sound like processing 100+ Hz video. So, I see your point.

 

But afaik real-time 100 Hz frame processing does not match small, light, and portable on a Raspberry Pi budget.
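A quick back-of-envelope calculation supports this. Assuming the ASI485MC's full 3840×2160 resolution (from the spec sheet) streamed in 8-bit mode:

```python
# Rough raw data rate for ~10 ms exposures streamed back to back.
# Sensor geometry assumed from the ASI485MC spec sheet (3840x2160);
# 8-bit output assumed for speed.
width, height = 3840, 2160
bytes_per_pixel = 1          # 8-bit mode
fps = 100                    # ~10 ms exposures, back to back

bytes_per_second = width * height * bytes_per_pixel * fps
print(f"{bytes_per_second / 1e6:.0f} MB/s")
```

That's over 800 MB/s before any debayering or stacking work, which is hard for a Raspberry Pi class machine to ingest, let alone process live; a region-of-interest crop would lower it considerably.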

As much as anything I'm wondering what might be reasonable to expect from existing live workflows.

 

-Terry-


Edited by Tthrift, 20 October 2021 - 01:14 AM.


#4 Noah4x4

Noah4x4

    Aurora

  • *****
  • Posts: 4,643
  • Joined: 07 Apr 2016
  • Loc: Colchester UK

Posted 20 October 2021 - 01:31 AM

In the described mode, whether astro cam, webcam, camcorder or similar, you are watching a succession of frames at a fast frame rate, typically 15 fps or faster. That speed deceives the eye and you see continuous video.

For 'lucky' planetary imaging you stack the individual frames in post-processing; each frame tends to be fuzzy due to atmospheric conditions, and stacking improves image quality, but you DON'T live stack. For DSOs you can either post-process or live stack. However, any stacking process requires multiple reference points, such as stars, lunar craters, or planetary features. With terrestrial activity, firstly, you won't need to stack anything; secondly, it is a simple video technique. I often film birds on a nearby chimney in daytime to test stuff, but it isn't EAA or lucky imaging, as the image isn't being enhanced by stacking, nor would it be.
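The registration step that stacking software performs against those reference points can be sketched with whole-frame phase correlation — a minimal numpy-only illustration of the idea (integer shifts only; not the algorithm of any particular live-stacking package, which typically also handles rotation and sub-pixel alignment):

```python
import numpy as np

def register_shift(ref, img):
    """Estimate the integer (dy, dx) translation of img relative to ref
    via phase correlation -- the kind of whole-frame registration a
    stacker performs before averaging frames."""
    F = np.conj(np.fft.fft2(ref)) * np.fft.fft2(img)
    F /= np.abs(F) + 1e-12                     # keep phase only
    corr = np.abs(np.fft.ifft2(F))             # sharp peak at the shift
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # map peaks past the midpoint to negative shifts
    if dy > ref.shape[0] // 2:
        dy -= ref.shape[0]
    if dx > ref.shape[1] // 2:
        dx -= ref.shape[1]
    return int(dy), int(dx)

# toy example: the same frame circularly shifted by (3, -2)
rng = np.random.default_rng(0)
ref = rng.random((64, 64))
img = np.roll(ref, (3, -2), axis=(0, 1))
print(register_shift(ref, img))   # -> (3, -2)
```

This only works because the two frames share structure to correlate — which is the point above: a featureless or constantly deforming scene gives stacking nothing to lock onto.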

#5 Tthrift

Tthrift

    Lift Off

  • -----
  • topic starter
  • Posts: 4
  • Joined: 20 Sep 2021

Posted 20 October 2021 - 10:21 AM

Phil:

Thanks for responding. 

And for the nice comparison of 'lucky post' vs 'live stacking'.

 

Maybe I'm off base, but I still think that some form of stacking might be able to accomplish my goal.

 

In distant still life scenes I would like to reduce the distortion caused by mirage. 

 

Since mirage distorts local parts of an image differently, it seems like stacking every image would not reliably reduce the mirage. But I haven't yet tried it.

 

I'm thinking what might help is to stack only the images with lower mirage distortion.

This could improve the detail of things that don't move.

Moving objects (birds, leaves, etc) would distort or disappear.

 

Identifying images with low distortion to include in a stack could be compute-intensive and limit the stacking speed. 

I am hopeful that a standard astronomy technique might help me cull bad images.
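One cheap metric astronomy tools commonly use for this kind of culling is the variance of a Laplacian response: blurry or distortion-smeared frames have less high-frequency energy and score lower. A numpy-only sketch (function names are mine; real lucky-imaging tools typically score and align local patches rather than whole frames, which matters for mirage since the distortion is local):

```python
import numpy as np

def sharpness(frame):
    """Variance of a 4-neighbour Laplacian response -- a cheap
    focus/blur score often used to rank frames for lucky imaging."""
    lap = (-4.0 * frame
           + np.roll(frame, 1, axis=0) + np.roll(frame, -1, axis=0)
           + np.roll(frame, 1, axis=1) + np.roll(frame, -1, axis=1))
    return lap.var()

def lucky_stack(frames, keep=0.1):
    """Average only the sharpest `keep` fraction of frames."""
    scores = [sharpness(f) for f in frames]
    n_keep = max(1, int(len(frames) * keep))
    best = np.argsort(scores)[-n_keep:]        # indices of the top scorers
    return np.mean([frames[i] for i in best], axis=0)
```

Scoring is one multiply-add pass per frame, so it is far cheaper than registration; the expensive part of a live workflow stays the stacking itself.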



#6 jprideaux

jprideaux

    Viking 1

  • -----
  • Posts: 847
  • Joined: 06 May 2018
  • Loc: Richmond, VA

Posted 20 October 2021 - 12:12 PM

One thing I have done is to attach a new Nikon Z fc camera at prime focus to my AT92 refractor.  There is a Nikon mobile app that can connect to the camera via Wi-Fi, giving you a live view as well as letting you adjust things like ISO and shutter speed, which will affect the live-view brightness. The app also lets you select the manual camera mode.  You can take a picture through the mobile app and then view a lower-resolution version of the picture in the app.  Of course the camera itself saves a much higher-resolution version (or a raw file, if configured).  There is also an HDMI output port on the camera, from which you can run a cable to an HDMI monitor to see the live view.

 

I'm not sure about other Nikon models or other camera brands, but assuming most have similar offerings, it is possible to see whatever the camera sees on an external monitor and also control the camera functions through a mobile app, including setting ISO and shutter speed, as well as taking pictures and then viewing them (all without touching the camera except to turn it on).

 

Comparing the live view over the HDMI cable to an external monitor against the live view transmitted to the mobile app, there is a slight time delay on the transmitted live view relative to the direct HDMI, and I think the direct HDMI output may be a bit higher resolution (but I'm not sure about that).



#7 Tthrift

Tthrift

    Lift Off

  • -----
  • topic starter
  • Posts: 4
  • Joined: 20 Sep 2021

Posted 20 October 2021 - 11:38 PM

jprideaux:

Your Nikon seems like a solid way to take pictures remotely, and a fun retro-styled camera.

Thanks for the info about your setup and suggesting a possible equipment/tool direction.  
 





