PiFinder - Open source, self contained, plate solving finder for Dobs

Tags: 3d printing, Accessories, Visual, Software
210 replies to this topic

#1 brickbots

    Vostok 1
  • Vendors
  • topic starter
  • Posts: 163
  • Joined: 02 Oct 2010
  • Loc: Los Angeles, CA

Posted 01 March 2023 - 09:24 PM

I'd like to share with everyone a project I've been working on for a bit now; I call it the PiFinder.  It's a plate-solving telescope finder based around a Raspberry Pi, the RPi HQ Camera, and a custom UI 'hat' that allows no-setup telescope positioning, catalog filtering/searching, and push-to guidance without any additional equipment.  The unit fits standard Synta finder brackets and should work on any Alt/Az scope.

 

It's a bit tough to describe, so here's a photo on my scope:

[Image: PiFinder_on_scope.png]

 

Here's a better view of the hardware and some of the screens related to catalog/target selection, push-to, and charting:

[Image: banner_overview.png]

 

And an image from the user guide of the keypad/screen and UI:

[Image: ui_reference.png]

 

* Fully open-source design and software, so you can build your own and contribute ideas and improvements

* No encoders, star alignment, or other setup

* Quick (< 1.5s) exposure/plate solving to determine telescope pointing position

* Integrated Inertial Measurement Unit (IMU) to update position while moving the scope/between solves

* GPS integration

* Catalog search, filtering, and object information display

* Push-to guidance via directional arrows, or a real-time chart with target plotting

* Full chart plotting with optional constellation lines, reticle, and DSOs.  Multiple zoom levels, and the chart tracks with the scope as it's moved.

* Logging capabilities to record observations with sky conditions and notes

* OLED screen with easily adjustable full-range brightness: bright enough for indoor/daytime use and dim enough for demanding dark-sky sites

* Can be built with a rechargeable battery, or run from scope-provided 5V power

 

I've used this for a few observing sessions now and it's really let me spend much more time at the eyepiece, since it combines target selection, charting, pointing/finding, and logging in one unit.  I hope some other folks find it interesting and potentially build their own!  A full user manual and build guide, along with everything else needed to build one, is available at:

 

https://github.com/brickbots/PiFinder

 

 

I want to thank Dale Eason for posting about his telescope position system without encoders, as it really made me believe it was possible and prompted me to dive into this project.  Thank you!


Edited by brickbots, 01 March 2023 - 09:26 PM.

  • Daniel Mounsey, Jon Isaacs, El Mitch and 34 others like this

#2 CharLakeAstro

    Vanguard
  • Posts: 2,330
  • Joined: 12 Jan 2015

Posted 01 March 2023 - 09:30 PM

That is quite an interesting device, thanks for sharing.



#3 ButterFly

    Fly Me to the Moon
  • Freeware Developers
  • Posts: 6,602
  • Joined: 07 Sep 2018

Posted 01 March 2023 - 11:27 PM

Weight?



#4 MichaelBrock

    Vostok 1
  • Posts: 105
  • Joined: 07 Dec 2020
  • Loc: Alachua, FL

Posted 02 March 2023 - 07:42 AM

That's fascinating!  I am still in the process of building my dob and was planning on adding DSCs as stage two... but this might be my new plan.



#5 brickbots

    Vostok 1
  • Vendors
  • topic starter
  • Posts: 163
  • Joined: 02 Oct 2010
  • Loc: Los Angeles, CA

Posted 02 March 2023 - 12:49 PM

Weight?

Great question, I should have put that somewhere more prominent.  It's 370g without the internal battery, and the battery I would recommend adds another 80g.  For reference, the Orion 8x50 RACI finder weighs 450g, and a Telrad weighs 310g with batteries... so this should be manageable on most scopes!


  • ButterFly likes this

#6 brickbots

    Vostok 1
  • Vendors
  • topic starter
  • Posts: 163
  • Joined: 02 Oct 2010
  • Loc: Los Angeles, CA

Posted 02 March 2023 - 12:52 PM

That's fascinating!  I am still in the process of building my dob and was planning on adding DSCs as stage two... but this might be my new plan.

Good on you for building your own scope; it's been a great source of fun and joy for me!  If you are keen to try the PiFinder, I have some extra PCBs and would be happy to print up the 3d printed parts; PM me if you are interested.  Sadly, it can be tricky to find a Raspberry Pi 4 online right about now, but I hear MicroCenter often has them in stock.  Everything else is pretty easy to get.


Edited by brickbots, 02 March 2023 - 12:53 PM.

  • Dtex likes this

#7 durangodoug

    Vostok 1
  • Posts: 188
  • Joined: 08 Jan 2015

Posted 02 March 2023 - 02:02 PM

Very cool, brickbots!

 

How wide of a view does the camera need to be able to take in, say in degrees?

 

Context: even 450g / ~1 lb would be a challenge to mount to the UTA on my scope, but it occurs to me I might be able to mount it lower (it's a "two pole" design), further down closer to the center of gravity, extended to the side far enough that the UTA isn't in the way of its view.



#8 brickbots

    Vostok 1
  • Vendors
  • topic starter
  • Posts: 163
  • Joined: 02 Oct 2010
  • Loc: Los Angeles, CA

Posted 02 March 2023 - 02:44 PM

Very cool, brickbots!

 

How wide of a view does the camera need to be able to take in, say in degrees?

 

Context: even 450g / ~1 lb would be a challenge to mount to the UTA on my scope, but it occurs to me I might be able to mount it lower (it's a "two pole" design), further down closer to the center of gravity, extended to the side far enough that the UTA isn't in the way of its view.

 

I feel your pain with UTA balance.... on a side note, I've had pretty good success on my scope with using elastic (bungee) cord for 'counterbalance'.  Not sure if it would work for your scope, but here's some photos showing the basic arrangement.

 

The camera/lens combo provides a 10.6° FOV... which is right in the range the authors of the Tetra3 solver recommend.  With that FOV, I bet you could locate it further down without the scope blocking the view.  Potentially even on one of the poles with the GoPro-style mounting, which is lighter than the dovetail adapter.  There's some weight savings to be had in the construction as well, by either not using or cutting holes in some of the 3d printed parts.
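For reference, the field of view follows directly from the sensor size and lens focal length.  A quick sketch of that geometry (the 6.3 mm sensor width and 25 mm focal length below are hypothetical example values, not the PiFinder's actual parts):

```python
import math

def fov_deg(sensor_mm, focal_mm):
    """Angular field of view across one sensor dimension behind a simple lens."""
    return 2 * math.degrees(math.atan(sensor_mm / (2 * focal_mm)))

# Hypothetical example: a 6.3 mm wide sensor behind a 25 mm lens
print(round(fov_deg(6.3, 25.0), 1))  # ~14.4 degrees
```

Cropping the sensor (as with a 512x512 center cut) shrinks sensor_mm and therefore the FOV roughly proportionally at these small angles.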


  • durangodoug likes this

#9 Pierre Lemay

    Surveyor 1
  • Posts: 1,787
  • Joined: 30 Jan 2008
  • Loc: Montréal, Canada

Posted 02 March 2023 - 05:32 PM

Very impressive! Congratulations both on the unit and the excellent instructions you provide on making and setting up the plate solver finder.

 

With Dale’s help I built one of his units last Summer:

 

[Image: IMG_1753.jpeg]

 

Since mine was only the second prototype  of his design, some bugs came up and we corresponded several times to solve them. It now works well but I haven’t had the opportunity to truly use it yet.

 

I like that you added an OLED screen to the unit and push-to guidance available on the screen through an IMU interface, among other features.

 

I have one question: in addition to providing an image of where the scope is pointing on the OLED screen, can your unit transmit its position to the SkySafari app on a tablet through the RPi 4's WiFi or Bluetooth antenna?  I like the OLED screen and I can see how you only need that to operate the unit, but it could be useful sometimes to see an image of the surrounding sky where the scope is pointing on a larger screen, if only to see what surrounds the object you are observing.

 

Thank you for posting and for all your work.


  • Moravianus, brickbots and TheMan027 like this

#10 brickbots

    Vostok 1
  • Vendors
  • topic starter
  • Posts: 163
  • Joined: 02 Oct 2010
  • Loc: Los Angeles, CA

Posted 02 March 2023 - 06:04 PM

Very impressive! Congratulations both on the unit and the excellent instructions you provide on making and setting up the plate solver finder.

 

With Dale’s help I built one of his units last Summer:

 

[Image: IMG_1753.jpeg]

 

Since mine was only the second prototype  of his design, some bugs came up and we corresponded several times to solve them. It now works well but I haven’t had the opportunity to truly use it yet.

 

I like that you added an OLED screen to the unit and push-to guidance available on the screen through an IMU interface, among other features.

 

I have one question: in addition to providing an image of where the scope is pointing on the OLED screen, can your unit transmit its position to the SkySafari app on a tablet through the RPi 4's WiFi or Bluetooth antenna?  I like the OLED screen and I can see how you only need that to operate the unit, but it could be useful sometimes to see an image of the surrounding sky where the scope is pointing on a larger screen, if only to see what surrounds the object you are observing.

 

Thank you for posting and for all your work.

Thank you Pierre for the kind words.  Dale has done the community a great service by popularizing this general approach and I hope you get a chance to get out under the skies and try out the unit!

 

I have a long-ish list of features I want to implement on the software side, and sharing the solved position with other apps is on the list.  I'm not sure exactly how this is done, but the PiFinder can act as a WiFi hub and I think it's not a big stretch from there.  I'll have to peek at Dale's code to see what's involved :-)



#11 steveastrouk

    Apollo
  • Vendors
  • Posts: 1,367
  • Joined: 01 Aug 2013
  • Loc: State College, Pa.

Posted 02 March 2023 - 06:35 PM

Interesting that you got good results from the Pi HQ camera.  I've been wondering if that could be used for NASA's Exoplanet Watch program.

Brilliant project.  Can't wait to try it out!



#12 brickbots

    Vostok 1
  • Vendors
  • topic starter
  • Posts: 163
  • Joined: 02 Oct 2010
  • Loc: Los Angeles, CA

Posted 02 March 2023 - 06:56 PM

Interesting that you got good results from the Pi HQ camera.  I've been wondering if that could be used for NASA's Exoplanet Watch program.

Brilliant project.  Can't wait to try it out!

The camera module is surprisingly capable and based on a nice Sony sensor.  The interface/software on the Pi gives a lot of control.  With a nice lens or telescope, I think it could be pretty useful as an astro camera.

 

In my case, I'm using an inexpensive lens which is pretty poorly corrected.  Fortunately, for pointing purposes, it doesn't matter much: the solver just uses 512x512 images, and I find a bit of soft focus actually helps, as it spreads the star energy over a few pixels and helps the algorithm judge brightness.


  • KJL likes this

#13 Dale Eason

    Soyuz
  • Posts: 3,765
  • Joined: 24 Nov 2009
  • Loc: Roseville,Mn.

Posted 02 March 2023 - 09:47 PM

  I'll have to peek at Dale's code to see what's involved there :-)

You will find there is a separate program (encoder.py) running as a daemon, listening on the correct IP port for requests from SkySafari.  The solver program just has to save the RA and Dec from the latest solve in a file.  That file will be read by the daemon and sent as a response.
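Dale's scheme can be sketched as a tiny file-plus-daemon pair.  Everything concrete below — the file name, port 4030, and the comma-separated reply format — is a stand-in for illustration; the real encoder.py answers in whatever wire format SkySafari actually expects:

```python
import socketserver
from pathlib import Path

# Hypothetical handoff file: the solver writes "RA DEC" here after each solve
SOLVE_FILE = Path("last_solve.txt")

class PositionHandler(socketserver.BaseRequestHandler):
    def handle(self):
        self.request.recv(64)                     # read (and ignore) the client's query
        ra, dec = SOLVE_FILE.read_text().split()  # latest solve, saved by the solver process
        # Hypothetical reply format -- the real daemon must speak SkySafari's protocol
        self.request.sendall(f"{ra},{dec}\n".encode())

# To run as a daemon:
#   socketserver.TCPServer(("0.0.0.0", 4030), PositionHandler).serve_forever()
```

The nice property of the file handoff is decoupling: the solver never blocks on the network, and the daemon always serves the most recent position, even between solves.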


  • brickbots likes this

#14 brickbots

    Vostok 1
  • Vendors
  • topic starter
  • Posts: 163
  • Joined: 02 Oct 2010
  • Loc: Los Angeles, CA

Posted 02 March 2023 - 10:31 PM

You will find there is a separate program (encoder.py) running as a daemon, listening on the correct IP port for requests from SkySafari.  The solver program just has to save the RA and Dec from the latest solve in a file.  That file will be read by the daemon and sent as a response.

Thanks Dale!  That file gives a pretty succinct description of the protocol, which was my biggest unknown... now I just need to buy the SkySafari Plus app for testing and get coding :-)



#15 Dtex

    Lift Off
  • Posts: 11
  • Joined: 28 Mar 2016

Posted 03 March 2023 - 07:36 AM

Excited to try!

 

Sent you a pm regarding pcbs...

 

Thanks!



#16 KJL

    Apollo
  • Posts: 1,053
  • Joined: 06 Jun 2012
  • Loc: Boston, MA

Posted 03 March 2023 - 04:53 PM

Love how you've folded the lens to be in line with the RPi case.

 

Thanks for keeping it all open-source. Looking forward to taking a peek at your code!

 

My own eFinder can solve at 8 fps (and I haven't even tried the latest Tetra3 branch), but while that significantly improves response time when the telescope is still it doesn't "fill in the blanks" when the scope is moving. I'm looking forward to seeing how you utilized the IMU.



#17 brickbots

    Vostok 1
  • Vendors
  • topic starter
  • Posts: 163
  • Joined: 02 Oct 2010
  • Loc: Los Angeles, CA

Posted 03 March 2023 - 05:04 PM

Love how you've folded the lens to be in line with the RPi case.

 

Thanks for keeping it all open-source. Looking forward to taking a peek at your code!

 

My own eFinder can solve at 8 fps (and I haven't even tried the latest Tetra3 branch), but while that significantly improves response time when the telescope is still it doesn't "fill in the blanks" when the scope is moving. I'm looking forward to seeing how you utilized the IMU.

The folded design really was a response to having the screen face the eyepiece when the camera was pointing 'Forward'.... which reminds me I need to make a version of the 3d printed frame for scopes with the eyepiece on the other side :-)

 

I'm super impressed you can get 8 solves a second out of your eFinder... does that include image acquisition time?  I generally need 1.25 to 1.5 second exposures to get stars down to magnitude 7 reliably with the particular lens I use... but I suspect I could increase the sensor gain and get below 1 second.  However, that's still a far cry from 0.125 seconds!


  • KJL likes this

#18 KJL

    Apollo
  • Posts: 1,053
  • Joined: 06 Jun 2012
  • Loc: Boston, MA

Posted 03 March 2023 - 05:21 PM

The folded design really was a response to having the screen face the eyepiece when the camera was pointing 'Forward'.... which reminds me I need to make a version of the 3d printed frame for scopes with the eyepiece on the other side :-)

The folded optics route reminds me of products like Dwarflab's Dwarf 2 that keep it "flat" in two dimensions without poking out in all directions like the Astroid eFinder.
 
Eventually I hope our collective efforts can result in an eFinder that's as small as a QuikFinder. I'm sure folded optics will play a part in that.
 

I'm super impressed you can get 8 solves a second out of your eFinder... does that include image acquisition time?  I generally need 1.25 to 1.5 second exposures to get stars down to magnitude 7 reliably with the particular lens I use.... but I suspect I could increase the sensor gain and get down below 1 second.  However that's still a far cry from .125 seconds!

You get at the heart of it: in my implementation the ASI120MM-S is constantly streaming video to the RPi — think of a process plucking the latest image out — and when a platesolve (also in its own process) completes, the program reads off the next image to solve.

 

So above and beyond the platesolve runtime, you have to add the capture/storage/read latency which is at least one order of magnitude slower. Interestingly, I don't notice that lag in real-world use, but then maybe I don't because I'm the programmer, lol ....

(Just some self-deprecating humor: obviously, the real reason the lag doesn't matter is that the sky doesn't visibly move in a second. You really get to appreciate all of that 8 fps response time... it's actually 8.5 fps, reviewing my notes.)

Because that ZWO sensor is already sensitive, together with 2x pixel binning and a fast lens (a Ukrainian adaptation of a USSR 50mm f/1.2 projection lens), I can get away with a 1/10-second exposure and still have something left to solve, even through pretty significant cloud cover. This is pretty cool because I can then cap the platesolve rate at 1/exposure if I ever get it going faster, saving on battery usage.

But these are still early days: while the sky is the limit (no pun intended) real-world use-cases always threaten to derail everything. I personally won't tolerate the limitations of Celestron's StarSense, for example: to me, an "eFinder" has to work all the time like the RDF it replaces!

Also, maxing out the RPi 4 with multiprocessing like this is certainly a non-trivial energy hog (about 1.5 A @ 5V current draw). That's why being able to cap things like platesolve frequency will probably be important in the future.
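The capture/solve split described above — a camera loop that always overwrites a single slot with the newest frame, and a solver loop that grabs whatever is newest each time it finishes — can be sketched with threads (the real implementations use separate processes, and integer frame IDs stand in for images):

```python
import threading
import time

class LatestFrame:
    """Single-slot buffer: the writer overwrites, the reader always sees the newest."""
    def __init__(self):
        self._lock = threading.Lock()
        self._frame = None

    def put(self, frame):
        with self._lock:
            self._frame = frame

    def get(self):
        with self._lock:
            return self._frame

def capture_loop(buf, stop):
    """Stand-in for the camera: keep replacing the slot with fresh frames."""
    frame_id = 0
    while not stop.is_set():
        frame_id += 1
        buf.put(frame_id)
        time.sleep(0.005)

def solve_loop(buf, stop, results, n_solves=5):
    """Stand-in for the solver: frames that arrive mid-solve are simply skipped."""
    for _ in range(n_solves):
        frame = buf.get()
        while frame is None:          # wait for the first frame
            time.sleep(0.001)
            frame = buf.get()
        time.sleep(0.02)              # pretend to platesolve
        results.append(frame)
    stop.set()
```

Run both loops in their own threads and results comes out as a non-decreasing but gappy sequence of frame IDs; the gaps are the frames dropped while a solve was in flight, which is exactly what keeps the displayed position current.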


Edited by KJL, 03 March 2023 - 05:28 PM.

  • brickbots likes this

#19 KJL

    Apollo
  • Posts: 1,053
  • Joined: 06 Jun 2012
  • Loc: Boston, MA

Posted 03 March 2023 - 05:30 PM

I'd like to share with everyone a project I've been working on for a bit now; I call it the PiFinder.  It's a plate-solving telescope finder based around a Raspberry Pi, the RPi HQ Camera, and a custom UI 'hat' that allows no-setup telescope positioning, catalog filtering/searching, and push-to guidance without any additional equipment.  The unit fits standard Synta finder brackets and should work on any Alt/Az scope.

Have you compared the RPi HQ camera to the new Camera Module 3?

 

I've only had experience with the ASI 120MM-S which I assume, by the numbers, is no slouch when compared to the RPi offerings, but there is a big difference in price and physical dimensions.



#20 brickbots

    Vostok 1
  • Vendors
  • topic starter
  • Posts: 163
  • Joined: 02 Oct 2010
  • Loc: Los Angeles, CA

Posted 03 March 2023 - 05:36 PM

Have you compared the RPi HQ camera to the new Camera Module 3?

 

I've only had experience with the ASI 120MM-S which I assume, by the numbers, is no slouch when compared to the RPi offerings, but there is a big difference in price and physical dimensions.

From what I can tell, the sensor of the CM3 is nicer than that of the HQ camera, but it comes with an integrated lens.  If it were possible to swap the lens out for something longer with more aperture, it would be an interesting option!



#21 brickbots

    Vostok 1
  • Vendors
  • topic starter
  • Posts: 163
  • Joined: 02 Oct 2010
  • Loc: Los Angeles, CA

Posted 03 March 2023 - 05:54 PM

 
Eventually I hope our collective efforts can result in an eFinder that's as small as a QuikFinder. I'm sure folded optics will play a part in that.

 

 

That's a great goal... I was pretty pleased to get the weight below a standard 8x50 RACI finder, and I think there is definitely room to shrink... so to speak.  Switching from a full Pi to a compute module would cut the thickness down considerably.  After that, the camera/lens would be the most challenging part to shrink.  Perhaps that's where your question about the Camera Module 3 comes in?  I've just assumed it would not be suitable due to the FOV and small aperture, but it's probably worth a shot.  Extracting just the 512x512 center section would narrow the FOV considerably... Hmmmmmmmm

 

 

 

Because that ZWO sensor is already sensitive, together with 2x pixel binning and a fast lens (Ukraine adaptation of a USSR projection 50mm f/1.2 lens), I can get away with a 1/10-second exposure and still have something left to solve, even through pretty significant cloud cover. This is pretty cool because I can then cap the platesolve to 1/exposure if I ever get it going faster, saving on battery usage.
 

 

You've really inspired me to see if I can push the exposure times down.  I was so keen on using the IMU to cover for the 1.5 second capture/solve times that I probably didn't work enough with the camera parameters to see if I could push that lower.  Next time I have the unit out I'll see what happens to the solve quality with more sensor gain and shorter exposures!

 

I'm also running separate threads to handle the image capture and solving (plus more for the keyboard, UI, GPS, and IMU)... but since the exposures take so long and the solve is so quick with Tetra3, I'm not sure I'm seeing much benefit from overlapping the two tasks.  Maybe if I can get the exposure time down below 1 second!


  • KJL and ButterFly like this

#22 Dale Eason

    Soyuz
  • Posts: 3,765
  • Joined: 24 Nov 2009
  • Loc: Roseville,Mn.

Posted 03 March 2023 - 06:23 PM

I too use two threads: one for the camera exposure and one for the solve.  Like you, the solver seems to keep up and must wait for an exposure.  For whatever reason, I find the Tetra solver almost always slower than the Astrometry one, and it also failed to solve more often.

 

SkySolve weighs 249 grams.  It is fed by a separate battery placed off the OTA.


  • brickbots and KJL like this

#23 ButterFly

    Fly Me to the Moon
  • Freeware Developers
  • Posts: 6,602
  • Joined: 07 Sep 2018

Posted 03 March 2023 - 06:25 PM


You've really inspired me to see if I can push the exposure times down.  I was so keen on using the IMU to cover for the 1.5 second capture/solve times that I probably didn't work enough with the camera parameters to see if I could push that lower.  Next time I have the unit out I'll see what happens to the solve quality with more sensor gain and shorter exposures!

The concern here is just hot pixels.  Subtracting a background can take care of those fairly easily, and it's quick.  There is no reason not to be on max gain if you can keep the exposures short enough to avoid saturating.  Tetra3 is only down to mag 7 with the default catalogs.
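The background-subtraction idea is easy to sketch with NumPy: capture a dark frame once (lens covered, same gain and exposure), then subtract it from every light frame so pixels that are hot in both cancel out.  The tiny frames below are made-up values, not real sensor data:

```python
import numpy as np

def subtract_dark(light, dark):
    """Remove fixed hot pixels by dark-frame subtraction; clip so noise can't go negative."""
    diff = light.astype(np.int32) - dark.astype(np.int32)
    return np.clip(diff, 0, 255).astype(np.uint8)

# Made-up 4x4 frames: a hot pixel at (1, 2) appears in both, a star only in the light frame
dark = np.zeros((4, 4), dtype=np.uint8)
dark[1, 2] = 250
light = np.full((4, 4), 10, dtype=np.uint8)
light[1, 2] = 255   # hot pixel saturates every exposure
light[3, 0] = 200   # a real star

clean = subtract_dark(light, dark)
# clean[1, 2] is now tiny while the star at (3, 0) survives
```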

 

Extracting just the 512x512 center section would narrow the FOV considerably... Hmmmmmmmm

That could mean a larger catalog to search through.  It's a balance.
 


  • brickbots and rkinnett like this

#24 Dale Eason

    Soyuz
  • Posts: 3,765
  • Joined: 24 Nov 2009
  • Loc: Roseville,Mn.

Posted 03 March 2023 - 07:18 PM

 

That could mean a larger catalog to search through.  It's a balance.
 

It gets tricky trying to guess what 512 x 512 means.  With the RPi HQ camera, the field is still the same size provided by the lens and sensor; the pixels are just binned together to get you 512 x 512.  At least, that is what I think I discovered.  I found that for my system, telling the camera to take 800 x 600 images was a very good trade-off between speed and solving ability.
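Software binning of this kind is a cheap reshape-and-average in NumPy (a 4x4 toy frame here; real frames would be the camera's full resolution):

```python
import numpy as np

def bin2x2(frame):
    """Average each 2x2 block of pixels into one: same field of view, quarter the pixels."""
    h, w = frame.shape
    return frame.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

frame = np.arange(16, dtype=float).reshape(4, 4)
binned = bin2x2(frame)   # shape (2, 2); each value is the mean of a 2x2 block
```

Because binning averages neighbors, it boosts effective signal per output pixel at the cost of resolution — a reasonable trade for a solver that only needs star centroids and relative brightness.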


  • brickbots and ButterFly like this

#25 brickbots

    Vostok 1
  • Vendors
  • topic starter
  • Posts: 163
  • Joined: 02 Oct 2010
  • Loc: Los Angeles, CA

Posted 03 March 2023 - 07:20 PM

The concern here is just hot pixels.  Subtracting a background can take care of those fairly easily, and it's quick.  There is no reason not to be on max gain if you can keep the exposures short enough to avoid saturating.  Tetra3 is only down to mag 7 with the default catalogs.

 

Even more reason for me to revisit this... at max gain I saw what looked like lines of hot pixels (some sort of bleed) from the sensor, but I did all of my gain/exposure testing very early in the process, and I've definitely learned a lot about controlling the camera and Tetra3 since then.  Thank you!



