
CNers have asked about a donation box for Cloudy Nights over the years, so here you go. Donation is not required by any means, so please enjoy your stay.


[HELP] I want to control 4 Setups at the same time.

15 replies to this topic

#1 amajed172

amajed172

    Vostok 1

  • -----
  • topic starter
  • Posts: 153
  • Joined: 09 Jul 2018
  • Loc: Saudi Arabia, Riyadh.

Posted 26 April 2021 - 07:03 AM

Hi

 

I'm thinking of controlling 4 setups at the same time. Each setup has a mount, main camera, guide camera, EFW, dew controller, and a powered USB hub.

 

What's the best way to run them all remotely?

I'm thinking of going for one main PC with 4 VMs in it, each setup in a separate OS so they don't get conflicts or anything. I have never done this before, so I'm not sure if it's going to work, or whether it's the best way.

 

The other choice is a NUC for each setup. This one is more expensive and maybe harder to manage than a single PC with 4 VMs.

 

And the last option is going with a StellarMate Raspberry Pi for each setup (I'd rather not; I'm used to N.I.N.A.).

 

Are there better solutions? Please share your thoughts. Thanks



#2 RSJ

RSJ

    Explorer 1

  • -----
  • Posts: 58
  • Joined: 19 Oct 2017
  • Loc: Maryland

Posted 26 April 2021 - 08:31 AM

Just my thoughts, having supported IT equipment in my career... I'm a fan of keeping things simple.

In your proposal, I immediately thought of 4 guide cameras and 4 imaging cameras all traveling across the USB bus of one PC hosting 4 VMs.

I also use NINA and have the full complement (EFW, EAF, etc.), with two USB hubs connecting everything, each plugged into the laptop docking station in the observatory. I keep them separate because I have encountered USB lag with the guide camera when both the guide camera and imaging camera travel the same USB path; giving each its own hub and connecting the hubs directly to the laptop solves this for me. I would imagine 8 cameras would generate a lot of traffic.

 

Also, if the main VM PC were to crash or need a reboot, you would potentially be left with the choice to reboot and stop 3 imaging sessions to fix the 1 setup with issues, or continue with 3 and hope other issues don't arise.

 

I would propose a modular solution for this level of complexity: 4 used laptops, one per telescope, and remote into them as needed from a 5th laptop. You could reboot any of them without impacting the others, and the 5th laptop doubles as a spare.


  • t-ara-fan likes this

#3 MJB87

MJB87

    Gemini

  • *****
  • Moderators
  • Posts: 3,081
  • Joined: 17 Feb 2014
  • Loc: Talbot County, MD & Washington, DC

Posted 26 April 2021 - 08:40 AM

My setup is far simpler than the one envisioned here, just two mounts and a total of (up to) four cameras.  I tried using multiple instances of programs such as SGP, TSX, etc. running on one computer. In the end, I found it far simpler to just have two laptops -- one for each setup -- that I can connect to via VNC when needed for remote operation. Most of the software permits multiple installations. I did have to buy another copy of APCC-Pro (for simultaneous operation) but you can add a second license to an existing one for a substantial discount.



#4 amajed172

amajed172

    Vostok 1

  • -----
  • topic starter
  • Posts: 153
  • Joined: 09 Jul 2018
  • Loc: Saudi Arabia, Riyadh.

Posted 26 April 2021 - 09:10 AM

RSJ, on 26 April 2021 - 08:31 AM, said: [quoted post #2 above]

Thanks, yeah, I can see how a VM host would be bad if I have to restart the main PC, but I figure that won't happen often; more likely I'd only have to restart a single VM, and that won't cause any issue for the rest. 4 laptops (or NUCs) is most likely the better option, but it's far more expensive and a bit harder to maintain updates and stuff like that.

What I'm most afraid of with 4 VMs on one PC is that the USB won't be enough for them. I figured I would get an internal USB card to help fix the problem, but I don't know if that will work. I'm also not a fan of long USB cables running from each setup to the main PC; maybe it's OK, but it might bring issues that I'd have a hard time figuring out.

 

MJB87, on 26 April 2021 - 08:40 AM, said: [quoted post #3 above]

Yeah, a PC for each setup is optimal, but as I said to RSJ, it's a bit harder to maintain updates and stuff like that. Still, it's most likely the best option so far. Thanks



#5 t-ara-fan

t-ara-fan

    Vanguard

  • -----
  • Posts: 2,023
  • Joined: 20 Sep 2017
  • Loc: 50° 13' N

Posted 27 April 2021 - 06:33 PM

 

amajed172, on 26 April 2021 - 09:10 AM, said: [quoted from post #4 above]

USB gremlins are the worst gremlins.  Imagine the headaches if the USB gets mixed up as you restart VMs.  You think you are working on scope "A", but actually you have camera "A" and focuser "B" and filter wheel "C".  Ouch. 

 



#6 amajed172

amajed172

    Vostok 1

  • -----
  • topic starter
  • Posts: 153
  • Joined: 09 Jul 2018
  • Loc: Saudi Arabia, Riyadh.

Posted 28 April 2021 - 01:52 AM

t-ara-fan, on 27 April 2021 - 06:33 PM, said: [quoted post #5 above]

Wait, I didn't even consider that! Is it possible to mix up two identical cameras if I'm using VMs? I thought each camera had some sort of hidden identifier, like a serial number?

 

If not, then I'm really going to go for the NUC or Raspberry Pi option.
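For what it's worth, most astro cameras do expose a USB serial number (ZWO ASI cameras, for example, report one), and your control software on each machine or VM can be pinned to it. A minimal sketch of the idea, with made-up serial values; how you read the real serial depends on your camera's driver:

```python
# Pin each camera to its rig by USB serial number, so a device mix-up
# is caught at session start instead of ruining a night of subs.
# The serials and rig names below are hypothetical placeholders.
RIG_BY_SERIAL = {
    "SN-A1234": "rig-A",
    "SN-B5678": "rig-B",
}

def assign_rig(detected_serial, expected_rig):
    """Refuse to start a session if the camera isn't the one this rig expects."""
    actual = RIG_BY_SERIAL.get(detected_serial)
    if actual != expected_rig:
        raise RuntimeError(
            f"Camera {detected_serial!r} belongs to {actual}, not {expected_rig}"
        )
    return actual

print(assign_rig("SN-A1234", "rig-A"))  # → rig-A
```

The same table can cover guide cameras, focusers, and filter wheels, so a VM that grabs the wrong USB device fails loudly rather than silently imaging through the wrong train.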



#7 mark77

mark77

    Viking 1

  • *****
  • Posts: 842
  • Joined: 28 Jun 2015
  • Loc: PA

Posted 29 April 2021 - 11:46 AM

My Alpaca drivers run on Raspberry Pi and will do exactly what you are looking for.

 

I will elaborate more when I get onto my main computer



#8 mark77

mark77

    Viking 1

  • *****
  • Posts: 842
  • Joined: 28 Jun 2015
  • Loc: PA

Posted 29 April 2021 - 05:32 PM

Now that I am sitting at a real computer instead of trying to use my phone while eating lunch.....

 

I have written Alpaca drivers for many different cameras, the ZWO filter wheel (I am working on the ATIK EFW), Moonlite focusers, custom domes/ROR, etc.

 

 

My setup is a home-built 15-foot dome and I run everything from my basement. I have a planetarium program that can talk to everything. It's all open source (see links in my signature).

 

Alpaca is a new protocol (2019) that implements the ASCOM concept over the network and DOES NOT REQUIRE WINDOWS, but it CAN work with ASCOM on Windows if you so choose.

 

I have a total of 10 Raspberry Pi's controlling cameras, dome, focusers, rotators, filterwheels etc.

 

There are a couple of threads on the topic already

 

https://www.cloudyni...lpaca-vs-ascom/

 

https://www.cloudyni...-pi-and-alpaca/


Edited by mark77, 29 April 2021 - 05:32 PM.
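Since Alpaca is plain HTTP, any of the four rigs can be queried from one workstation with nothing but the standard library. A minimal sketch: the URL pattern and default port 11111 follow the ASCOM Alpaca spec, but the host name and device number here are made-up placeholders for illustration:

```python
import json
import urllib.request

# Alpaca exposes every device member as an HTTP endpoint:
#   http://<host>:<port>/api/v1/<device_type>/<device_number>/<member>
# 11111 is the spec's default port; "rig-a.local" is a hypothetical host.
def alpaca_url(host, device_type, device_number, member, port=11111):
    return f"http://{host}:{port}/api/v1/{device_type}/{device_number}/{member}"

def get_value(url):
    """GET an Alpaca property and return its Value field, raising on device errors."""
    with urllib.request.urlopen(url) as resp:
        payload = json.load(resp)
    if payload.get("ErrorNumber", 0) != 0:
        raise RuntimeError(payload.get("ErrorMessage", "Alpaca error"))
    return payload["Value"]

# e.g. get_value(alpaca_url("rig-a.local", "camera", 0, "ccdtemperature"))
print(alpaca_url("rig-a.local", "camera", 0, "connected"))
# → http://rig-a.local:11111/api/v1/camera/0/connected
```

Because each Pi serves its own devices over the network, one central machine can monitor all four rigs without any USB passthrough at all.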

  • Desertanimal likes this

#9 Voska

Voska

    Vostok 1

  • -----
  • Posts: 189
  • Joined: 28 Oct 2019
  • Loc: Beach Park, IL, USA

Posted 05 May 2021 - 09:29 AM

As an IT guy with a similar end goal for my own plans, I would tell you to just get 4 computers (NUCs, Raspberry Pis, or whatever). If you were willing to jump between 4 VM screens, it would be the same as jumping between 4 RDC/VNC sessions. Plus you get the peace of mind that if 1 system goes down, the other 3 are still up. In my experience USB passthrough is still not great with VMware; it works, but it CAN cause problems.

 

It may be more expensive... but sometimes spending more money = fewer headaches or lost nights in the end.



#10 Raginar

Raginar

    Cosmos

  • *****
  • Posts: 9,805
  • Joined: 19 Oct 2010
  • Loc: Pensacola, FL

Posted 06 May 2021 - 05:58 PM

amajed172, on 26 April 2021 - 07:03 AM, said: [quoted post #1 above]

Voyager can do this. Probably use INDI to control each device at the scope and then use Voyager to do it.  Send Leo an email... he'll walk you through it.

 

Chris


  • psandelle likes this

#11 Christopher Erickson

Christopher Erickson

    Skylab

  • *****
  • Posts: 4,028
  • Joined: 08 May 2006
  • Loc: Waikoloa Village, Hawaii

Posted 18 May 2021 - 01:16 AM

I would put a NUC on each OTA.

 

Simple, short cables, most-reliable.

 

Each NUC running its own scheduling software in fully-automated mode.



#12 555aaa

555aaa

    Vendor (Xerxes Scientific)

  • *****
  • Vendors
  • Posts: 2,167
  • Joined: 09 Aug 2016
  • Loc: Ellensburg, WA, USA

Posted 18 May 2021 - 02:45 PM

What does it mean when you say control four setups? Do you mean synchronized in some way, or are they all doing something different?

#13 amajed172

amajed172

    Vostok 1

  • -----
  • topic starter
  • Posts: 153
  • Joined: 09 Jul 2018
  • Loc: Saudi Arabia, Riyadh.

Posted 19 May 2021 - 01:04 AM

Thanks everyone, most of you think I should get a NUC for each setup. This seems like the best option and I think I'll go with it.

 

555aaa, on 18 May 2021 - 02:45 PM, said: [quoted post #12 above]

They are doing different things; each setup will be aimed at a different target doing different stuff.


  • HxPI likes this

#14 astrokeith

astrokeith

    Apollo

  • -----
  • Posts: 1,091
  • Joined: 14 Mar 2010
  • Loc: Surrey, UK

Posted 19 May 2021 - 04:11 AM

I agree with having a machine at each mount. Anything else will end in tears I'm sure!

 

I have a NUC setup, but also many Raspberry Pis. I'd consider, or perhaps try, an RPi; I find them perfect as controllers/servers. The final choice will be heavily determined by the specific hardware you have and what programs you want to run.


  • DuncanM likes this

#15 555aaa

555aaa

    Vendor (Xerxes Scientific)

  • *****
  • Vendors
  • Posts: 2,167
  • Joined: 09 Aug 2016
  • Loc: Ellensburg, WA, USA

Posted 19 May 2021 - 08:57 AM

The distance from the imaging camera to the computer can only be a couple of meters, so that drives you to a dedicated computer per mount. In the four-camera rig in my avatar they all go to one computer, but we ended up having to add a second computer just for focusing, of all things.

#16 DuncanM

DuncanM

    Soyuz

  • *****
  • Posts: 3,571
  • Joined: 03 Nov 2009
  • Loc: Arizona Sky Village or the rain forest

Posted 19 May 2021 - 04:21 PM

amajed172, on 26 April 2021 - 07:03 AM, said: [quoted post #1 above]

Use a low-cost laptop at each mount and then control all four via Chrome Remote Desktop (CRD) from a central workstation. I use Astroart 7 to automate each imaging setup and then control/monitor them from my family room workstation's displays (a 55-inch 4K TV and a 1080p projector). Laptops/netbooks are desirable for debugging and tuning the mount, scope, and cameras, as this often requires hands-on work at the mount while watching a display. CRD from a workstation will be by far the easiest and most reliable way to control all 4 systems:

 

[Attached image: CRD dual control]

Edited by DuncanM, 19 May 2021 - 04:24 PM.
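Whichever remote-control route you pick, a short script on the central workstation can confirm each rig PC is answering on its remote-desktop port before the session starts. A minimal sketch; the host names and port below are placeholders, not real machines:

```python
import socket

# Quick health check: can we open a TCP connection to the remote-control
# port on each rig PC? Hosts and the RDP port (3389) are illustrative.
def reachable(host, port, timeout=2.0):
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:  # refused, unreachable, or timed out
        return False

rigs = {"rig-a.local": 3389, "rig-b.local": 3389,
        "rig-c.local": 3389, "rig-d.local": 3389}
# status = {host: reachable(host, port) for host, port in rigs.items()}
```

Run from the family-room workstation before dark, this tells you immediately which of the four machines needs attention, rather than finding out mid-session.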



Cloudy Nights LLC
Cloudy Nights Sponsor: Astronomics