Initial tests -- VR headset and camera HDMI out


#1 poserp (topic starter, Sputnik, Posts: 26, Joined: 30 Sep 2020)

Posted 29 October 2020 - 01:17 AM

This, IMO, makes all the VR stuff actually worth it. HOLY CRAP, that was cool! Equipment used:

Playstation VR headset

Fuji X-T3

rechargeable battery pack

 

And that's it. Just hooked the X-T3 HDMI out into the box that comes with the headset, turned it on, put on the headset and... Yeah! That rocked. Used a couple of different lenses, found generally that for this set-up the best visuals happen with f/4 and faster lenses (for star observations, at least). I'm gonna go try and see what I can see 'cause Orion should be up about now, but I wanted to come inside and report back because that was really cool. I was just holding the camera, no tripod or anything. I also did a session with SharpCap -- that was neat, but the HDMI straight out of the camera is way better w.r.t. quality (at least with this particular set-up).

If nothing else, this is THE WAY to get focus with a DSLR/Mirrorless IMO. I will put in much more time on this tomorrow night, but it was really easy to see when things were in focus as the screen in the headset is very large. 10000x improvement over using the LCD on the camera.

So, to camera makers everywhere... I have a bunch of ideas on how to make this system really rock. "Clean" HDMI output is essential. Wireless HDMI transmission (a la drones, depending on the resolution that's possible) would be very cool with a headset that could receive it. 4K resolution would be absolutely bonkers (1080p is the resolution of the PlayStation VR headset). You just plug a headset into the camera, and/or use wifi, and/or use some other thing, and away you go. It's fun to just pan the camera around and see what's up there. I could almost see the blue in the Pleiades (I have a huge Tokina AT-X that I'm gonna try tomorrow night, as that's 300mm and f/2.8).

Anyways, more to come, but just try this. If a phone/iPod thing had HDMI input you could use the cheaper VR goggles. Next on my list is true 3D "binoculars" with two ZWO cameras, lenses, and whatever string and duct tape is required to get it to work. I think it'll be pretty darn cool once everything is hooked together (assuming I can sell stuff so I don't run out of money and/or time). I have an iEXOS-100... If one of these VR displays could hook up to that with ExploreStars, and you could use the controllers to move the mount, and somehow you could mix that w/the HDMI output from the camera (you can do this on a laptop, of course, but it'd be cool if it could be done with just the goggles, the mount, and a camera)...

Time to go have some more fun.


  • GilATM, S1mas, jeremiah2229 and 1 other like this

#2 jeremiah2229 (Surveyor 1, Posts: 1,699, Joined: 25 Jul 2015, Loc: Illinois, USA N 37° W 89°)

Posted 29 October 2020 - 01:28 AM

Please post some pictures as you progress.

 

 

Thanks...



#3 poserp (topic starter, Sputnik, Posts: 26, Joined: 30 Sep 2020)

Posted 29 October 2020 - 02:59 AM

Kinda hard, in that the experience is a "live" thing. I saw color in Orion in real-time... Not high-res photo-quality color or anything like that, but as I fine-tune this I think I can get closer to that point. It was awesome. I'm in Salt Lake City, Utah, and tomorrow I'll head out to my undisclosed observation location where there are class 2 skies... Even with the almost-full moon, this is gonna be fun. I experimented with using the iEXOS mount as essentially a huge gimbal to support the Tokina AT-X (couldn't wait)... Aside from not being able to find the thing I was looking for, it worked great to stabilize the camera so I could observe for longer periods. Of course, turning the mount on and setting it up should be a blast, which is what I'm gonna do tomorrow night. Do one of the tours that are programmed into the mount, probably with this same lens, and just sit back and watch all the fun happen...


Edited by poserp, 29 October 2020 - 03:00 AM.


#4 Binofrac (Vostok 1, Posts: 135, Joined: 11 Jul 2019, Loc: Kent, UK)

Posted 29 October 2020 - 04:14 AM

Excellent work, Poserp. I'm purely a visual observer but have often wondered if it was possible to connect a camera directly to a VR headset and get a reasonable view. Cameras are getting better all the time and I imagine that at some point they may replace monochrome night vision systems. It would be great to have a set of VR 'binoculars' that could have a useful zoom, be electronically stabilised and annotate objects within the view. They could even show colour and have arrows to guide you to objects as required. A sort of real view with the functionality of Celestron's SkyPortal. Watching a view of the sky speeded up slightly to show the motions of the heavens would be awe-inspiring (if it didn't make you seasick).

 

A while ago, after seeing a homemade cardboard VR headset with a smartphone, it seemed like it would be possible to make a version that superimposed the phone's screen on a real view. The phone would show a planetarium program that would then show annotations on the night sky. The search functions would be ideal. After a few experiments it soon became apparent that this cheap version would suffer because the phone's orientation sensors aren't that great. I've seen Universe2go advertised, which is probably much the same thing.



#5 poserp (topic starter, Sputnik, Posts: 26, Joined: 30 Sep 2020)

Posted 29 October 2020 - 09:59 AM

So what I'm going to try is one of these:

https://www.amazon.c...03983143&sr=8-3

Laptop and camera on input, output goes to the headset. That way I can have ExploreStars running on the laptop and I can switch between that and the live feed from the camera. You could also use something like that with a tablet, but in this case, since I'll have the goggles on, I think I'd want to use the laptop because I can already find keys and such without having to look at the keyboard. A video mixer could also be used to fade between the two sources, although that costs more. But you can use it to superimpose one image on another, so if you were running, say, kstars, you could superimpose that over the live view, then zoom in/out to match what you're seeing. Hmm.... Time to save some pennies and/or sell some stuff, I think.

$329 vs $9, but it would get the job done (I think):

https://www.amazon.c...nt-items&sr=1-1
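
The superimpose part could also be faked in software instead of with a hardware mixer. A rough sketch with OpenCV, assuming the camera's HDMI comes into the laptop through a cheap HDMI-to-USB capture dongle (device index 0) and the chart is just a saved kstars screenshot -- both of those are placeholders, not anything I've wired up yet:

    import cv2

    ALPHA = 0.35                                # how strongly the chart is mixed over the live view

    live = cv2.VideoCapture(0)                  # placeholder index for an HDMI-to-USB capture dongle
    chart = cv2.imread("kstars_screenshot.png") # placeholder: a saved screenshot of the planetarium app

    while True:
        ok, frame = live.read()
        if not ok:
            break
        overlay = cv2.resize(chart, (frame.shape[1], frame.shape[0]))  # match sizes before blending
        mixed = cv2.addWeighted(frame, 1.0, overlay, ALPHA, 0)
        cv2.imshow("live + chart", mixed)
        if cv2.waitKey(1) & 0xFF == 27:         # Esc quits
            break

    live.release()
    cv2.destroyAllWindows()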


Edited by poserp, 29 October 2020 - 10:02 AM.


#6 AttercopSmaug (Lift Off, Posts: 8, Joined: 22 Oct 2020)

Posted 29 October 2020 - 10:01 AM

Wait, what! I have an Oculus Rift S -- how and what! This sounds groundbreaking. I've had this idea, but for pictures; you're using it for focusing and observing. I assume you're using the live view HDMI from your DSLR and then connecting it to your laptop?

#7 poserp (topic starter, Sputnik, Posts: 26, Joined: 30 Sep 2020)

Posted 29 October 2020 - 10:58 AM

Wait, what! I have an Oculus Rift S -- how and what! This sounds groundbreaking. I've had this idea, but for pictures; you're using it for focusing and observing. I assume you're using the live view HDMI from your DSLR and then connecting it to your laptop?

YUP! I'm not sure if those goggles have an HDMI input or not. The video quality is way better direct from the camera than from the laptop through an HDMI dongle, or at least that's the current state of affairs as I'm hacking this together. One thing I can do is view all of the camera's screens in the goggles, so I can set this up such that I switch from ExploreStars (or whatever application) on a laptop/iPad/etc. to the camera view -- I goto a target, then switch to the camera view. On the Fuji there are nice dedicated knobs for everything, so I can set up ISO, exposure time, interval shooting, all of that stuff, directly with the goggles on and feel my way around on the camera to change settings. The way Fujis work, you get "clean" HDMI output when the camera is in video mode. It also has a nice "panoramic" mode, which I never use, BUT it puts crosshairs smack dab in the middle of the display overlaid with a live view of what the camera is seeing... I can use that when doing star alignment to ensure whatever I'm viewing is centered properly.

For camera-makers -- it'd be cool if a camera had a display like the reticle in a polarscope. In-camera GPS + reticle. I'm thinking a camera could have a full-on "astronomy" mode on the dial, complete with adjustable real-time image stacking and plate solving (put a star DB on the flash card...), where you can adjust how many frames are stacked, SO... if I'm shooting at 60 FPS, I can say "stack 10 frames" and I get a 6 FPS output of 10-frame stacks. Or any number. Or just a cumulative live stack with alignment. SharpCap on-camera. MAKE IT SO. ST-4 port? There are so many small-ish things one could add to a DSLR/mirrorless to make this next-gen...
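
To make that stacking math concrete, here's a rough sketch of the "stack every N frames" idea done on a laptop rather than in-camera. It's a plain average with no alignment (so it assumes the mount is tracking), and device index 0 is a placeholder for an HDMI-to-USB capture dongle:

    import cv2
    import numpy as np

    N = 10                        # frames per stack: 60 fps in -> 6 fps of stacked output
    cap = cv2.VideoCapture(0)     # placeholder index for an HDMI-to-USB capture dongle

    acc, count = None, 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        f = frame.astype(np.float32)
        acc = f if acc is None else acc + f   # accumulate N frames
        count += 1
        if count == N:
            stacked = (acc / N).clip(0, 255).astype(np.uint8)
            cv2.imshow("stacked", stacked)
            acc, count = None, 0
        if cv2.waitKey(1) & 0xFF == 27:       # Esc quits
            break

    cap.release()
    cv2.destroyAllWindows()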

For now, next up is to get HDMI switching and figure out the best way to use ExploreStars -- maybe an iPad will be cool 'cause I can move the mount with a finger on the touchscreen and it's physically easier to carry around. I have all the auxiliary gear in a backpack ATM; last night I shoved it all in there so I could move around wherever and just point the camera at stuff... Here is a use for those fast zoom lenses that may not take the best astro stills but could work GREAT for this. I'm also thinking cameras like the Sony RX10-MK_ could be cool because you have that crazy zoom range and it can be remote controlled...

 


  • S1mas likes this

#8 poserp (topic starter, Sputnik, Posts: 26, Joined: 30 Sep 2020)

Posted 29 October 2020 - 12:40 PM

OOOH, just had an idea. There's lots of monitors that are about the same size as a mobile phone... One of those + the cheaper VR glasses that hold a phone... So many things to try, not enough time. I think there might be a few with 4k screens.

EDIT: For under $100, you can get a 1440 X 2560 display kit:

https://www.amazon.c...0?ie=UTF8&psc=1

And a Google Cardboard kit:

https://www.amazon.c...0?ie=UTF8&psc=1

So I am ordering this; it will arrive Saturday. I went with this because all the self-contained 5.5" displays were 1080p and I want to see how well it works. Also, it does exactly what I want (display an HDMI signal) and pretty much nothing more. I can chop up the cardboard for cabling and stuff too.

EDIT: I'll put build instructions here if/when I get this all sorted out. Name for project: ASTROXXORS!


Edited by poserp, 29 October 2020 - 01:17 PM.

  • S1mas likes this

#9 poserp (topic starter, Sputnik, Posts: 26, Joined: 30 Sep 2020)

Posted 29 October 2020 - 01:47 PM

Design brief for the pie-in-the-sky ASTROXXORS ULTIMATE ROCK:

Head-mounted dual-camera display with zoom and goto capability. Plate solving, live image stacking, all self-contained. Raspberry Pi 4, high-res display, two mini HDMI cameras (need to figure this part out, if possible) or bring-your-own camera(s), 3D stereoscopic view stitched together in real-time via software. Crosshairs and other overlays to mix with the input from the camera HDMI signals. Real-time star remover/reducer. Other features TBD. Also, onboard mount control, goto, and planetarium overlays/screens.

There's quite a bit of this that can be done right now; only a few bits (the camera/lens combo is the main issue) need to be worked out.
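
The real-time stereo stitching is probably the easiest bit of that list in software. A minimal sketch, assuming the two cameras show up as ordinary video devices (indices 0 and 1 are placeholders; the ZWOs would more likely come in through their native drivers or SharpCap):

    import cv2
    import numpy as np

    left = cv2.VideoCapture(0)    # placeholder device indices for the two cameras
    right = cv2.VideoCapture(1)

    EYE_W, EYE_H = 960, 1080      # each eye gets half of a 1920x1080 headset screen

    while True:
        ok_l, fl = left.read()
        ok_r, fr = right.read()
        if not (ok_l and ok_r):
            break
        fl = cv2.resize(fl, (EYE_W, EYE_H))
        fr = cv2.resize(fr, (EYE_W, EYE_H))
        sbs = np.hstack([fl, fr])             # side-by-side frame for the headset display
        cv2.imshow("stereo", sbs)
        if cv2.waitKey(1) & 0xFF == 27:       # Esc quits
            break

    left.release()
    right.release()
    cv2.destroyAllWindows()

The hard parts are everything around that: matching the two lenses, the overlays, and the stacking.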


  • S1mas likes this

#10 futuneral (Apollo, Posts: 1,371, Joined: 27 Dec 2014, Loc: Phoenix, AZ)

Posted 29 October 2020 - 02:36 PM

Now just need to feed the head tracking data back into the mount and make it point to where you're looking and you have a killer EAA setup!


  • S1mas likes this

#11 poserp (topic starter, Sputnik, Posts: 26, Joined: 30 Sep 2020)

Posted 29 October 2020 - 04:14 PM

Now just need to feed the head tracking data back into the mount and make it point to where you're looking and you have a killer EAA setup!

Yes, this is part of my mega-master plan... I really want this to be AR, where it projects onto glasses, rather than VR, so you can "zoom" on the universe in real-time and stuff.
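
Sketching the mount half of that: once the headset's yaw/pitch can be read out somehow, pushing it into an ASCOM-driven mount is only a few lines on Windows. This assumes the ASCOM platform is installed; get_head_orientation() and the simulator driver ID are placeholders for the real headset SDK and mount driver:

    import time
    import win32com.client            # pip install pywin32; Windows + ASCOM platform assumed

    def get_head_orientation():
        """Placeholder for whatever the headset SDK reports: (azimuth_deg, altitude_deg)."""
        return 180.0, 45.0

    scope = win32com.client.Dispatch("ASCOM.Simulator.Telescope")   # swap in the real driver ID
    scope.Connected = True
    scope.Tracking = False            # ASCOM alt-az slews generally want tracking off

    while True:
        az, alt = get_head_orientation()
        if alt > 0 and not scope.Slewing:
            scope.SlewToAltAzAsync(az, alt)   # chase wherever the user is looking
        time.sleep(0.5)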



#12 poserp (topic starter, Sputnik, Posts: 26, Joined: 30 Sep 2020)

Posted 29 October 2020 - 04:52 PM

CYCLOPS. Be afraid, or something:

 

[Attached image: Cyclops]

This is what I'm going to attempt to use tonight, in whole or in part (or whatever actually ends up working in the field). Goin' for broke, testing out a bunch of stuff all at the same time. About time to head out and nerd out.



#13 nic35 (Apollo, Posts: 1,083, Joined: 08 Sep 2007)

Posted 30 October 2020 - 09:34 PM

This is so cool.  Here's some random thoughts for a possible next step.

 

The Oculus Quest ( maybe the Go also ? ) can stream wirelessly from a pc to the headset using Virtual Desktop Streamer (https://www.androidc...ur-oculus-quest)

 

I currently run an on-mount Intel Stick, with the mount, camera, e-focuser, e-filter wheel, and guide camera connected to the PC through a powered USB 3.0 hub. The on-mount PC is connected to a stand-alone 5 GHz wi-fi router. I connect my Surface PC to the same router and use it and Windows Remote Desktop to control the on-mount PC. The only real functionality provided by the Surface PC is the mouse and a display that stays inside, warm (winter) and bug-free (summer).

 

Soooo.... if I can find a wireless USB mouse with decent range (I suspect it would need to be tied into the USB hub on the mount) then I can run everything on the on-mount PC and watch on the Quest. Maybe I can use my active USB extension cables to get the mouse dongle close to me! They haven't been used since I went wireless.

 

I run SharpCap to control the camera (and most everything else through Sharpcap via ASCOM interfaces) and since SC supports zooming, live stacking, and plate solving, it provides all the functionality wished for above - except for the bino-view. 

 

The ultimate solution would, of course, be to use the Quest hand controllers to control the on-mount PC.

 

FWIW, the new Quest 2 also supports a cabled connection to a PC, via USB cable. 

 

There's got to be a problem somewhere with this configuration.  

 

Keep on experimenting! Unfortunately, I'll be headed off for knee surgery soon, so I won't get a chance to try this for some time.

 

john


Edited by nic35, 30 October 2020 - 09:35 PM.

  • S1mas and poserp like this

#14 poserp (topic starter, Sputnik, Posts: 26, Joined: 30 Sep 2020)

Posted 31 October 2020 - 03:52 PM

This is so cool.  Here's some random thoughts for a possible next step.

 

The Oculus Quest ( maybe the Go also ? ) can stream wirelessly from a pc to the headset using Virtual Desktop Streamer (https://www.androidc...ur-oculus-quest)

 

I currently run an on-mount Intel Stick, with the mount, camera, e-focuser, e-filter wheel, and guide camera connected to the PC through a powered USB 3.0 hub. The on-mount PC is connected to a stand-alone 5 GHz wi-fi router. I connect my Surface PC to the same router and use it and Windows Remote Desktop to control the on-mount PC. The only real functionality provided by the Surface PC is the mouse and a display that stays inside, warm (winter) and bug-free (summer).

 

Soooo.... if I can find a wireless USB mouse with decent range (I suspect it would need to be tied into the USB hub on the mount) then I can run everything on the on-mount PC and watch on the Quest. Maybe I can use my active USB extension cables to get the mouse dongle close to me! They haven't been used since I went wireless.

 

I run SharpCap to control the camera (and most everything else through Sharpcap via ASCOM interfaces) and since SC supports zooming, live stacking, and plate solving, it provides all the functionality wished for above - except for the bino-view. 

 

The ultimate solution would, of course, be to use the Quest hand controllers to control the on-mount PC.

 

FWIW, the new Quest 2 also supports a cabled connection to a PC, via USB cable. 

 

There's got to be a problem somewhere with this configuration.  

 

Keep on experimenting! Unfortunately, I'll be headed off for knee surgery soon, so I won't get a chance to try this for some time.

 

john

I hope your surgery goes well!


Here's what I noted after going out with the kit I posted above:

Tactile controllers are key -- be it a laptop with keys, or a hand controller, or something similar, since you're "in" the goggles and can't see stuff around you (especially out in the field). I don't know which VR headsets have cameras that show the outside world, but those would be potentially useful here if you're using a tablet of some kind to control stuff. Surfing Amazon, there are controllers for security cameras with joysticks to move the camera; something like that combined with the mount would be good for observing and controlling the mount's position.
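
For the joystick idea, a rough sketch of mapping a plain game controller to mount motion through ASCOM's MoveAxis (Windows, the ASCOM platform, and pygame assumed; the simulator driver ID and the 1 deg/sec rate are placeholders -- a real mount's allowed axis rates should be checked first):

    import time
    import pygame
    import win32com.client       # pip install pywin32; Windows + ASCOM platform assumed

    MAX_RATE = 1.0               # deg/sec placeholder; keep inside the mount's supported axis rates

    pygame.init()
    pygame.joystick.init()
    stick = pygame.joystick.Joystick(0)
    stick.init()

    scope = win32com.client.Dispatch("ASCOM.Simulator.Telescope")   # swap in the real driver ID
    scope.Connected = True

    try:
        while True:
            pygame.event.pump()
            x = stick.get_axis(0)                 # -1..1 from the stick
            y = stick.get_axis(1)
            scope.MoveAxis(0, x * MAX_RATE)       # axis 0 = primary (RA/azimuth)
            scope.MoveAxis(1, -y * MAX_RATE)      # axis 1 = secondary (dec/altitude)
            time.sleep(0.1)
    finally:
        scope.MoveAxis(0, 0.0)                    # stop both axes on the way out
        scope.MoveAxis(1, 0.0)
        scope.Connected = False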

What I ended up doing was futzing with the mount (basically that lens is too heavy for normal camera mount screws -- it was flopping around on the mount and thus I couldn't establish good tracking), but I did do lots of short exposures. I can view all the screens on my camera through the headset when connected, so I set up the intervalometer on-camera to do a bunch of short-ish exposures. This was kind of like stacking, but it'd be better if the camera did the stacking internally so you can watch it happen while it's taking frames. The mount is pretty new to me, so I spent some time figuring out how it works. Views were excellent, though with my particular copy of this lens I think it's best for EAA rather than astrophotography. These lenses are great generally; however, I got mine cheap and found out why later -- a previous owner took out the iris and put a T-2 mount on the back (the lens is/was M42). Anyways, the threading is off when using an M42 adapter, which throws off the lens a bit, I think, and I got some fairly bad coma in photos. It's not as noticeable when observing with the lens, though.

The parts for my ASTROXXORS home-brewed VR goggles arrive today, so I'll be assembling and testing them tonight/tomorrow. I'll post pics of the build and report back on whether or not it actually works. My premise is that the Google Cardboard goggles will fit a 4" - 6" phone, so a simple display should fit in there too. I'll have to figure out where and how to mount the other components, and probably do some surgery on the cardboard to add ports for the HDMI connector.

 


  • jeremiah2229 likes this

#15 poserp (topic starter, Sputnik, Posts: 26, Joined: 30 Sep 2020)

Posted 31 October 2020 - 10:28 PM

O.k., so I got my screen kit and the cardboard VR goggles. I've assembled the screen and it powers on and works fine. Hooked it up to my laptop, and success, it shows up as a second display. Now, when viewing VR content through the goggles, it needs to be in side-by-side (SBS) format -- you can't just put a display in the goggles and have it work. So, I finally (after much googling) found this:

https://github.com/PaysPlat/DesktopSbS

You install it and it'll take any display attached to your computer and make it into an SBS display. Going to test this out in a few minutes w/the display in the goggles. I'd like to do away with needing a laptop to view the camera's display, but for now I think this might work.

EDIT: It works, or well enough for a v.00001 alpha gaffer-tape-and-glue version at least. Cool, that means I can mount this inside a "real" headset that holds a phone, preferably something with better optics. However, before I get there, I need to figure out how to do the SBS in software/hardware on the headset and not on a laptop.


Edited by poserp, 31 October 2020 - 11:09 PM.


#16 poserp (topic starter, Sputnik, Posts: 26, Joined: 30 Sep 2020)

Posted 02 November 2020 - 04:36 PM

The SBS software is open-source, and .NET stuff can, in theory at least, run on a Raspberry Pi or something similar with .NET Core, so I'm going to see about porting this to the Pi and using the Pi with an HDMI capture dongle on input. This would work somewhat like the PlayStation VR glasses. The main advantages this could have over those are 1.) resolution and 2.) cost, but at this point it's very much DIY. IF you have goggles already that accept HDMI input, that is definitely the easier path here. However, I'd like to have a single Raspberry Pi box that can run all the usual things (i.e. image stacking, plate solving, mount control, etc.) _and_ provide the SBS display output for the goggles. So in the field you'd have your camera (optionally on a mount), a Raspberry Pi-sized box, and some goggles, and that's it. It can be as simple or complex as you want to make it, but the aim here is to first support the simplest of set-ups.
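
In the meantime, the core of what that SBS conversion needs to do can be sketched directly in Python on the Pi, assuming the HDMI signal arrives through a UVC capture dongle (the device index and screen size are placeholders). Note this skips the per-eye barrel pre-distortion a proper headset view would want:

    import cv2
    import numpy as np

    cap = cv2.VideoCapture(0)        # placeholder: UVC HDMI capture dongle, e.g. /dev/video0

    EYE_W, EYE_H = 960, 1080         # half of a 1920x1080 goggle screen per eye

    cv2.namedWindow("sbs", cv2.WINDOW_NORMAL)
    cv2.setWindowProperty("sbs", cv2.WND_PROP_FULLSCREEN, cv2.WINDOW_FULLSCREEN)

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        eye = cv2.resize(frame, (EYE_W, EYE_H))
        cv2.imshow("sbs", np.hstack([eye, eye]))   # same image to both eyes = flat 2D view
        if cv2.waitKey(1) & 0xFF == 27:            # Esc quits
            break

    cap.release()
    cv2.destroyAllWindows()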

The other thing I want to do with the software is get a "real" VR display. The SBS display currently looks kinda like a parabolic mirror when I view it through my goggles. This may be user error, and I'm not out of stuff to mess with, but I'd like a nice, flat view across the whole field of vision. I personally don't find the current view all that distracting, but I'd like to make this as close to awesome as possible...

On an ancillary note, I'm going to test a hand-held gimbal as a sort of pan/tilt control. The gimbal I have available includes a handy x/y controller on the handle, and the handle also has a tripod thread. So I can sit in a chair w/goggles on and use the pan/tilt to control pointing. An alt-az mount w/hand controller could work too, of course, as could any other mount. The gimbal is kinda nice because it has built-in tripod legs, so I can throw that, a camera, the goggles, and the theoretical Raspberry Pi in a backpack and go to a park. Set this up on a picnic table and go to town...

One thing my X-T3 has is "zoom" for focusing. Basically you press a button/dial, and the view will zoom in to a section to allow you to focus better, ostensibly with the built-in LCD screen on the camera. Of course, for the goggles this is great because you have an electronic "zoom" built-in that can work kinda like a barlow. In my initial tests there was no loss in resolution as the output on the camera is 4k and it's already being downsampled to 1080p by the dongle I'm using to hook the hdmi out to my laptop. I'ma test this with planetary viewing later this week when I have the camera on a mount and can track planets. I have a nice Soligor 450 f/8 that's my primary "planet" lens and it ought to work fairly well for this. Will test with the Moon first, which I think will be pretty cool.
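
For feeds that don't have that button (the ZWOs, or anything coming straight off a capture dongle), the same punch-in is easy to fake in software. A rough sketch of a center-crop digital zoom on a frame:

    import cv2

    def digital_zoom(frame, factor):
        """Center-crop by `factor` and scale back up to the original size."""
        h, w = frame.shape[:2]
        ch, cw = int(h / factor), int(w / factor)
        y0, x0 = (h - ch) // 2, (w - cw) // 2
        crop = frame[y0:y0 + ch, x0:x0 + cw]
        return cv2.resize(crop, (w, h), interpolation=cv2.INTER_NEAREST)

    # e.g. zoomed = digital_zoom(frame, 4.0) for a 4x punch-in while focusing

Unlike the in-camera zoom, this can't pull out detail that isn't already in the downsampled feed, but it's plenty for judging focus.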
 


Edited by poserp, 02 November 2020 - 04:38 PM.

  • S1mas likes this

#17 poserp (topic starter, Sputnik, Posts: 26, Joined: 30 Sep 2020)

Posted 02 November 2020 - 04:43 PM

One neat thing about my particular camera is some of the built-in filters:

http://fujifilm-dsc....lter/index.html

Included in this list are filters that will only show objects of a certain color... I'm curious to try this with, say, the Orion nebula and see if it will show only the "red" parts of the nebula in color as a way to highlight the different parts of it. It'd be cool if that could be assigned to some sort of dial marked in nanometers so you could select specific wavelengths of light in-camera...
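
Something like those filters can also be approximated downstream of the HDMI feed. A rough sketch that keeps only the reddish hues in a frame and dims everything else (the hue/saturation limits are pure guesses to tune by eye, not anything calibrated to real emission lines):

    import cv2
    import numpy as np

    def red_only(frame, background=0.15):
        """Keep reddish pixels at full brightness, dim the rest of the frame."""
        hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
        # red wraps around the hue axis in OpenCV (0-180), so two ranges are needed
        mask = cv2.inRange(hsv, (0, 60, 40), (10, 255, 255)) | \
               cv2.inRange(hsv, (170, 60, 40), (180, 255, 255))
        dimmed = (frame * background).astype(np.uint8)
        return np.where(mask[..., None] > 0, frame, dimmed)

    # e.g. cv2.imshow("red parts only", red_only(frame))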



#18 poserp (topic starter, Sputnik, Posts: 26, Joined: 30 Sep 2020)

Posted 02 November 2020 - 07:12 PM

Going to cry uncle a bit and get this for parallel development:

https://www.amazon.c...0?ie=UTF8&psc=1

It already runs on Android OS, so I'ma cross my fingers and see if it'll load something like ExploreStars directly. The display is amazing, especially for the price. It has built-in battery, HDMI port on the side, and a bunch of other things that might make enough time/sweat sense to go with this as a "base" for further development. If nothing else, I can sell the PS2 headset and offset most of the cost. At $399 with a native 4k display, it's really hard to beat with something homebuilt in terms of the screen. We shall see how it goes...

So, after further thought I think this headset will be for primary viewing with my camera. The display I just built via DIY will be the testbed for "stereo" imaging using two cameras and two lenses, essentially as large binoculars. I think I can hack this together just by putting the view from each camera/lens side-by-side on a screen, then using the headset to view that screen over HDMI. Anyways, we shall see...


Edited by poserp, 02 November 2020 - 09:32 PM.

  • S1mas likes this

#19 futuneral (Apollo, Posts: 1,371, Joined: 27 Dec 2014, Loc: Phoenix, AZ)

Posted 03 November 2020 - 12:22 AM

Not to throw a wrench into this whole thing, but are you sure the camera->HDMI route is the right one? I believe this could limit the applicability to a very niche set of devices (both cameras and goggles) but also may not be the best experience-wise. 

Wouldn't a better long-term approach be to fetch data from the camera via some common drivers (ASCOM or INDI) and then stream images to the headset? Not only would this open things up to more devices, it could also provide more flexible configurations.

It could be as simple as Camera->Raspberry Zero W->Web server ==wifi==> Android device (with a browser) in a cardboard-like headset.

Or as sophisticated as Camera->PC(or raspi)->Custom streamer app ===wifi===> Oculus/Vive/PSVR headset with a custom 360 degree full VR experience app with head tracking and controllers.
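
To put some code behind the "simple" end of that, the standard Flask MJPEG-streaming pattern is about all the Pi side needs (the device index, port, and frame size are whatever your setup wants; a Pi Zero will probably need the frames downsized):

    import cv2
    from flask import Flask, Response

    app = Flask(__name__)
    cap = cv2.VideoCapture(0)          # camera or HDMI capture dongle on the Pi

    def frames():
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            ok, jpg = cv2.imencode(".jpg", frame)
            if not ok:
                continue
            yield (b"--frame\r\n"
                   b"Content-Type: image/jpeg\r\n\r\n" + jpg.tobytes() + b"\r\n")

    @app.route("/")
    def stream():
        return Response(frames(),
                        mimetype="multipart/x-mixed-replace; boundary=frame")

    if __name__ == "__main__":
        # point the phone's browser at http://<pi-address>:8000/ inside a cardboard-style viewer
        app.run(host="0.0.0.0", port=8000)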

Your approach is great as a proof of concept, but looks like it's becoming a bit expensive (both in time and money).



#20 poserp (topic starter, Sputnik, Posts: 26, Joined: 30 Sep 2020)

Posted 05 November 2020 - 08:06 PM

Not to throw a wrench into this whole thing, but are you sure the camera->HDMI route is the right one? I believe this could limit the applicability to a very niche set of devices (both cameras and goggles) but also may not be the best experience-wise. 

Wouldn't a better long-term approach be to fetch data from the camera via some common drivers (ASCOM or INDI) and then stream images to the headset? Not only would this open things up to more devices, it could also provide more flexible configurations.

It could be as simple as Camera->Raspberry Zero W->Web server ==wifi==> Android device (with a browser) in a cardboard-like headset.

Or as sophisticated as Camera->PC(or raspi)->Custom streamer app ===wifi===> Oculus/Vive/PSVR headset with a custom 360 degree full VR experience app with head tracking and controllers.

Your approach is great as a proof of concept, but looks like it's becoming a bit expensive (both in time and money).

Yes to all of that, of course, but yeah, this is POC and I'm selling stuff I don't need to fund it. So far there are two different paths:

1.) high-quality VR goggles - camera HDMI out. This is the first thing I've tried, I want to make it as simple as possible and see what can be done. Two devices, maybe a tripod, and that's it. It costs more money if you don't have a camera with HDMI out, but if you already have that then what's the quickest path? Just want to see what can be done. So far, I've found it's most useful for checking focus and live ad-hoc "point the camera at the sky" sort of observing.

2.) stereo EAA w/stacking. I just got all the bits I need for this -- two lenses, two of the ZWO 1.2MP color cameras, SharpCap, and cheap build-yourself goggles with a screen. The kit looks like this:

 

[Attached images: IMG 1588, IMG 1587]


I've tried it indoors and it works great -- no need to futz around with SBS; I literally open two instances of SharpCap and size and align the images on the screen in the goggles so that everything looks nice and 3D. It takes some focusing/zooming/messing with settings to get it right, and I'll see in a couple of hours how much of a pain it is to do this outside while looking at stars. Early-stage POC, but so far everything works as planned.

Of course at some point both of these need to be "hardened" so that everything works well together, there isn't extraneous equipment, and there's some general idea (at least) of cost. The two lenses I'm using are Fujinon TV zooms; each was about $60 on eBay. They're already C-mount, so adapting to the ZWO is straightforward (just need to add a CS-to-C spacer ring). They are zoomable, with a range from 12.5 to 75 mm. The lenses are f/1.2, so plenty bright, and with a 2x teleconverter that drops to f/2.4 with a 150mm focal length at the long end. For now I'ma hold off on the teleconverter part (I have one, will get a second soon-ish) just to make sure it all works and is usable for viewing sky stuff. My first test tonight is going to be locally in my backyard; next week I'll take it on the road to my "observatory". Also, once I get everything tested and build out the goggles better with proper cabling (mine now are pretty short), I'll mount it on the iEXOS-100 and test that whole thing out. I don't know if it's too wide for that yet, but I'll find out soon.

Basic BOM at the moment w/prices is something like $300 for the cameras, $120 or so for lenses, and $90 for the goggles plus screen. Add laptop to taste. Rigging costs depend on what you use; I think here I'm somewhere in the neighborhood of $100 or so (i.e. all the stuff to mount the lenses together, plus a baseplate to attach it all to the tripod).

Edited by poserp, 05 November 2020 - 08:09 PM.

  • S1mas and futuneral like this

#21 poserp (topic starter, Sputnik, Posts: 26, Joined: 30 Sep 2020)

Posted 06 November 2020 - 12:54 AM

Watching the moonrise in 3D is fun... I mean, you can also do that with regular binoculars. The moon is a good target because it's easier to align the images in the goggles to get a good 3D perspective. There are some high clouds and other things (like all the moonlight) that are making 3D viewing of targets like Andromeda hard at the moment. Also, with these cameras I think darker skies are going to be key. Using SharpCap I can tweak a lot of things, though, so perhaps once I get everything dialed in, this particular combo of camera/lens will work out better. The Fuji X-T3 sensor is a champ when it comes to low light, but that's too expensive an option IMO to double up on DSLRs. I'ma save that for this weekend when I can try hi-res mono viewing with the incoming higher-spec'ed goggles.

There are a couple of things that'd be cool. One would be tactile control over various parameters in SharpCap, and some way to gang that across multiple instances. For instance, I can open two instances of SharpCap, assign each instance its own camera, then match them up by fine-tuning gain/histogram/white balance/etc. Once they're calibrated I'd like to have a knob for shutter speed, one for gain (that gangs the sliders and moves them without changing their relative positions), and perhaps one for frame rate. Oh, and one for zoom level. I'll have to dig into it a bit, but something like MIDI control over those things would work, 'cause then you can use any device that sends MIDI to change them. Or some other similar controller communication protocol, like OSC. Another thing that'd be neat is a way to save relative window sizes and positions. I imagine there's something like this in a third-party window manager of some kind. Of course, having an SBS "view" in SharpCap and the ability to connect to multiple cameras at once would be ideal.

EDIT: I bet I can find a MIDI-to-random-input mapper of some kind out there, and if not for MIDI then probably for OSC. This would be cool too with ExploreStars to assign a physical controller with a joystick to that x/y pad for moving the scope. As I write this I'll bet there's some sort of ASCOM thing for that already...
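
A bare-bones version of that mapper with the mido library (pip install mido python-rtmidi) would look something like this; set_gain() is a placeholder for whatever ends up actually poking the SharpCap instances, and CC#1 plus the gain range are arbitrary:

    import mido   # pip install mido python-rtmidi

    GAIN_MIN, GAIN_MAX = 0, 600        # placeholder range for the ganged gain sliders

    def set_gain(value):
        """Placeholder: forward the value to both SharpCap instances somehow."""
        print("gain ->", value)

    with mido.open_input() as port:    # first available MIDI input device
        for msg in port:
            if msg.type == "control_change" and msg.control == 1:   # CC#1 = mod wheel
                gain = GAIN_MIN + (GAIN_MAX - GAIN_MIN) * msg.value / 127
                set_gain(int(gain))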

EDIT2: A pan/tilt tripod head >>> ball head for moving the "eyes" around, as it maintains the relative positioning of the cameras w.r.t. the roll axis (having one higher/lower than the other throws off how the images line up and you have to move the windows around a bit to re-establish a good "3D" view). The ZWO cameras also need to be just about perfectly aligned in their orientation, as the image rotates when the sensor rotates.


Edited by poserp, 06 November 2020 - 01:55 AM.

  • S1mas likes this

#22 poserp (topic starter, Sputnik, Posts: 26, Joined: 30 Sep 2020)

Posted 06 November 2020 - 07:57 PM

Going to cry uncle a bit and get this for parallel development:

https://www.amazon.c...0?ie=UTF8&psc=1

It already runs on Android OS, so I'ma cross my fingers and see if it'll load something like ExploreStars directly. The display is amazing, especially for the price. It has built-in battery, HDMI port on the side, and a bunch of other things that might make enough time/sweat sense to go with this as a "base" for further development. If nothing else, I can sell the PS2 headset and offset most of the cost. At $399 with a native 4k display, it's really hard to beat with something homebuilt in terms of the screen. We shall see how it goes...

So, after further thought I think this headset will be for primary viewing with my camera. The display I just built via DIY will be the testbed for "stereo" imaging using two cameras and two lenses, essentially as large binoculars. I think I can hack this together just by putting the view from each camera/lens side-by-side on a screen, then using the headset to view that screen over HDMI. Anyways, we shall see...

These goggles, alas, won't work. I didn't realize until I got them that there's no HDMI input (it's a micro USB on the side, for charging). NUTS! Sorry I didn't research that further. I've returned them and will find some other thing to try if I can find something else that can do 4k (and not just 1080p60).


  • jeremiah2229 likes this

#23 poserp (topic starter, Sputnik, Posts: 26, Joined: 30 Sep 2020)

Posted 08 November 2020 - 07:18 PM

Where I'm at I think observation nights will be limited for at least a week or two, so the observation bits of this project will slow down accordingly. I'm not put off by cold, but clouds aren't transparent and/or I'm not cool enough yet to figure out daytime EAA (actually on my long-term todo list...) of anything except the sun and moon.

That being said, if you have an Android phone with a USB-C port, this might be interesting if you want to try this on the cheap. Basically I found this article online about converting a phone into an HDMI monitor with a cheap cable and an adapter dongle:

https://www.diyphoto...n-hdmi-monitor/

This could then be used for either of the things I'm trying. The "view the sky on a large screen" version is actually more difficult, as I've only found one VR headset that doesn't require any software to kinda/sorta view a screen in goggles in 2d and have it look right. I'm half-tempted to work out the optics and lenses for this myself as I have a bunch of random things lying around and I'm not too afraid of the math involved (whether I'm good at it remains to be seen...).

For the 3D case it's straightforward, and basically the same thing I'm doing with my home-built goggles. Two cameras, two lenses, put each feed in its own window and then position on the screen in a headset until things look "3D".

I've found an SDK online for remote control of the Fuji X-T3 that I'm thinking I'm gonna play with to get familiar with what can be done for that scenario. The official Fuji app can't work in conjunction with the HDMI output, so I'm going to run some tests to see what I can figure out. Being able to a.) control the camera remotely while b.) viewing the HDMI output is the key to integrating the two. Since at least some VR headsets run on Android OS, I figure I can dust off my very, very rusty C++ skillz and maybe get something kinda-sorta working on Windows first and then, perhaps, on Android. IF I can figure out the details of how things work, I might bust it out into JavaScript or something similar to make the code portable. Also, I may try to figure out how to do all of this in conjunction with ASCOM. It's a long-term, work-on-it-when-I-don't-hate-it sort of thing... Anyways, I'ma go into coding mode for at least the next week on this and see where things end up.

I'd also like to do some coding around a VR headset "container" window where you can assign HDMI video sources A and B to eyes A and B, and it'll just work. Also, being able to assign the same HDMI video source to A and B for a higher-res 2D view. So, I am off to nerd out.


  • GilATM and S1mas like this

#24 poserp (topic starter, Sputnik, Posts: 26, Joined: 30 Sep 2020)

Posted 09 November 2020 - 12:09 PM

Where I'm at I think observation nights will be limited for at least a week or two, so the observation bits of this project will slow down accordingly. I'm not put off by cold, but clouds aren't transparent and/or I'm not cool enough yet to figure out daytime EAA (actually on my long-term todo list...) of anything except the sun and moon.

That being said, if you have an Android phone with a USB-C port, this might be interesting if you want to try this on the cheap. Basically I found this article online about converting a phone into an HDMI monitor with a cheap cable and an adapter dongle:

https://www.diyphoto...n-hdmi-monitor/

This could then be used for either of the things I'm trying. The "view the sky on a large screen" version is actually more difficult, as I've only found one VR headset that doesn't require any software to kinda/sorta view a screen in goggles in 2d and have it look right. I'm half-tempted to work out the optics and lenses for this myself as I have a bunch of random things lying around and I'm not too afraid of the math involved (whether I'm good at it remains to be seen...).

For the 3D case it's straightforward, and basically the same thing I'm doing with my home-built goggles. Two cameras, two lenses, put each feed in its own window and then position on the screen in a headset until things look "3D".

I've found an SDK online for remote control of the Fuji X-T3 that I'm thinking I'm gonna play with to get familiar with what can be done for that scenario. The official Fuji app can't work in conjunction with the HDMI output, so I'm going to run some tests to see what I can figure out. Being able to a.) control the camera remotely while b.) viewing the HDMI output is the key to integrating the two. Since at least some VR headsets run on Android OS, I figure I can dust off my very, very rusty C++ skillz and maybe get something kinda-sorta working on Windows first and then, perhaps, on Android. IF I can figure out the details of how things work, I might bust it out into JavaScript or something similar to make the code portable. Also, I may try to figure out how to do all of this in conjunction with ASCOM. It's a long-term, work-on-it-when-I-don't-hate-it sort of thing... Anyways, I'ma go into coding mode for at least the next week on this and see where things end up.

I'd also like to do some coding around a VR headset "container" window where you can assign HDMI video sources A and B to eyes A and B, and it'll just work. Also, being able to assign the same HDMI video source to A and B for a higher-res 2D view. So, I am off to nerd out.

So, despite the fun that happens when you try to search "duplicate windows on windows" and similar, I found MurGeeMon, a free utility that lets you do a bunch of stuff. One thing it does (I think...) is clone desktop windows, so I'ma try that with a simple webcam app that works with my Fuji natively via USB-C instead of using the HDMI output. I'll also try it with the HDMI output. The basic goal here is to duplicate the video feed from a camera and assign it to each eye, or duplicate a single SharpCap window for each eye, so that I can do the "2D" version of this on any screen. If it works out well then, at least with a Windows-based laptop and/or tablet, one should be able to do both "2D" and "3D" viewing with a homemade headset that uses a single screen or any of the other single-screen options I've covered in this thread. Will report back on whether it's successful or not. I wonder if it can also save/restore app and window configurations?

 

EDIT: That's available here -- https://murgee.com/


Edited by poserp, 09 November 2020 - 12:15 PM.

  • S1mas likes this

#25 poserp (topic starter, Sputnik, Posts: 26, Joined: 30 Sep 2020)

Posted 16 November 2020 - 05:52 PM

Two quick things:

1.) MurGeeMon isn't working for me yet; perhaps I'm not seeing the thing I need to click. I've also tried a paid package called Actual Multiple Monitors. This is cool 'cause I can "clone" any application window and use it for 2D viewing in a headset. It wasn't made for this purpose, so I can't adjust stuff like barrel distortion and other things to improve the image, but it proves the concept.

2.) I have read that the Oculus Quest 2 will support certain USB-C capture cards like the Elgato Cam Link 4K. So this is the next "platform" I'm going to try. This is nice because it would be headset, dongle, HDMI cable, and camera, and that's it for the most basic set-up, if it works. I don't know how far I'll get, though, especially if I have to be around WIFI just to use the headset, but I'll take the plunge as Amazon returns are very easy to do (I can drop things off at a local store) and it only takes an hour or so to discover any "showstoppers". Also, the Quest 2 can load apps, so there's an actual development path to get stacking/plate solving onto the headset itself. Even with the WIFI verification, if that's an issue, I'll bet I can hack together a work-around to "jailbreak" the headset, but shhh...


Edited by poserp, 16 November 2020 - 05:55 PM.

  • GilATM, nic35 and davidparks like this

