Posted 19 August 2019 - 05:52 PM
Posted 19 August 2019 - 08:20 PM
My wife was good enough to let me replace my astro-use 2006 50" plasma monitor with a 50" 4K TV (my iMac can support 2 of these)...not sure if it would meet your requirements...Pat Utah
Edited by Alien Observatory, 19 August 2019 - 08:21 PM.
Posted 20 August 2019 - 07:00 PM
Anyone using an HDR TV with their EAA setup? I'm using an older flatscreen. Wondering if I should upgrade or if it's money down the drain.
Keep your money for now. Here are some details on what the label 'HDR' actually means right now: not much yet.
#1 'HDR' is a label that is still in the middle of being standardized.
Articles on the web become outdated quickly, so Google searches don't give the best current info.
Interestingly, one common 'standard maker' is emerging, and ironically it's Netflix, at least as far as source content goes.
HLG is HDR 'light', and Dolby Vision is trying hard to stay proprietary.
Do you remember when TVs were called "HD" at 1280x720 resolution, sold right alongside 1920x1080 TVs?
The same thing is happening with real high dynamic range TVs.....
which are currently "only" $10,000 hahahaha.
Consider store-bought 'HDR' TVs the '720p' of old. 'HDR light', if you will.
#2 Not many things output 10-bit, which is necessary for the display (and even your OS) to enter "HDR" mode.
The second thing to consider is whether and how your computer outputs 10-bit at all. NVidia can do it with games,
but only a very recent (this month) driver allows 10-bit output from Photoshop. I doubt SharpCap supports 10-bit output through a 10-bit GPU out to a 10-bit HDR display....
Think of plugging a VCR into a 4K TV: it works, but it doesn't upscale well. Same with HDR. Computers at the moment aren't driving 10-bit for the most part, except for some games and some pro video output cards (like the Blackmagic 4K Mini Monitor with software that supports it)....
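To put rough numbers on why 8-bit vs. 10-bit matters for faint detail, here's a quick Python sketch. It's just the quantization arithmetic on a made-up 1000:1 brightness ramp, nothing EAA- or SharpCap-specific:

```python
# Rough illustration: how many distinct brightness steps survive in the faint
# end of a scene when it is quantized to 8-bit vs 10-bit. Illustrative only.
import numpy as np

# A smooth linear ramp covering a 1000:1 brightness range (faintest = 1/1000 of brightest)
scene = np.linspace(1 / 1000, 1.0, 100_000)

for bits in (8, 10):
    levels = 2**bits - 1
    quantized = np.round(scene * levels)   # what the display pipeline can actually carry
    faint = quantized[scene < 0.01]        # the faintest 1% of the brightness range
    print(f"{bits}-bit: {levels + 1} total levels, "
          f"{len(np.unique(faint))} distinct steps left in the faint end")
```

The faint stuff is where EAA lives, and an 8-bit pipeline throws most of those steps away before the display ever sees them.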
Many consumer "HDR" TV's have brightness settings that allow you to dial the brightness back VERY MUCH!!!! My Samsung 40" (smaller 1/2 pretend HDR I could find) for it's large size can show an image with a softer glow total light than all laptops on their lowest setting....
Not only is this helpful to night vision/ astro neighbors, but turning off sound and a dim backlight allows much less power draw than other TV's/laptops...
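If the "big screen but dimmer than a laptop" part sounds odd: total glow scales roughly with luminance times screen area. Here's a back-of-envelope sketch where the nit values are guesses I picked for illustration, not measurements of my sets:

```python
# Back-of-envelope: total light output scales roughly with luminance x screen area.
# The nit (cd/m^2) values below are illustrative guesses, not measurements.
import math

def screen_area_m2(diagonal_in, aspect=(16, 9)):
    """Screen area in square meters for a given diagonal (inches) and aspect ratio."""
    w, h = aspect
    diag_units = math.hypot(w, h)
    width_m = diagonal_in * w / diag_units * 0.0254
    height_m = diagonal_in * h / diag_units * 0.0254
    return width_m * height_m

# Hypothetical settings: 40" TV with the backlight dialed way down vs a 15.6" laptop at minimum
tv_area, laptop_area = screen_area_m2(40), screen_area_m2(15.6)
tv_nits, laptop_nits = 2, 20   # guesses: many TVs dim lower than a laptop's minimum

# Approximate luminous flux for a Lambertian panel: pi * luminance * area (lumens)
print(f"TV:     ~{math.pi * tv_nits * tv_area:.1f} lm over {tv_area:.2f} m^2")
print(f"Laptop: ~{math.pi * laptop_nits * laptop_area:.1f} lm over {laptop_area:.2f} m^2")
```

With numbers like those, the big panel still puts out less total light than the small one, which is why it's friendlier at the scope.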
Checking most of the boxes, I was able, for fun, to get my TV to run at a (10-bit) semi-HDR (~500 nit) level out of a BMD card with Resolve, fed by my 4K camera,
and was able to show bright Jupiter and its dim moons at the same time with my gear/FL, when other means (8-bit, non-HDR) would never let me show both at once.....
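To give a sense of the brightness gap involved, here's the rough magnitude arithmetic with approximate textbook values (not my measurements):

```python
# Rough magnitude arithmetic: how big is the brightness gap between Jupiter and
# its moons? Values are approximate catalog magnitudes, used only for illustration.
jupiter_mag = -2.5   # Jupiter near opposition (approx.)
moon_mag = 5.5       # a Galilean moon (approx.)

# Each 5 magnitudes is a factor of 100 in brightness
ratio = 10 ** (0.4 * (moon_mag - jupiter_mag))
print(f"Jupiter is roughly {ratio:,.0f}x brighter than one of its moons")

# An 8-bit display spans only 256 steps, so if Jupiter sits at full scale,
# the moons land inside the bottom step and effectively disappear.
print(f"Moon level on an 8-bit scale with Jupiter at 255: ~{255 / ratio:.2f}")
```

Surface brightness and exposure complicate the real on-screen numbers, but the point stands: the gap is orders of magnitude, and 8 bits can't hold both ends at once.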
But anything short of 1000-nit/5000-nit HDR is still gimmicky and doesn't display the dazzling HDR that the super expensive sets can right now.
Beware of store demos that are mostly just over-saturated colors (color perception is largely relative), so they can do a lot with "optical illusions" to wow you in a limited setting. They would fail visually if they were set next to pro Sony, Flanders Scientific, or Dolby Vision true HDR displays.....
Some day, when the software, computer hardware, and displays are all cheap enough,
HDR's gift to EAA will be amazing,
since dynamic range is one key area where the eyepiece still wins (assuming a scope that could otherwise compete with integration-stacking EAA).....
For now it's more of a 'slightly better' gimmick, except for the low-power / low-light application I mentioned above.....
Keep your eye on Netflix though,
because they may ironically be the biggest trendsetter as far as which final standards get adopted on a ubiquitous scale (remember VHS vs. Beta, or HD DVD vs. Blu-ray????)
Edited by t_image, 20 August 2019 - 07:02 PM.
- ccs_hello, mclewis1 and 39.1N84.5W like this