What resolution is your desktop? And do you work in an HDR color space?

7 replies to this topic

#1 Borodog

    Surveyor 1

  • topic starter
  • Posts: 1,749
  • Joined: 26 Oct 2020

Posted 11 April 2021 - 06:17 PM

I am considering upgrading my monitor for lunar image processing/enjoyment. My laptop is fairly old, so while it has USB 3.0, the internal graphics card does not support 4K. I have looked into it, and you can get USB 3.0 4K 60 Hz external graphics adapters fairly cheaply. Since I won't be playing games or doing any 3D graphics on this machine, I don't anticipate the low bandwidth and high latency of USB 3.0 being much of a problem. The increase in screen resolution would be a huge upgrade over my current HD screen.
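As a rough sanity check on the bandwidth point, here is a back-of-the-envelope sketch in Python. The 5 Gbit/s signaling rate is the nominal USB 3.0 figure and the ~80% factor only accounts for 8b/10b line coding (real-world throughput is lower still); adapters in this class get around the shortfall by compressing the video stream, which matters little for mostly static images.

```python
# Rough bandwidth sketch: why a USB 3.0 4K/60 Hz adapter has to compress.
width, height, fps, bits_per_pixel = 3840, 2160, 60, 24  # 4K at 8 bits/channel

raw_gbps = width * height * fps * bits_per_pixel / 1e9
usable_usb3_gbps = 5.0 * 0.8  # 5 Gbit/s signaling, ~80% left after 8b/10b encoding

print(f"uncompressed 4K/60 video: {raw_gbps:.1f} Gbit/s")          # ~11.9 Gbit/s
print(f"usable USB 3.0 bandwidth: {usable_usb3_gbps:.1f} Gbit/s")  # ~4.0 Gbit/s
```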

 

However, I have been unable to find any USB 3.0 4K adapter that supports HDR10 (High Dynamic Range, 10-bit). That means 10 bits per channel of color instead of 8, or 1024 gradations per channel instead of 256, for a total of over 1 billion displayable colors. The biggest difference is in color banding, and for me it would allow higher gamma corrections without blowing out highlights, which is something your eye does naturally. So essentially it would allow brighter images of the Moon without relative loss of color resolution in the highlights. I know the difference is probably subtle, but I would really prefer it if I could find one. And yes, I am aware I would have to process images differently to share them, as most people would be viewing on 8-bit-per-channel displays.

 

My plan is to get an 8 bpc 4K adapter and a 4K monitor that supports HDR10. Then when (hopefully) a USB 3.0 graphics adapter comes out that supports HDR10, I can grab one of those. Or maybe by that time my laptop will finally have given up the ghost and I'll get something that natively supports it.

 

So, I'm basically just looking to have a discussion on your lunar processing workstation resolution, color depth, and whether you think HDR10 would make as big an impact as I hope it might. Oh, and whether you know of any USB3.0 4K external graphics adapters that support HDR color depth.

 

For what it's worth, there are also 12-bit-per-channel displays, with 4096 gradations per channel and roughly 68.7 billion colors, but that is probably overkill and way beyond my budget.
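For anyone who wants to check the figures above, the gradations and total color counts follow directly from the bit depth; a quick Python sketch:

```python
# Gradations per channel and total displayable RGB colors at common bit depths.
for bits in (8, 10, 12):
    levels = 2 ** bits        # gradations per channel
    colors = levels ** 3      # all RGB combinations
    print(f"{bits:2d} bits/channel: {levels:5d} levels, {colors:,} colors")
# 8  ->  256 levels, ~16.8 million colors
# 10 -> 1024 levels, ~1.07 billion colors
# 12 -> 4096 levels, ~68.7 billion colors
```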



#2 sc285

    Mariner 2

  • Posts: 245
  • Joined: 25 Jun 2008
  • Loc: Northeastern KS

Posted 11 April 2021 - 07:28 PM

I may be wrong, but the resolution is dependent on scope size and camera resolution. The monitor is not going to improve or enhance anything; that is done in the acquisition of the image and in processing. No matter what size monitor or what specs it may have, the image would look the same on any monitor. You can't take a 1080i image, put it on a 4K monitor, and expect it to look better. It will still be 1080i. Same process when trying to duplicate VCR tapes to DVD: it's still 480i video tape quality.

But then, as I said... I may be wrong.


Edited by sc285, 11 April 2021 - 08:18 PM.

  • Borodog likes this

#3 Borodog

    Surveyor 1

  • topic starter
  • Posts: 1,749
  • Joined: 26 Oct 2020

Posted 11 April 2021 - 08:23 PM

Except that a 4K monitor is about 8.3 Mpx, while an HD monitor is about 2.1 Mpx. So you can see 4x more area of any given image, and it looks 2x sharper at the same image scale on the same size monitor. I would be replacing my 32" HD monitor with a 32" 4K monitor.
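For reference, the pixel math behind that comparison as a short Python sketch:

```python
# Pixel counts behind the "4x the area, 2x the sharpness" comparison.
hd  = 1920 * 1080   # ~2.1 Mpx
uhd = 3840 * 2160   # ~8.3 Mpx

print(f"HD: {hd/1e6:.1f} Mpx, 4K: {uhd/1e6:.1f} Mpx")
print(f"area ratio: {uhd/hd:.0f}x, linear resolution ratio: {(uhd/hd)**0.5:.0f}x")
```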



#4 Tom Glenn

    Gemini

  • Posts: 3,352
  • Joined: 07 Feb 2018
  • Loc: San Diego, CA

Posted 12 April 2021 - 01:41 AM

The biggest difference is in color banding, and for me it would allow higher gamma corrections without blowing out highlights, which is something your eye does naturally. 

Generally speaking, if your monitor is only mediocre, then upgrading to a higher end monitor will increase your viewing enjoyment, and by extension, your processing experience.  However, the specific aspects you mention above, banding and highlight detail, are not issues that are typically caused by the monitor.  

 

Color banding (and grayscale banding) can be caused by a multitude of factors.  Some images are particularly prone to banding (solid colors, blue sky, etc), but in most cases, banding in astrophotos is effectively hidden by noise, and so if you are seeing banding in your final images, this is usually a result of processing and not the monitor.  An 8 bit monitor will be sufficient to display about 11 stops of dynamic range, thanks to gamma encoding.
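A small sketch of that last point, assuming the standard sRGB transfer curve: it counts how many of the 255 non-black 8-bit codes fall within each stop below white, and shows that the darkest code sits roughly 11-12 stops down.

```python
import numpy as np

def srgb_to_linear(c):
    # Inverse sRGB transfer function (display code value -> linear luminance).
    c = np.asarray(c, dtype=float)
    return np.where(c <= 0.04045, c / 12.92, ((c + 0.055) / 1.055) ** 2.4)

codes = np.arange(1, 256) / 255.0              # all non-black 8-bit codes
stops_below_white = -np.log2(srgb_to_linear(codes))

print(f"darkest code is {stops_below_white.max():.1f} stops below white")
for s in range(12):
    n = np.count_nonzero((stops_below_white >= s) & (stops_below_white < s + 1))
    print(f"{s:2d}-{s+1:2d} stops below white: {n:3d} codes")
```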

 

Assuming an image is not overexposed (no fully saturated pixels), then "blowing out" the highlights is caused by having significant regions of the image that have approached close enough to white such that we cannot appreciate any detail.  You don't need to have any saturated pixels in order for a region of highlight detail to be lost.  Our eyes are not very good at differentiating bright tones.  We are much better at differentiating dark tones (hence the annoyance of banding in shadow regions).  But having a high bit depth monitor won't help with highlight retention.  A region of bright pixels is still very bright, and devoid of detail, whether you have an 8 bit, 10 bit, or 12 bit monitor.  What you will find with high bit depth monitors, however, is that subtle shadow regions that would all appear nearly black on an 8 bit monitor will have somewhat more texture in higher bit depth monitors.  This is all based on anecdotal evidence however, as I don't own a higher bit depth monitor.  

 

I started off doing all my processing on my MacBook Pro, which has a Retina display with 220 pixels per inch.  This is an excellent monitor, although it is somewhat small, and so now I am using an external Dell monitor, 27 inch 4K, although it is one of the cheaper models from several years ago.  I find it to be excellent, despite the fact that it is only 163 pixels per inch, which is lower resolution than the MacBook Pro display.  The larger size makes this difference seem trivial, and looking at a large image of the Moon is more enjoyable.  Because your monitor is large at 32", the difference in 4K resolution will be quite noticeable, at least if you are looking at images large enough to support that resolution at 100% scale.  The extra megapixels do matter here.  However, 10 bit capability would not be something I would consider essential, unless you are going to be preparing images with the intention of always displaying them on a 10 bit monitor.  Although it is extremely important to process images with high bit depth (hence the 16 bit or even 32 bit output files from stacking software), ultimately all images that are intended for public consumption are reduced to 8 bits for web display, rendering your 10 bit monitor useless for that purpose.  Although you may find a 10 bit monitor more enjoyable (you would have to test it and let us know), I would reiterate my caution above that if you are seeing significant banding in your images, this is a problem that won't be solved by getting a higher bit depth monitor.  And again, highlight retention has nothing to do with the bit depth (it's deep shadow detail that is affected here).  But you will absolutely see the benefits of having a higher resolution display.
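The pixel-density figures quoted above check out with simple geometry; a quick sketch (the 15.4", 2880x1800 Retina panel for the MacBook Pro is an assumption on my part):

```python
from math import hypot

# Diagonal pixel count divided by diagonal size in inches gives pixels per inch.
displays = {
    'MacBook Pro Retina (assumed 15.4", 2880x1800)': (2880, 1800, 15.4),
    '27" 4K': (3840, 2160, 27.0),
    '32" 4K': (3840, 2160, 32.0),
    '32" HD': (1920, 1080, 32.0),
}
for name, (w, h, diag) in displays.items():
    print(f"{name}: {hypot(w, h) / diag:.0f} ppi")
# ~220, ~163, ~138, and ~69 ppi respectively
```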


  • Kenny V. and Borodog like this

#5 airscottdenning

    Viking 1

  • Posts: 555
  • Joined: 22 Aug 2008
  • Loc: Colorado

Posted 12 April 2021 - 03:52 PM

I use a 4K monitor on my desktop. It's nice for astronomy, but the best part is just less eye fatigue and a more relaxed life in these COVID times.  It's SO MUCH easier on my aging eyes than my old monitor -- not just for astro, but for web browsing, email, everything!


  • Borodog likes this

#6 Borodog

    Surveyor 1

  • topic starter
  • Posts: 1,749
  • Joined: 26 Oct 2020

Posted 12 April 2021 - 06:44 PM

Gibberish deleted.


Edited by Borodog, 12 April 2021 - 07:51 PM.


#7 Borodog

    Surveyor 1

  • topic starter
  • Posts: 1,749
  • Joined: 26 Oct 2020

Posted 12 April 2021 - 07:50 PM

Tom,

 

Reading the wikipedia article on gamma correction makes me realize I have little idea what I'm talking about, so feel free to ignore what I just posted.

 

https://en.wikipedia...amma_correction


Edited by Borodog, 12 April 2021 - 07:50 PM.


#8 Tom Glenn

    Gemini

  • Posts: 3,352
  • Joined: 07 Feb 2018
  • Loc: San Diego, CA

Posted 13 April 2021 - 02:08 AM

Tom,

 

Reading the wikipedia article on gamma correction makes me realize I have little idea what I'm talking about, so feel free to ignore what I just posted.

 

https://en.wikipedia...amma_correction

Mike, I had briefly read your post before you deleted it, but I won't elaborate on those points since they are moot.  What I can say, however, is that gamma is a confusing topic, and it's very easy to make a misstatement.  In fact, I have numerous inaccurate statements in some of my posts dating back several years, because I had not fully appreciated everything about gamma.  In particular, I found it very enlightening to learn more about what happens "in camera" when an image is processed by a DSLR or mirrorless camera.  Suffice it to say that any "raw" images that are processed by terrestrial photographers are not actually raw, but have already been subjected to significant processing.  This includes both gamma correction and additional nonlinear curves added by the raw processing engines (either in camera or in software).  Most people outside of astrophotography, or scientific imaging, never encounter linear images that require editing.  The primary benefit of gamma encoding is that a smaller number of bits can be used to represent a larger fraction of the useful tonal space.  If no gamma encoding were performed, then significant bit depth would be wasted on bright tones that we simply cannot appreciate.  This is what allows us to share 8 bit images and use 8 bit monitors and have them look good in most cases.
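To illustrate the bit-allocation point with a toy example (this assumes a simple 2.2 power-law gamma rather than the exact sRGB curve): quantizing a linear ramp straight to 8 bits leaves only a handful of codes for everything more than 6 stops below white, while gamma-encoding first leaves dozens.

```python
import numpy as np

gamma = 2.2
linear = np.linspace(0, 1, 2**16)                    # linear 16-bit tonal ramp

linear_8bit = np.round(linear * 255)                 # quantize linearly
gamma_8bit  = np.round(linear ** (1 / gamma) * 255)  # gamma-encode, then quantize

shadows = linear < 1 / 64                            # 6+ stops below white
print("shadow codes, linear quantization:", len(np.unique(linear_8bit[shadows])))  # ~5
print("shadow codes, gamma encoded:      ", len(np.unique(gamma_8bit[shadows])))   # ~39
```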

 

One other issue that is often confused is the response of the human eye.  The human eye has huge dynamic range capabilities, but not when using a fixed gaze and pupil size.  It is surprisingly hard to get definitive data on the dynamic range of the human eye, because the eyes are not static, and neither is the image processor (our brains).  Nevertheless, most estimates for the dynamic range of the eye during a fixed gaze fall somewhere between 10-14 stops.  So this is not much different from a modern camera sensor.  Where the eye excels is in its ability to quickly adapt to different lighting conditions, so by adjusting your gaze you can easily surpass 14 stops of dynamic range, and this is what will be impossible to recreate in a still image, even with high bit depth monitors.  But as I've said in some other threads, the Moon doesn't have the dynamic range that many people seem to think it has.  Looking at some measurements from my own images, there are only about 10-12 stops of exposure separating the illuminated limb from the "Earthshine" portion during a crescent Moon.  This is consistent with our ability to easily see Earthshine visually, given the above estimates of 10-14 stops of static dynamic range for the eye.
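For anyone converting between stops and linear brightness, the relation is just a base-2 logarithm; the 3000:1 ratio below is a made-up example that happens to land in the 10-12 stop range quoted above.

```python
import math

def stops(bright, dim):
    # Exposure difference in stops between two measured brightnesses.
    return math.log2(bright / dim)

# Hypothetical: sunlit limb measuring ~3000x brighter than the Earthshine side.
print(f"{stops(3000, 1):.1f} stops")  # ~11.6 stops
```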


  • John_Moore and Borodog like this

