Loss of color diversity in LRGB photography when the filters do not overlap


#276 loujost

loujost

    Mariner 2

  • -----
  • topic starter
  • Posts: 243
  • Joined: 02 Oct 2017

Posted 23 April 2024 - 05:28 PM

You can already do this with an OSC. We recently proved we could color match with two astrocams—the ASI224MC and the ASI533MC—to very low ΔE. All you need is a UV/IR cut filter. We even got the True Colors of the planets using the lunar regolith to model and remove red shift from atmospheric absorption…

 

https://www.cloudyni...th-an-asi224mc/

 

Take your astrocam, plug in the UV/IR cut, and shoot a CC24 under D65 lighting, and Bob's your uncle.

 

BQ

Absolutely! Very nice, I missed that thread. I was only suggesting that if one wanted to also include more data for some reason, such as UV and IR, a monochrome camera would be able to do that. There is also a resolution benefit, as mentioned by slippy.



#277 slippy

slippy

    Explorer 1

  • *****
  • Posts: 66
  • Joined: 10 Sep 2021
  • Loc: Seattle

Posted 23 April 2024 - 05:31 PM

Sounds interesting, although I don’t understand that for the most part! It’s still a bit difficult for me to break out of the mindset that color is just RGB 0-255. All of the other encoding schemes, white balance, gamma, luminance, etc. are lost on me in terms of how they relate to an integer on a colored pixel. But that’s ok, I can acknowledge the basic principle.

 

That just made me realize something though - it works with OSC because that takes a snapshot of all channels simultaneously, and that gives you the relative values of each at a point in time. But if you’re taking long exposures of varying length and in different conditions, hours or days apart, doesn’t that kind of go out the window? It seems like you’d need a common reference to normalize them. Or can something like SPCC provide that? You can’t sneak a color checker into the shot, and can’t really shoot it under the same conditions that you’ll be using for your color data, but if stars have known reference colors, does that fix it?
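For concreteness, here is a toy sketch (Python with NumPy, and entirely made-up star fluxes and catalog values) of what such a common reference could look like: fit one scale factor per channel so that the measured fluxes of reference stars land on their catalog colors. This is only a conceptual illustration, not how SPCC is actually implemented.

import numpy as np

# Measured instrumental fluxes (ADU) of three reference stars in each channel,
# taken from exposures shot on different nights with different lengths.
# All numbers below are made up for illustration.
measured = {
    "R": np.array([12000.0, 8300.0, 15100.0]),
    "G": np.array([20500.0, 9100.0, 30200.0]),
    "B": np.array([16900.0, 5200.0, 33800.0]),
}

# Catalog (reference) fluxes of the same stars, on a common relative scale.
catalog = {
    "R": np.array([1.00, 0.71, 1.22]),
    "G": np.array([1.00, 0.45, 1.44]),
    "B": np.array([1.00, 0.31, 1.95]),
}

# One scale factor per channel, from a least-squares fit of measured vs. catalog.
scale = {}
for ch in "RGB":
    m, c = measured[ch], catalog[ch]
    scale[ch] = np.dot(m, c) / np.dot(m, m)   # minimizes |scale*m - c|^2

print({ch: round(s, 6) for ch, s in scale.items()})
# Applying these three factors puts the separately captured channels on the
# same relative scale, which is the kind of common reference asked about here.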



#278 badgie

badgie

    Viking 1

  • *****
  • Posts: 643
  • Joined: 13 Jan 2021
  • Loc: SF Area, California

Posted 23 April 2024 - 05:33 PM


Anyway, as for the advantage, OSC probably already does a decent job of this. However, it’s inherently lower resolution due to the Bayer matrix, and since most people already have and prefer mono cams, it would be a way to use those similar to how they use them for RGB data today, just with a bit more spectrum coverage. But because they’re all curves, it would still be less efficient than RGB.

 

I’m still not sure I’d bother going the tristimulus route anytime soon, just exploring ideas.

Re: resolution, CFA drizzle handles this, so there is not a major impact on resolution or integration time, except for the 2x on green compared to red. But as a mono camera user I catch your point!



#279 BQ Octantis

BQ Octantis

    Cosmos

  • *****
  • Posts: 9,139
  • Joined: 29 Apr 2017
  • Loc: Nova, USA

Posted 23 April 2024 - 06:25 PM

Sounds interesting, although I don’t understand that for the most part! It’s still a bit difficult for me to break out of the mindset that color is just RGB 0-255. All of the other encoding schemes, white balance, gamma, luminance, etc. are lost on me in terms of how they relate to an integer on a colored pixel. But that’s ok, I can acknowledge the basic principle.

 

That just made me realize something though - it works with OSC because that takes a snapshot of all channels simultaneously, and that gives you the relative values of each at a point in time. But if you’re taking long exposures of varying length and in different conditions, hours or days apart, doesn’t that kind of go out the window? It seems like you’d need a common reference to normalize them. Or can something like SPCC provide that? You can’t sneak a color checker into the shot, and can’t really shoot it under the same conditions that you’ll be using for your color data, but if stars have known reference colors, does that fix it?

 

sRGB is indeed a set of color vectors with components on the interval [0,1]. But visual-matched color requires negative values of sRGB because the sRGB gamut is a crop of the visible color horseshoe. No amount of white balance will fix this—only a CCM on a set of filters that reasonably match the Luther-Ives condition will do so. Even then, there is no display that can display all the visible colors.
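As a small numerical illustration of that gamut point (not from the thread; just the standard XYZ-to-linear-sRGB matrix applied to approximate CIE 1931 values for a pure spectral green near 520 nm): the R and B components come out negative, so no white-balance scaling can bring that color inside the sRGB triangle.

import numpy as np

# Standard XYZ -> linear sRGB matrix (IEC 61966-2-1, D65 white point).
M = np.array([[ 3.2406, -1.5372, -0.4986],
              [-0.9689,  1.8758,  0.0415],
              [ 0.0557, -0.2040,  1.0570]])

# Approximate CIE 1931 color-matching values for monochromatic ~520 nm light.
XYZ_520 = np.array([0.063, 0.710, 0.078])

rgb = M @ XYZ_520
print(rgb)   # roughly [-0.93, 1.27, -0.06]: R and B are negative, i.e. this
             # saturated green lies outside the sRGB gamut.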

 

SPCC only provides the white balance scale factors for the three sRGB channels to achieve some white reference. But the white reference for AP is arbitrary—read Section 7.3 of the PI RTFM.

 

Even B-V is arbitrary—not to mention, white is in the eye of the beholder.

 

BQ


Edited by BQ Octantis, 23 April 2024 - 06:31 PM.


#280 Borodog

Borodog

    Voyager 1

  • *****
  • Posts: 11,134
  • Joined: 26 Oct 2020
  • Loc: St. Augustine, FL

Posted 24 April 2024 - 02:53 PM

And another who completely misunderstands the issue…

Oh I understand the issue quite well. I've been saying for a long time that the loss in effective color bit depth from using non-overlapping square band pass RGB filters is almost as bad as shooting L, which is why LRGB images have a certain "only primary colors need apply" look to them that you can tell immediately. I'm just saying that people that get hung up on supposed "accuracy" in AP are barking up the wrong tree. There is virtually nothing remotely "accurate" about literally any part of AP.



#281 loujost

loujost

    Mariner 2

  • -----
  • topic starter
  • Posts: 243
  • Joined: 02 Oct 2017

Posted 24 April 2024 - 03:28 PM

Oh I understand the issue quite well. I've been saying for a long time that the loss in effective color bit depth from using non-overlapping square band pass RGB filters is almost as bad as shooting L, which is why LRGB images have a certain "only primary colors need apply" look to them that you can tell immediately. I'm just saying that people that get hung up on supposed "accuracy" in AP are barking up the wrong tree. There is virtually nothing remotely "accurate" about literally any part of AP.

Well, that is a choice people make for themselves. One can choose accuracy sometimes.



#282 Borodog

Borodog

    Voyager 1

  • *****
  • Posts: 11,134
  • Joined: 26 Oct 2020
  • Loc: St. Augustine, FL

Posted 24 April 2024 - 03:45 PM

Well, that is a choice people make for themselves. One can choose accuracy sometimes.

They could. But they don’t. Because nobody would like the results. Because what astronomical objects actually look like to the human eye doesn’t look that good, if they’re even visible at all.

 

As an example, the only people I've ever seen go to the trouble to process a planetary image the way it would look in the telescope are Tulloch and BQ Octantis, and no offense to them, but it looks like washed-out hot garbage.

 

Long integration times? Right out. The human eye has an effective integration time of about 0.1s.

 

Super hard stretches? Right out. Better stick with gamma = 2.2 for accuracy.

 

All that pretty red Hα? Practically gone, it's nearly invisible to humans (that's why DSLRs filter almost all of it out).

 

All those pretty colors? Gone. Almost all of these objects, if you can see them in the telescope at all, are monochromatic to the human eye.

 

The only exceptions that I know of are some kinds of lunar imaging and white light solar.



#283 BQ Octantis

BQ Octantis

    Cosmos

  • *****
  • Posts: 9,139
  • Joined: 29 Apr 2017
  • Loc: Nova, USA

Posted 24 April 2024 - 04:02 PM

The only exceptions that I know of are some kinds of lunar imaging and white light solar.

Nope. If you output a linear (i.e., gamma compressed to accommodate the gamma correction of the display driver), color-correct (i.e., corrected for atmospheric yellowing for the given target altitude) image, they, too, are washed-out hot garbage.
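For what that gamma-compression step means in practice, here is a minimal sketch (Python; it uses a pure 1/2.2 power law rather than the exact piecewise sRGB transfer curve, which is close but not identical):

import numpy as np

def gamma_encode(linear, gamma=2.2):
    # Compress linear [0, 1] intensities for a display whose driver expands by ~gamma 2.2.
    return np.clip(linear, 0.0, 1.0) ** (1.0 / gamma)

linear = np.array([0.0, 0.01, 0.1, 0.5, 1.0])
encoded = gamma_encode(linear)
print(encoded)          # dim values are lifted strongly; bright values barely move
print(encoded ** 2.2)   # the display's expansion recovers approximately the linear values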

 

BQ


Edited by BQ Octantis, 24 April 2024 - 04:03 PM.


#284 slippy

slippy

    Explorer 1

  • *****
  • Posts: 66
  • Joined: 10 Sep 2021
  • Loc: Seattle

Posted 24 April 2024 - 04:03 PM

They could. But they don’t. Because nobody would like the results. Because what astronomical objects actually look like to the human eye doesn’t look that good, if they’re even visible at all.

 

As an example, the only people I've ever seen go to the trouble to process a planetary image the way it would look in the telescope are Tulloch and BQ Octantis, and no offense to them, but it looks like washed-out hot garbage.

 

Long integration times? Right out. The human eye has an effective integration time of about 0.1s.

 

Super hard stretches? Right out. Better stick with gamma = 2.2 for accuracy.

 

All that pretty red Hα? Practically gone, it's nearly invisible to humans (that's why DSLRs filter almost all of it out).

 

All those pretty colors? Gone. Almost all of these objects, if you can see them in the telescope at all, are monochromatic to the human eye.

 

The only exceptions that I know of are some kinds of lunar imaging and white light solar.

Color is orthogonal to exposure, and you’re wrong about visible wavelengths, so much of that is actually not true at all; but more importantly, it’s beside the point and off topic.



#285 loujost

loujost

    Mariner 2

  • -----
  • topic starter
  • Posts: 243
  • Joined: 02 Oct 2017

Posted 24 April 2024 - 04:52 PM

Color is orthogonal to exposure, and you’re wrong about visible wavelengths, so much of that is actually not true at all; but more importantly, it’s beside the point and off topic.

There is a long thread on natural color here:

 

https://www.cloudyni...ue-color-again/

 

That would be a more appropriate place for this subject. The things you have mentioned have been addressed there, for the most part.



#286 SeymoreStars

SeymoreStars

    Skylab

  • *****
  • Posts: 4,408
  • Joined: 08 May 2014
  • Loc: Pennsyltucky

Posted 24 April 2024 - 05:19 PM

I haven't had time to read the whole thread. Have the "true" color police shown up yet?

 

I have both Chroma RGB and "classic" RVB filters. Which would produce the better result on an RGB subject?

 

 

Attached Thumbnails

  • Screenshot 2024-04-24 181434.png
  • Screenshot 2024-04-24 181227.png


#287 loujost

loujost

    Mariner 2

  • -----
  • topic starter
  • Posts: 243
  • Joined: 02 Oct 2017

Posted 24 April 2024 - 05:26 PM

I haven't had time to read the whole thread. Have the "true" color police shown up yet?

 

I have both Chroma RGB and "classic" RVB filters. Which would produce the better result on an RGB subject?

The second set would be the better of the two, to address the loss-of-color-diversity issue discussed in the present thread.

 

If you really care about the issue of true color, please read the other thread.



#288 SeymoreStars

SeymoreStars

    Skylab

  • *****
  • Posts: 4,408
  • Joined: 08 May 2014
  • Loc: Pennsyltucky

Posted 24 April 2024 - 05:31 PM

The second set would be the better of the two, to address the loss-of-color-diversity issue discussed in the present thread.

 

If you really care about the issue of true color, please read the other thread.

I have no desire to discuss "true" color. LOL



#289 slippy

slippy

    Explorer 1

  • *****
  • Posts: 66
  • Joined: 10 Sep 2021
  • Loc: Seattle

Posted 24 April 2024 - 05:42 PM

The first set would not be able to take pictures of the sodium tail on some comets or Mercury; the second would. In this sense it’s not even an accuracy issue; it simply can’t capture it because of that hole between the filter bands. The second set would have weak response to Hα, NII, SII, and some other less common lines, but it would still see them.
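A toy check of that gap argument, using hypothetical band edges rather than the actual curves in the attached screenshots: the Na D doublet near 589 nm falls in the hole between a green band ending around 580 nm and a red band starting around 600 nm, so a gapped set records essentially nothing there, while overlapping bands still respond.

# Toy illustration with hypothetical band edges (not real filter curves):
# does a given emission line land inside any passband of a filter set?
sodium_d = 589.3   # nm, Na D doublet (comet / Mercury sodium tails)
h_alpha  = 656.3   # nm

gapped_set      = {"B": (400, 500), "G": (500, 580), "R": (600, 700)}   # non-overlapping
overlapping_set = {"B": (390, 510), "G": (480, 600), "R": (570, 700)}   # broad, overlapping

def captured(line_nm, filters):
    return [name for name, (lo, hi) in filters.items() if lo <= line_nm <= hi]

for line, nm in [("Na D", sodium_d), ("H-alpha", h_alpha)]:
    print(line, "gapped:", captured(nm, gapped_set),
                "overlapping:", captured(nm, overlapping_set))
# Na D -> gapped: []  (falls in the 580-600 nm hole); overlapping: ['G', 'R']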



#290 loujost

loujost

    Mariner 2

  • -----
  • topic starter
  • Posts: 243
  • Joined: 02 Oct 2017

Posted 24 April 2024 - 05:54 PM

I have no desire to discuss "true" color. LOL

I figured that. I thought I would be polite anyway.


Edited by loujost, 24 April 2024 - 05:55 PM.


#291 Oort Cloud

Oort Cloud

    Fly Me to the Moon

  • -----
  • Posts: 5,616
  • Joined: 19 Nov 2020
  • Loc: New Jersey, USA

Posted 24 April 2024 - 06:46 PM

The first set would not be able to take pictures of the sodium tail on some comets or Mercury; the second would. In this sense it’s not even an accuracy issue; it simply can’t capture it because of that hole between the filter bands. The second set would have weak response to Hα, NII, SII, and some other less common lines, but it would still see them.


If the intended purpose of these filters is astronomy, you'd think they would have extended the red coverage some to better "see", at a minimum, H-alpha.

I still want a set, but I'd probably go with Astronomik, as I'm sure they're a fraction of what Chroma charges.

#292 badgie

badgie

    Viking 1

  • *****
  • Posts: 643
  • Joined: 13 Jan 2021
  • Loc: SF Area, California

Posted 25 April 2024 - 08:29 AM

We can approach this problem directly as the challenge of mapping a unit interval into a set of n numbers using n basis functions.  Typically, those numbers are called R, G, B and n = 3.  I see two competing objectives, plus a possible third:

1: Capturing as many photons as possible

2: Providing the maximum separation (in the information-theory sense) between nearby points on the unit interval.

3??  The ability to capture information in the context of strong backgrounds (e.g., capturing OIII against a strong Hα background)

 

Any other useful criteria?

 

This is similar to BQ Octantis's analysis, except the hue analysis includes the response of the eye (if I understand correctly) as opposed to information content.
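A rough sketch of objective 2, with made-up Gaussian passbands standing in for the basis functions: compare how far apart the normalized responses of two nearby monochromatic wavelengths end up. Overlapping bands map them to measurably different ratios; any set of basis functions that maps them to the same ratios cannot tell them apart.

import numpy as np

def gaussian_band(center, width):
    # Hypothetical smooth passband; not a real filter curve.
    return lambda wl: np.exp(-0.5 * ((wl - center) / width) ** 2)

# Made-up overlapping basis functions standing in for broad B, G, R responses.
bands = [gaussian_band(c, 40.0) for c in (450.0, 540.0, 610.0)]

def response(wl):
    # Normalized response vector (chromaticity-like) for a monochromatic input.
    v = np.array([b(wl) for b in bands])
    return v / np.linalg.norm(v)

def separation_deg(wl1, wl2):
    # Angle between the normalized responses of two wavelengths.
    cos = np.clip(np.dot(response(wl1), response(wl2)), -1.0, 1.0)
    return np.degrees(np.arccos(cos))

print(separation_deg(550.0, 560.0))  # nonzero: the two wavelengths map to distinct ratios
print(separation_deg(550.0, 550.0))  # zero: identical input, identical response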



#293 loujost

loujost

    Mariner 2

  • -----
  • topic starter
  • Posts: 243
  • Joined: 02 Oct 2017

Posted 25 April 2024 - 08:49 AM

We can approach this problem directly as the challenge of mapping a unit interval into a set of n numbers using n basis functions.  Typically, those numbers are called R, G, B and n = 3.  I see two competing objectives, plus a possible third:

1: Capturing as many photons as possible

2: Providing the maximum separation (in the information-theory sense) between nearby points on the unit interval.

3??  The ability to capture information in the context of strong backgrounds (e.g., capturing OIII against a strong Hα background)

 

Any other useful criteria?

 

This is similar to BQ Octantis's analysis, except the hue analysis includes the response of the eye (if I understand correctly) as opposed to information content.

In fact, my title for this thread referred to diversity in the information-theoretic sense. This could even be quantified by comparing the diversity of the signal to the diversity of the output. That's the basis of my own scientific work in biology. I explain the connection between information (Shannon entropy) and diversity (they are not exactly equivalent) in articles such as:

https://www.research...diversity_Oikos

https://www.research...Beta_Components
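For readers new to that distinction, a minimal sketch of the standard relationship: Shannon entropy H measures information, and the corresponding "true diversity" is exp(H), the effective number of equally common categories.

import numpy as np

def shannon_entropy(p):
    # Shannon entropy (in nats) of a probability distribution.
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def true_diversity(p):
    # Effective number of categories (Hill number of order 1) = exp(H).
    return np.exp(shannon_entropy(p))

even   = [0.25, 0.25, 0.25, 0.25]   # four equally common categories
skewed = [0.85, 0.05, 0.05, 0.05]   # one category dominates

print(true_diversity(even))    # 4.0  -> behaves like 4 distinct categories
print(true_diversity(skewed))  # ~1.8 -> effectively fewer than 2 categories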



