
Accurate colour planet images - the final report?

46 replies to this topic

#1 Tulloch

Tulloch

    Vanguard

  • *****
  • topic starter
  • Posts: 2,149
  • Joined: 02 Mar 2019
  • Loc: Melbourne, Australia

Posted 11 September 2020 - 10:47 PM

Hopefully this will be my last posting on this subject. I have slogged my way through the process of colour calibration from end to end:

  • colour calibration of my monitor
  • using Registax's auto-balance feature
  • using G2V stars as colour calibration targets
  • measuring the colour coordinates of standard Macbeth colour charts
  • finding the "best" white balance settings for the ASI224MC camera
  • measuring the effect of planet elevation on colour cast
  • using measured spectral data as the measure of "colour truth" for calibrating image data.

 

While the last link pointed to a plot showing measured albedos (reflectances) of the planets as a function of wavelength, it wasn't until recently that I was able to source the original paper by Erich Karkoschka in which he publishes the actual data from his experiments (available here), along with the colour tri-stimulus values for the full-disc albedo measurements of Jupiter, Saturn, Uranus and Neptune (plus a few moons and Saturn's rings). Unfortunately, the (albedo, spectral purity, dominant wavelength) representation Karkoschka used to describe the colours in the CIE 1931 system is rarely used today, and it took more than a little investigation to find a method for converting these values to the more common XYZ, Yxy and (eventually) RGB formats.
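For anyone wanting to reproduce the conversion, the core step from (albedo, spectral purity, dominant wavelength) to CIE xyY is geometric: the chromaticity sits on the straight line from the white point towards the spectral-locus point at the dominant wavelength, a fraction along it equal to the excitation purity. A minimal sketch in Python (the locus chromaticities below are approximate CIE 1931 2-degree values interpolated from standard tables, and the D65 white point is my assumption; Karkoschka's paper defines the actual reference white):

```python
# Approximate CIE 1931 2-degree spectral-locus chromaticities at the
# dominant wavelengths appearing in the table (interpolated from
# standard 10 nm tables; treat as illustrative, not authoritative).
LOCUS_XY = {
    485: (0.068, 0.214),
    490: (0.045, 0.295),
    572: (0.458, 0.541),
    577: (0.492, 0.507),
}

WHITE = (0.3127, 0.3290)  # D65 white point (an assumption, see above)

def xyY_from_karkoschka(albedo, purity, wavelength):
    """Excitation purity p places the chromaticity a fraction p of the
    way from the white point towards the spectral locus; the albedo
    serves as the luminance Y."""
    xl, yl = LOCUS_XY[wavelength]
    xw, yw = WHITE
    return (xw + purity * (xl - xw),
            yw + purity * (yl - yw),
            albedo)

# Jupiter's table row: albedo 0.50, purity 0.10, dominant wavelength 572 nm
x, y, Y = xyY_from_karkoschka(0.50, 0.10, 572)
```

From xyY one then gets XYZ via X = xY/y, Z = (1 - x - y)Y/y, and finally RGB with the usual XYZ-to-sRGB matrix.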

 

Karkoschka's calculated tri-stimulus values for the planets are shown below. The first set of RGB values is converted directly from his tri-stimulus values into D65 colour space; the second set is what I calculated by feeding his measured reflectance data into an Excel spreadsheet published here. The two methods agree well.

 

              Albedo   Purity   Wavelength(nm)  |  R    G    B   |  R    G    B
                                                | (tri-stimulus) | (from spectra)
Jupiter        0.50     0.10        572         | 190  188  172  | 190  187  170
Saturn         0.48     0.21        577         | 198  183  153  | 196  182  151
Uranus         0.50     0.12        490         | 155  195  198  | 162  197  200
Neptune        0.43     0.19        485         | 138  182  199  | 140  185  200

 

Therefore, the values shown above should represent the average full-disc colour coordinates for the planets (at least at the ESO in 1993 when the measurements were taken). The next question is: what does this look like in reality?

 

To answer this question, I took some of my own data and used Photoshop to measure the median R, G and B values of the full disc after processing. I found that what I thought was a set of pleasing images to the eye was, in the main, quite different from the values shown above, especially for Uranus and Neptune. Rather than trying to match the actual R, G and B values, I instead used Photoshop to apply a linear shift to the colour data of the images so that the Red/Green and Blue/Green ratios in the images matched the data from Karkoschka. This method is shown in the first image below; it may not be the correct way of doing it, so if there is a better way then please let me know.
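As a cross-check on the Photoshop workflow, the same ratio-matching can be done numerically. This is only a sketch of the idea, not Andrew's actual procedure; it assumes a linear-space float image, and the Jupiter row of the table would serve as the target:

```python
import numpy as np

def match_channel_ratios(img, target_rgb):
    """Scale R and B linearly so the image's median R/G and B/G ratios
    match those of a target RGB triple, leaving G untouched.
    img: float array of shape (H, W, 3) with values in [0, 1]."""
    r, g, b = (np.median(img[..., i]) for i in range(3))
    tr, tg, tb = target_rgb
    out = img.copy()
    out[..., 0] *= (tr / tg) / (r / g)  # force median R/G to the target ratio
    out[..., 2] *= (tb / tg) / (b / g)  # force median B/G to the target ratio
    return np.clip(out, 0.0, 1.0)

# e.g. shift an image towards Karkoschka's Jupiter values (190, 188, 172)
```

In practice the medians should be measured over the planet's disc only; including the black background would bias them.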

 

So without any further ado, here are the "colour corrected" images for (in order) Jupiter (shown 100% captured size), Saturn (150% captured size), Uranus (150% captured size) and Neptune (200% captured size). I am still working on Mars; I have some data, but it's incomplete below 500 nm.

 

Comments welcome!

 

Andrew

Attached Thumbnails

  • Jupiter linear blue colour shift 2.jpg
  • 2020-07-15-1356_1-L-Jup_AS_F5000_l6_ap106_Driz30 Jup925Kiwi r1g1b11 ps2sm100 colcorr.png
  • 2020-09-05-1147_4-L-Jup_AS_F5000_l6_ap125_Driz30 Jup925Kiwi r1g1b11 ps2 colcorr.png
  • 2020-09-05-1206_5-L-Sat_AS_F5000_l6_ap114_Driz30 Sat925LitLes r1g1b11 ps2sm150 colcorr.png
  • 2020-08-28-1905_8-L-Uranus_AS_F3000_l6_ap1_Driz30 Ura925FC-A r1g1b1 ps2sm150 colcorr.png
  • 2020-06-08-1936_2-L-Neptune_AS_P33_l6_ap4_Driz30 NepFC-C r1g1b1 ps1sm200 colcorr.png

Edited by Tulloch, 12 September 2020 - 12:30 AM.

  • Magellanico, JMP, Kenny V. and 7 others like this

#2 Kokatha man

Kokatha man

    Voyager 1

  • *****
  • Posts: 14,971
  • Joined: 13 Sep 2009
  • Loc: "cooker-ta man" downunda...

Posted 12 September 2020 - 01:19 AM

...I find it impossible to believe this will be your "final report" Andrew..!!!


  • RedLionNJ, AstroEthan and Tulloch like this

#3 Tulloch

Tulloch

    Vanguard

  • *****
  • topic starter
  • Posts: 2,149
  • Joined: 02 Mar 2019
  • Loc: Melbourne, Australia

Posted 12 September 2020 - 01:40 AM

...I find it impossible to believe this will be your "final report" Andrew..!!!

... maybe you are correct (note that I did put a question mark in the heading in anticipation).

 

I took the Mars data from Meadows and interpolated/extended the missing data between 380 and 495 nm using the formula shown in the first plot below (which may or may not be accurate, or even close) to generate the RGB values shown below, using the same techniques as before.
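The fill-in step can be sketched as follows. The reflectance numbers here are invented placeholders (not Meadows' data), and the straight-line extension merely stands in for the fitted formula in the attached plot:

```python
import numpy as np

# Hypothetical Mars reflectance samples (NOT Meadows' real data) above 495 nm.
wl_known = np.array([495.0, 550.0, 600.0, 650.0, 700.0])
refl_known = np.array([0.07, 0.13, 0.22, 0.28, 0.30])

# Fill 380-495 nm with a straight-line extension of the first two known
# points, a stand-in for the formula in the attached plot; clip at zero
# since reflectance cannot be negative.
wl_fill = np.arange(380.0, 495.0, 5.0)
slope = (refl_known[1] - refl_known[0]) / (wl_known[1] - wl_known[0])
refl_fill = np.clip(refl_known[0] + slope * (wl_fill - wl_known[0]), 0.0, None)

wl = np.concatenate([wl_fill, wl_known])
refl = np.concatenate([refl_fill, refl_known])
```

The combined (wl, refl) arrays can then be fed into the same spectrum-to-RGB conversion as for the gas giants.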

 

             R        G       B

Mars     118     88      54

 

Performing the same colour shift trick in Photoshop on some of my recent Mars images produces the following images, which look more like the colour cast that everyone else produces for Mars. Images shown at 150% captured size.

 

While these colours for Mars might be closer to reality, it still doesn't explain to me why the colour corrections applied to one planet fail to produce the "correct" result for another. It's all a bit confusing really.

 

Andrew

Attached Thumbnails

  • Mars albedo from Meadows.JPG
  • 2020-08-28-1848_8-L-Mars_AS_F5000_l6_ap30_Driz30 Mar925FC-D r1g1b11 ps2sm150 colcorr.png
  • 2020-09-05-1548_5-L-Mars_AS_F5000_l6_ap118_Driz30 Mars925FC-D r1g1b11 ps1sm150 colcorr.png

  • JMP and happylimpet like this

#4 Tom Glenn

Tom Glenn

    Mercury-Atlas

  • -----
  • Posts: 2,956
  • Joined: 07 Feb 2018
  • Loc: San Diego, CA

Posted 12 September 2020 - 02:06 AM

Andrew, I do hope it is not your last post on color!  And also, it will not be my last post to play devil's advocate!  As a scientist, I appreciate your desire for the truth.  At the same time, however, I believe that your quest for "truth" in this case will be impossible, because "true color" is completely artifactual to human color vision, which itself is highly variable.  Not only is there wide variation in color perception among individuals, but display variations are also problematic.  The greatest color processing engine in the world is the human brain, and context will change color perception, even in a single individual.  It is noteworthy that even professional agencies that produce "true color" images usually have disclaimers that the result is purely for aesthetics, rather than science.  For example, I have referenced before a NASA resource that provides a map of the Moon in "true color", whatever that means.  However, the fine print says the following:

 

"This image is optimized for aesthetics, not science. Scientific applications should use the source data."

 

And if you go to the source data, you will find the monochrome, narrow band data, that lacks all relevance to human color perception.  

 

And furthermore, even spacecraft images that are taken in the absence of Earth's atmosphere have to undergo extensive calibration adjustments before they are suitable for photometric calculations.  It's not trivial.  The reasons why one color correction scheme you have devised for one planet may not be suitable for another are unclear, but there are many possibilities.  The sensor itself would need to have calibration measurements performed, because CMOS sensors are not perfectly linear.  Then, you would have to adjust for atmosphere, including the elevation above the horizon.  But also important would be the phase angle between the planet, Earth, and the Sun, to normalize for the angle of incidence of irradiation and observation (the changes are small for distant planets, but still measurable).  And on top of all that, the processing software that you use to alter the color channels plays a role in the outcome, because not all adjustments are performed equivalently in different software (whether Photoshop, Registax, RawTherapee, PixInsight, etc).  The variables are quite extensive, and beyond the scope of amateur equipment, and the end result is subject to personal interpretation, which varies widely by individual.

 

At the end of the day, even if you devised a method that you think is "correct", what is its value if others perceive the colors to be wrong, given that "color" itself is subjective?  What is the definition of "right"?


  • RedLionNJ and The_8_Bit_Zombie like this

#5 BQ Octantis

BQ Octantis

    Soyuz

  • *****
  • Posts: 3,788
  • Joined: 29 Apr 2017
  • Loc: Red Centre, Oz

Posted 12 September 2020 - 03:02 AM



Hopefully this will be my last posting on this subject.

NOOOOOOO!

 

My interest is more on colors of DSOs, and I've wasted countless hours figuring out a practical way of getting good color "accuracy" out of my 600D at the extreme low end of the histogram. Ever the pragmatist, I've found the logarithmic scale in RawTherapee to be most useful to get a histogram match between channels at the low end. And I used Landon Noll's table of RGB values by star class to get the high end as accurate as possible with measured RGB values (note that G2V stars are not actually white). What I found was that in linear color space, the red scalar of the 600D in Daylight WB (the only repeatable white balance for DSOs) is too high—I have to reduce the slope by about 25%. I can then truncate the bottom of the red channel to get rid of sky glow, residual camera thermal noise, or what have you. I couldn't tell you why. It just works.
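The two red-channel adjustments described above (a roughly 25% reduction in linear slope, then truncating the bottom end) amount to a simple per-channel transform. A sketch, assuming linear-space data in [0, 1]; the exact slope and black level are illustrative, not the 600D's real values:

```python
import numpy as np

def tame_red(img, slope=0.75, black=0.02):
    """Reduce the red channel's linear gain and clip its low end,
    mimicking the slope-reduction + truncation described above.
    img: float array (H, W, 3) in linear space, values in [0, 1]."""
    out = img.copy()
    r = out[..., 0] * slope                  # ~25% slope reduction
    r = np.where(r < black, 0.0, r - black)  # truncate sky glow / thermal floor
    out[..., 0] = np.clip(r / (1.0 - black), 0.0, 1.0)  # renormalise to [0, 1]
    return out
```

The renormalisation step keeps a fully saturated red pixel from being pushed below full scale by the black-point subtraction alone.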

 

More significantly, much like the mainstream advice in the planetary forum is to just get a planetary camera to solve all my ills, the advice in the DSLR forum is to get an Ha mod to solve all my ills. But I like my simple tools, and there's no reason a simple workflow can't boost reds to the maximum. My best results for all colors (low and high) to date would probably be my most recent shot of the Lagoon and Trifid Nebula region, about which an Astrobinner even asked whether the camera was modded.

 


 

I owe a huge debt of gratitude to Roger Clark and Sharkmelley—science-minded individuals like you who tested the fringes of capability to discover the limits and their probable causes. So keep it up!

 



"true color" is completely artifactual to human color vision, which itself is highly variable

 

+1

I actually have a slight color shift between my right and left eyes. My left casts a slight reddish hue, while my right has a slight cyan hue.

 

BQ


Edited by BQ Octantis, 12 September 2020 - 03:41 AM.

  • happylimpet likes this

#6 Tulloch

Tulloch

    Vanguard

  • *****
  • topic starter
  • Posts: 2,149
  • Joined: 02 Mar 2019
  • Loc: Melbourne, Australia

Posted 12 September 2020 - 03:27 AM

Thanks Tom, what is the definition of "right"? The CIE 1931 colour space is the definition of "right". How you or I perceive the colour of an object doesn't matter, because if the spectrum is measured accurately, with the various corrections made for phase angle, atmospheric effects, telluric effects (something I only just learnt about from Karkoschka's paper) and everything else, the calculations to convert the spectrum to XYZ, xyY or RGB will be accurate. Just because you see a colour as pink and I see it as purple makes no difference, as that is only our perception, and if we are told that a particular spectrum relates to the colour "purple", then that's fine. If I am told that the colour of Jupiter is an array of pixels, each with an exact XYZ/RGB value, then I should be able to re-create that if I had the correct equipment and processing skills. How it looks to me is inconsequential, as I will know it's accurate, as will you, even if you see it as a completely different colour.

 

Now, the work of Karkoschka (which is well worth a read; click the link labelled "document" for the full text, "data" for the spectra, or here if you still have access to an online library) covered only the average spectra over the full disc at a few moments in time, rather than a full hyperspectral image of the planet with individual spectra for each pixel, which would be much preferred (but was impossible at the time). With improvements to imaging spectrophotometers we may be able to produce such datasets one day.

 

On elevation, my measurements on Saturn showed that once it is above around 40 degrees the colour shift is pretty minor, just a few percent either way, barely enough to notice. There may well be non-linear effects in the sensor, but since I use the same sensor/camera each time, they should be the same each time. I compared the RGB values of the sharpened image of Saturn with the pre-sharpened version, and the difference in red/green and blue/green ratios was around 1%, well within the error bounds for the measurement.

 

Whether I will be able to produce scientifically accurate images is probably doubtful, but I would like to get as close as possible. Using measured spectra from a reliable source as a target to aim for goes a long way towards this lofty goal.

 

Andrew


Edited by Tulloch, 12 September 2020 - 03:58 AM.

  • happylimpet likes this

#7 Tom Glenn

Tom Glenn

    Mercury-Atlas

  • -----
  • Posts: 2,956
  • Joined: 07 Feb 2018
  • Loc: San Diego, CA

Posted 12 September 2020 - 03:39 AM

Andrew, I think the fact that professional agencies that specialize in astro imaging specifically state that "true color" is aesthetic and not scientific speaks volumes.  But good luck in your endeavor!  



#8 Tulloch

Tulloch

    Vanguard

  • *****
  • topic starter
  • Posts: 2,149
  • Joined: 02 Mar 2019
  • Loc: Melbourne, Australia

Posted 12 September 2020 - 04:00 AM

Andrew, I think the fact that professional agencies that specialize in astro imaging specifically state that "true color" is aesthetic and not scientific speaks volumes.  But good luck in your endeavor!  

Thanks - I think the only reason they publish images that "look good" is so they can continue to be funded. I suspect if they actually showed the planets as bland as they really are, people might think it's not worth the money...


  • happylimpet likes this

#9 Tom Glenn

Tom Glenn

    Mercury-Atlas

  • -----
  • Posts: 2,956
  • Joined: 07 Feb 2018
  • Loc: San Diego, CA

Posted 12 September 2020 - 04:14 AM

Thanks - I think the only reason they publish images that "look good" is so they can continue to be funded. I suspect if they actually showed the planets as bland as they really are, people might think it's not worth the money...

You're not far off from the truth here.  "Real" data is often very bland, and only certain processing techniques are allowed for publication.  Much of what is done in amateur photography circles (including on this forum) would be strictly forbidden.  "True color" is never considered, and is only offered as an afterthought for public outreach, because it doesn't exist, and is not captured in the raw data.  



#10 Tulloch

Tulloch

    Vanguard

  • *****
  • topic starter
  • Posts: 2,149
  • Joined: 02 Mar 2019
  • Loc: Melbourne, Australia

Posted 12 September 2020 - 04:19 AM

Hopefully this will be my last posting on this subject. 

 

NOOOOOOO!

Yep, I don't think I have the energy any more; I'm not sure if there's anything else to try. Unless someone measures the planets, or NASA launches a new satellite with an imaging spectrophotometer that can scan the entire disc, I don't know where else to go.

 

Yes, Sharkmelley was a great help to me when I had hit the wall trying to find white balance corrections for the ASI224MC. There are a lot of scientists and engineers in this caper (probably unsurprisingly) willing to help and/or play devil's advocate to push you along the path.


Edited by Tulloch, 12 September 2020 - 04:20 AM.


#11 Tulloch

Tulloch

    Vanguard

  • *****
  • topic starter
  • Posts: 2,149
  • Joined: 02 Mar 2019
  • Loc: Melbourne, Australia

Posted 12 September 2020 - 04:30 AM

"Real" data is often very bland, and only certain processing techniques are allowed for publication. Much of what is done in amateur photography circles (including on this forum) would be strictly forbidden.  

 

"True color" is never considered, and is only offered as an afterthought for public outreach, because it doesn't exist, and is not captured in the raw data.  

Absolutely understand that only properly sanctioned techniques would be permitted in a scientific publication.

 

I would highly recommend you have a look at the Karkoschka paper; it's as close to correct data as I've seen, and he did go to the effort of converting the spectra to "true colour" tri-stimulus values. For some reason he also converted the spectra to (UBV) coordinates normally reserved for classifying stars, maybe just from habit?



#12 Tom Glenn

Tom Glenn

    Mercury-Atlas

  • -----
  • Posts: 2,956
  • Joined: 07 Feb 2018
  • Loc: San Diego, CA

Posted 12 September 2020 - 04:42 AM

Absolutely understand that only properly sanctioned techniques would be permitted in a scientific publication.

 

I would highly recommend you have a look at the Karkoschka paper; it's as close to correct data as I've seen, and he did go to the effort of converting the spectra to "true colour" tri-stimulus values. For some reason he also converted the spectra to (UBV) coordinates normally reserved for classifying stars, maybe just from habit?

I'll take a look at it, Andrew.  Your threads have had many good and informative discussions in them. I started out viewing things a bit more like you, thinking that everything should be capable of being standardized in some repeatable way.  But there are so many variables that it really seems unlikely.  Even aside from color, many of the problems we have been discussing also apply to luminance values, and so photometric analysis of amateur images is for the most part highly inaccurate.  In the end, I feel that most processing decisions become highly subjective (assuming we're working within the bounds of commonly accepted methods!).  It also gets to the heart of the difference between photography and science.  In science, an aesthetic image is always nice, but not required to demonstrate an important point.  In photography, an aesthetic image is essential.  Astrophotography becomes tricky, because people try to do both... although sometimes the scientific accuracy in the images may be less than initially perceived.  But it's fun to think about.



#13 yock1960

yock1960

    Fly Me to the Moon

  • *****
  • Posts: 5,065
  • Joined: 22 Jun 2008
  • Loc: (Crossroad of clouds) Ohio, USA

Posted 12 September 2020 - 05:27 AM

Are you going to start a thread for color balance in DSO's next?

Just kidding!

 

My aim is reasonable accuracy and pretty...both subjectively decided by me!

 

That's one good thing about false color images though... but it doesn't make the subjective goals any easier to achieve.

 

Steve 



#14 Tulloch

Tulloch

    Vanguard

  • *****
  • topic starter
  • Posts: 2,149
  • Joined: 02 Mar 2019
  • Loc: Melbourne, Australia

Posted 12 September 2020 - 05:32 AM

You're most likely correct in all of this Tom, and I appreciate your responses, far better than getting no feedback at all. What started as a simple question about how to balance the red and blue values led to this rabbit hole.

 

If only there was a giant 18% grey card in the sky ...


  • happylimpet likes this

#15 Tulloch

Tulloch

    Vanguard

  • *****
  • topic starter
  • Posts: 2,149
  • Joined: 02 Mar 2019
  • Loc: Melbourne, Australia

Posted 12 September 2020 - 05:44 AM

Are you going to start a thread for color balance in DSO's next?

Just kidding!

 

My aim is reasonable accuracy and pretty...both subjectively decided by me!

 

That's one good thing about false color images though... but it doesn't make the subjective goals any easier to achieve.

 

Steve 

Heh heh, one of the main reasons I got out of nebula imaging (I still like the idea of imaging galaxies) was the complete lack of any standard method of creating a colour correct image. What is the "correct" colour of something you can't see anyway? That, and not having the correct equipment or dark enough skies. But mostly for the colour ambiguity. 



#16 BQ Octantis

BQ Octantis

    Soyuz

  • *****
  • Posts: 3,788
  • Joined: 29 Apr 2017
  • Loc: Red Centre, Oz

Posted 12 September 2020 - 06:12 AM

Heh heh, one of the main reasons I got out of nebula imaging (I still like the idea of imaging galaxies) was the complete lack of any standard method of creating a colour correct image. What is the "correct" colour of something you can't see anyway? That, and not having the correct equipment or dark enough skies. But mostly for the colour ambiguity. 

It's just beauty that sells, not perfect accuracy. To date, I've produced 4 astrophotography books of my targets over the outback—and they were all immediately claimed by family and colleagues. And of 20 pages, I dedicated just one to planets.

 

BQ


Edited by BQ Octantis, 12 September 2020 - 06:24 AM.

  • Tulloch likes this

#17 dcaponeii

dcaponeii

    Viking 1

  • -----
  • Posts: 595
  • Joined: 01 Sep 2019

Posted 12 September 2020 - 07:12 AM

All I can say is you're not done until you try the ASI290MC camera, and probably the new one too.  ASI482MC??  My memory is bad.

 

EDIT:  ASI462MC  (I looked it up cause it was driving me nuts.)


Edited by dcaponeii, 12 September 2020 - 07:13 AM.


#18 Tom Glenn

Tom Glenn

    Mercury-Atlas

  • -----
  • Posts: 2,956
  • Joined: 07 Feb 2018
  • Loc: San Diego, CA

Posted 12 September 2020 - 12:31 PM

If only there was a giant 18% grey card in the sky ...

This wouldn't help much!  18% gray card is still subjective for "neutral", and there is debate over whether "neutral" should be anywhere from 12%-18% gray.  Also relevant is that reflected light (of average scenes on Earth) is actually less than 18% in almost all cases, more like 12-13%.  This makes the Moon essentially a really nice gray card in the sky, but it doesn't help solve color debates much!  Also, where would your gray card be located?  It would need to be directly next to each planet.  But even still, this would only help white balance an image, but as we learned in one of your other threads, white balancing is only part of the issue, and a color correction matrix is used by all consumer cameras to "correctly" map colors in the image.  A correctly white balanced image just means that neutral colors will appear neutral, but doesn't mean that non-neutral colors are all accurately displayed.  
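Tom's distinction can be made concrete: white balance is a diagonal matrix (independent per-channel gains), while a colour correction matrix mixes channels. A toy numerical sketch with invented values:

```python
import numpy as np

# Made-up raw camera responses (linear): a neutral patch and a red patch.
neutral_raw = np.array([0.8, 1.0, 0.6])   # camera sees grey with a cast
red_raw     = np.array([0.5, 0.3, 0.1])

# White balance: diagonal gains chosen so the neutral patch comes out equal.
wb = np.diag(1.0 / neutral_raw)
print(wb @ neutral_raw)   # -> [1. 1. 1.]  neutral is now neutral
print(wb @ red_raw)       # non-neutral colours are NOT necessarily correct yet

# A colour correction matrix (invented for illustration) mixes channels to
# map white-balanced camera RGB towards a reference space such as sRGB.
# Its rows sum to 1, so neutrals stay neutral while other colours move.
ccm = np.array([[ 1.6, -0.4, -0.2],
                [-0.3,  1.5, -0.2],
                [ 0.0, -0.5,  1.5]])
print(ccm @ (wb @ red_raw))
```

The key point matches the text above: the diagonal step alone fixes only the neutral axis; the off-diagonal terms of the CCM are what adjust everything else.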


Edited by Tom Glenn, 12 September 2020 - 12:31 PM.


#19 RedLionNJ

RedLionNJ

    Skylab

  • *****
  • Posts: 4,126
  • Joined: 29 Dec 2009
  • Loc: Red Lion, NJ, USA

Posted 12 September 2020 - 12:44 PM

While this discussion (about the Nth one on this particular topic, over the years) has been interesting, it does very little to promote a resolution to the practical problems:

 

I capture data and manipulate that into an image file.

 

1.  How closely can I make it resemble what I see visually (through an eyepiece), colour-wise?

2. This is on one display device. Can I make it look the same on my laptop AND on my phone?

3. What's it going to look like on the billions of other devices in the world? I can try to produce something which looks good on mine, but I can't control my own tablet or phone, never mind anybody else's monitor or phone :(

 

Step 1 alone is substantial in magnitude. Step 2 is likely insurmountable. Step 3 is futile.


  • happylimpet likes this

#20 JMP

JMP

    Viking 1

  • *****
  • Posts: 995
  • Joined: 31 Oct 2005

Posted 12 September 2020 - 12:47 PM

Thanks, Andrew. I appreciate this effort. I like the idea of producing images that resemble what I can see in the eyepiece; hitting the auto white balance button doesn't come close!



#21 Tulloch

Tulloch

    Vanguard

  • *****
  • topic starter
  • Posts: 2,149
  • Joined: 02 Mar 2019
  • Loc: Melbourne, Australia

Posted 12 September 2020 - 05:35 PM

All I can say is you're not done until you to the ASI290MC camera and probably the new one too  ASI482MC??  My memory is bad.

 

EDIT:  ASI462MC  (I looked it up cause it was driving me nuts.)

This is more about producing a "gold standard" by which all the different cameras can be judged. The Colour Checker chart is an example of what professional photographers use to produce "correct" colours in their scenes, as the lighting source varies widely. The camera doesn't matter, as it can be calibrated to this standard.

 

This wouldn't help much!  18% gray card is still subjective for "neutral", and there is debate over whether "neutral" should be anywhere from 12%-18% gray.  Also relevant is that reflected light (of average scenes on Earth) is actually less than 18% in almost all cases, more like 12-13%.  This makes the Moon essentially a really nice gray card in the sky, but it doesn't help solve color debates much!  Also, where would your gray card be located?  It would need to be directly next to each planet.  But even still, this would only help white balance an image, but as we learned in one of your other threads, white balancing is only part of the issue, and a color correction matrix is used by all consumer cameras to "correctly" map colors in the image.  A correctly white balanced image just means that neutral colors will appear neutral, but doesn't mean that non-neutral colors are all accurately displayed.  

I reckon that if there was a nice big grey card in space (OK, so maybe a few at certain key distances), uniformly illuminated by the Sun, then it could be used to create the colour correction matrix for the camera (as I slowly and painstakingly attempted previously for the ASI224MC), and thereby use the same techniques as a grey card for normal photography.
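Andrew's grey-card-in-space idea is essentially how colour correction matrices are fitted on the ground: image patches of known reference colour, then solve a least-squares 3x3 matrix mapping the camera's measured values to the references. A minimal sketch with synthetic patches (the `distort` matrix is invented purely so the recovery can be checked):

```python
import numpy as np

def fit_ccm(measured, reference):
    """Least-squares 3x3 matrix M such that measured @ M.T ~ reference.
    measured, reference: (N, 3) arrays of linear RGB patch values."""
    M, *_ = np.linalg.lstsq(measured, reference, rcond=None)
    return M.T

# Synthetic example: pretend the camera applies a known 3x3 distortion,
# then recover its inverse from six "photographed" patches.
rng = np.random.default_rng(0)
reference = rng.uniform(0.1, 0.9, size=(6, 3))
distort = np.array([[0.9, 0.2, 0.0],
                    [0.1, 0.8, 0.1],
                    [0.0, 0.1, 0.9]])
measured = reference @ distort.T
ccm = fit_ccm(measured, reference)
```

With real data the fit is only approximate (noise, sensor non-linearity), which is why more than three patches are used.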

 

While this discussion (about the Nth one on this particular topic, over the years) has been interesting, it does very little to promote a resolution to the practical problems:

 

I capture data and manipulate that into an image file.

 

1.  How closely can I make it resemble what I see visually (through an eyepiece), colour-wise?

2. This is on one display device. Can I make it look the same on my laptop AND on my phone?

3. What's it going to look like on the billions of other devices in the world? I can try to produce something which looks good on mine, but I can't control my own tablet or phone, never mind anybody else's monitor or phone :(

 

Step 1 alone is substantial in magnitude. Step 2 is likely insurmountable. Step 3 is futile.

Thanks Grant, I guess what I'm trying to do is separate the perception of what the planets look like to you or me from the actual quantifiable reflectance (or albedo) over the visible spectrum. There is a well-defined transformation from the reflectance spectrum to "colour", and whether the result is displayed on a computer screen or a phone in RGB, or on a piece of paper in CMYK, doesn't matter. It doesn't even matter if we perceive colours differently; we can both look at the squares on a colour chart and perceive them differently, but they still have a defined colour value.

 

To your specific questions:

1. None of the images I create with my camera even remotely resemble what I perceive through the eyepiece. In another thread (where I discuss purchasing a new eyepiece for better views of the planets), BQ uploaded some of his raw video footage which looks a lot like what I see through the eyepiece. Since he uses the eyepiece projection method to create his images, this is highly relevant. If I were to recreate this appearance on my images I think I know what the response would be :).

 

2. The only way I know to recreate the same appearance on multiple devices is to use a dedicated display calibration device (like this one); not sure if they make them for phones though. My iPhone tends to oversaturate and blow out the contrast; I assume this is deliberate to get good reviews in electronics magazines, but I certainly wouldn't trust it for colour accuracy.

 

3. Does it really matter what your images look like on devices that you cannot control? I would find satisfaction in knowing that I produced the best, most accurate data I could with the equipment I had - if someone else thinks my colours are wrong, or not saturated enough etc, then I would be willing to change if they could show proof as to why.

 

And then of course, the ultimate way to display your images in their "true" form would be to produce them in book form, something like this, which I purchased before buying my first telescope.

https://www.google.c...AJ?hl=en&gbpv=0

 

It purports to show images of the planets in "true colour", and some of the images of Jupiter and Saturn taken by Cassini in "true colour" are shockingly different from what we normally perceive as "correct". In the book, the image of Uranus has almost exactly the same colour cast as what I showed above; Neptune is a little deeper blue than mine.

 

Andrew


  • RedLionNJ and dcaponeii like this

#22 Tom Glenn

Tom Glenn

    Mercury-Atlas

  • -----
  • Posts: 2,956
  • Joined: 07 Feb 2018
  • Loc: San Diego, CA

Posted 12 September 2020 - 05:57 PM

Andrew, at the risk of going round in circles here, your goal of representing "actual quantifiable reflectance (or albedo) over the visible spectrum" (your quote from above) is going to be impossible to record with the cameras we use.  The bandpass of the filters used, whether the Bayer matrix filters if a color camera, or the individual filters if mono, will affect the result because of the difference in overlap of wavelengths between the filters.  And human vision has even more substantial overlap in the sensitivities of cone cells in the retina, which is why the concept of "true" color is artifactual to begin with.  With proper equipment, you could in fact create a spectrum for any object, but this wouldn't help clarify any of the questions we face for producing a color image.  Also worth noting, because you mention color checker charts, is that the vast majority of professional photographs, such as landscapes, etc, do not represent "true" color that has been checked with any charts, but are rather artistic interpretations that make a pretty picture (as judged by either the photographer, or their intended audiences).  



#23 Tulloch

Tulloch

    Vanguard

  • *****
  • topic starter
  • Posts: 2,149
  • Joined: 02 Mar 2019
  • Loc: Melbourne, Australia

Posted 12 September 2020 - 07:44 PM

Sure Tom, it's not my intention to produce absolute reflectance values with these cameras (which as you say is impossible), rather to use the results from instruments that can measure them as a standard to aim for - hence my use of average red/green and blue/green ratios as a means of supplying some sort of measure for comparison purposes. It's quite a crude metric, I understand, but it's something I can use easily to get close to the correct colour balance without spending too much time in post-processing.

 

As BQ said, beauty sells, not perfect accuracy, so it's not surprising that professional photographers artificially increase saturation or boost particular colours at the expense of accuracy to sell images. But they (usually) know their equipment well enough to know how it responds to different settings to produce a pleasing result, in the same way that NASA post ultra-saturated Hubble images of the planets for the general public to ooh and aah over.

 

There seems to be a common misunderstanding of what I'm trying to achieve here - I'm not interested in making these images look the same to everyone on every device, because that is impossible. The cones and rods in every person's eyes are unique, as are the methods their brains use to process that data. I cannot explain to anyone else the colour of an object as seen through my eyes, as it is unique to my experience, just as yours is to you. What I can do is say that the object reflects light across the visible spectrum to produce the values shown on a graph, for instance the data from Karkoschka shown below.

 

Given this reflectance data, it is then possible to convert these values into effective XYZ, xyY or sRGB coordinates for a particular whitepoint (here defined as D65) to produce the images shown above. I might see them as quite green looking, you might see them as yellow, but that doesn't matter to me in the slightest. If (and I really mean "if", as I have no idea; for Mars especially, a good proportion of the blue wavelengths are made up) the images shown above have the "correct" colour balance in D65 colour space, then on a properly calibrated monitor, as a paper copy, or even as a 3D model (in a room with a properly calibrated light source), the result should be the same and should provide a representation of what the planet actually looks like in real life.
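The reflectance-to-sRGB conversion described above can be sketched in Python. This is a generic illustration: the CIE 1931 colour-matching functions and the illuminant spectrum (e.g. D65) are assumed to be supplied as arrays sampled on a uniform wavelength grid, and the matrix is the standard XYZ-to-linear-sRGB matrix for a D65 white point.

```python
import numpy as np

# Standard XYZ -> linear sRGB matrix (D65 white point)
XYZ_TO_SRGB = np.array([
    [ 3.2406, -1.5372, -0.4986],
    [-0.9689,  1.8758,  0.0415],
    [ 0.0557, -0.2040,  1.0570],
])

def reflectance_to_srgb(refl, illum, xbar, ybar, zbar):
    """Convert a sampled reflectance spectrum to gamma-encoded sRGB.

    All inputs are arrays sampled on the same uniform wavelength grid:
    refl  - reflectance (albedo) of the object, 0..1
    illum - illuminant spectral power distribution (e.g. D65)
    xbar, ybar, zbar - CIE 1931 colour-matching functions
    """
    k = 1.0 / np.sum(illum * ybar)  # normalise so the illuminant itself has Y = 1
    xyz = k * np.array([np.sum(refl * illum * cmf) for cmf in (xbar, ybar, zbar)])
    rgb = np.clip(XYZ_TO_SRGB @ xyz, 0.0, 1.0)  # linear sRGB, clipped to gamut
    # sRGB transfer function ("gamma" encoding)
    return np.where(rgb <= 0.0031308,
                    12.92 * rgb,
                    1.055 * rgb ** (1 / 2.4) - 0.055)
```

Feeding in a planet's measured albedo curve (such as Karkoschka's data) with the D65 spectrum then yields the displayable RGB triple for that whitepoint.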

 

Maybe...

 

Andrew

Attached Thumbnails

  • Albedo of gas planets.JPG

Edited by Tulloch, 12 September 2020 - 08:07 PM.


#24 Tom Glenn

Tom Glenn

    Mercury-Atlas

  • -----
  • Posts: 2,956
  • Joined: 07 Feb 2018
  • Loc: San Diego, CA

Posted 12 September 2020 - 08:04 PM

 then the result should be the same and should provide a representation of what the planet actually looks like in real life.

 

Maybe...

 

 

Define "real life".  Is it how the planet looks through the eyepiece?  If so, what magnification, and what aperture scope?  Is it how the planet would look if you were viewing from Earth orbit outside the influence of atmosphere?  Or is it how the planet would look if you were viewing from a spacecraft in the vicinity of the planet?  If so, how close are you to the planet, and what phase angle are you viewing with reference to the planet and Sun?



#25 BQ Octantis

BQ Octantis

    Soyuz

  • *****
  • Posts: 3,788
  • Joined: 29 Apr 2017
  • Loc: Red Centre, Oz

Posted 12 September 2020 - 08:12 PM

Wait, are we after accurate measurement or accurate representation?

 

Let's follow the formation of an image—independent of the human eyeball:

 

  1. There are photons departing the sun across the electromagnetic spectrum.
  2. The photons pass through the interplanetary dust. Some are reflected. Some absorbed. All differently by wavelength.
  3. They reach and interact with a planet's atmosphere. Some are absorbed, some are reflected. All differently by wavelength. In the case of Mars, they reach the surface. Some are absorbed, some are reflected. And once again they pass through the planet's atmosphere. Once they clear the atmosphere, this is the ground truth wavefront of photons.
  4. They once again pass through the interplanetary dust. Some are reflected. Some are absorbed. All differently by wavelength.
  5. They reach the Earth's atmosphere. They pass through nitrogen, oxygen, carbon dioxide, dust, water vapor, clouds, smoke, smog, etc. Some are absorbed. Some are reflected. Many are scattered. All differently by wavelength. And the wavefront across the light column of every photon wavelength is uniquely distorted by the angle of incidence plus the integral of the optical turbulence along the light path.
  6. They reach an aperture, along with photons originating more locally. The aperture converts the incoming rays of the light column into a two-dimensional spectrogram. The aperture's Airy disk continuum combined with the wavefront error mangle the rays uniquely by wavelength.
  7. If the optic is glass, it brings the spectrogram into focus on separate planes by wavelength along the light path.
  8. For an RGB sensor, the photons pass through a Bayer filter matrix that splits up the spectrogram into a raster onto a silicon substrate covered with photosites beneath each filter. Each Bayer filter only allows a range of photon wavelengths through.
  9. The photons in each filter range interact with the substrate and produce electron-hole pairs and thereby a voltage.
  10. Heat in the substrate also releases electron-hole pairs that are added to the voltage.
  11. The sensor electronics count the electrons (voltage) at each photosite and report a value based on a voltage-to-value conversion table. The process of measuring the voltage introduces error in the measurement.
  12. The values are saved by the electronics to a file by Bayer square.
  13. A computer converts the Bayer data into Red, Green, and Blue values (0-255) organized by pixel.
  14. The computer also applies a power-law ("gamma") function to the linear RGB values to produce a representation more comparable to the retinal response of a human eyeball.
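Step 14 in the list above can be sketched as follows (a simple power-law encoder; real raw converters use the piecewise sRGB curve, but the idea is the same):

```python
import numpy as np

def encode_gamma(linear, gamma=2.2):
    """Map linear sensor values (0..1) through a power-law 'gamma' curve
    so that equal numeric steps look roughly perceptually even."""
    return np.clip(linear, 0.0, 1.0) ** (1.0 / gamma)
```

Note how the midtones brighten: a linear value of 0.5 comes out at about 0.73, which is why un-gamma'd linear data looks so dark on screen.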

 

So what are we trying to perfect the accuracy of in this flow? It seems like most of the effort is on Step 11. But the real problem lies in the completely random nature of Steps 4 & especially 5. In producing an image at Step 14, we're demangling everything between Step 3 and Step 14 to make Step 14 look like it was taken at Step 3. Independent of the eyeball and screen, we're just trying to make the histograms at Step 3 and Step 14 match at every point of the rasterized spectrogram.

 

Even with a perfect camera, what we detect, record, and represent isn't very palatable to the eyeball. (This is even more of an issue for deep space imaging—what is ultimately displayed is in no way reflective of what was actually recorded!) So instead, we manipulate the histogram at Step 14 to match the histogram from data sampled at Step 3 by remote probe. The probe data skips Steps 4 & 5 and makes the effect of Step 6 much smaller than from where we sit. In the end, this may look good, but by doing so we've completely lost the actual measurement accuracy.
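The histogram-matching idea described above (reshaping the Step 14 data to match reference data sampled at Step 3) can be sketched with the classic CDF-mapping technique. This is a generic illustration, not anyone's actual pipeline:

```python
import numpy as np

def match_histogram(source, reference):
    """Remap source values so their distribution matches the reference's.

    Builds the cumulative distribution (CDF) of both arrays and maps each
    source value to the reference value at the same cumulative rank.
    """
    s_vals, s_counts = np.unique(source.ravel(), return_counts=True)
    r_vals, r_counts = np.unique(reference.ravel(), return_counts=True)
    s_cdf = np.cumsum(s_counts) / source.size
    r_cdf = np.cumsum(r_counts) / reference.size
    mapped = np.interp(s_cdf, r_cdf, r_vals)  # rank -> reference value
    idx = np.searchsorted(s_vals, source.ravel())
    return mapped[idx].reshape(source.shape)
```

Applied per colour channel against reference data (e.g. probe measurements), this forces the image's tonal distribution toward the reference, which is exactly where, as BQ notes, the raw measurement accuracy gets traded away.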

 

BQ


Edited by BQ Octantis, 12 September 2020 - 08:17 PM.


