
Planetary imaging and Colour Accuracy

62 replies to this topic

#1 Tulloch

Tulloch

    Fly Me to the Moon

  • *****
  • topic starter
  • Posts: 6,367
  • Joined: 02 Mar 2019
  • Loc: Melbourne, Australia

Posted 14 March 2020 - 10:47 PM

Part 1

 

Hi all, some of you may be aware that I am interested (some may say obsessed) in creating planetary images with as accurate a colour rendition as possible. Maybe it's my inner scientist showing up; all the personality tests say I have a preference for getting the details right while ignoring the big picture.

 

I've worked up to this over a period of time, firstly by calibrating my monitor using a Spyder monitor checker to get my screen to show colours accurately. Next I looked into Registax's "auto RGB balance" feature, which seems to do a reasonable job of white balancing images that contain a range of colours (e.g. Jupiter and Saturn) but is no good for single-colour planets like Uranus, Neptune and, to a lesser extent, Mars. I then looked into using a G2V star as a calibration target when imaging planets, with mixed success (well, at least the jury is still out on whether this technique is valid or not).
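For anyone curious, an "auto RGB balance" of that kind is essentially a grey-world assumption: scale each channel so its average matches a common level. A minimal Python/NumPy sketch of the idea (my illustration only, not Registax's actual algorithm):

```python
import numpy as np

def gray_world_balance(img):
    """Scale R and B so each channel's mean matches the green mean,
    roughly what an auto RGB balance does."""
    img = img.astype(np.float64)
    means = img.reshape(-1, 3).mean(axis=0)   # per-channel means [R, G, B]
    gains = means[1] / means                  # normalise to the green mean
    return np.clip(np.rint(img * gains), 0, 255).astype(np.uint8)

# A patch that is grey in reality but was captured with a blue cast:
patch = np.full((4, 4, 3), [100, 100, 140], dtype=np.uint8)
balanced = gray_world_balance(patch)          # each pixel becomes [100, 100, 100]
```

This works when the scene really does average out to grey, but on a planet that genuinely is one overall colour the same maths wrongly neutralises the real tint, which is exactly the failure on Uranus, Neptune and Mars.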

 

The real difficulty with these techniques is the lack of a good standard against which to test the accuracy of the camera and processing procedures, which is why I have now taken the next step of imaging a GretagMacbeth 24-panel ColorChecker card with my two cameras, the Canon 700D and ASI224MC. These cards have accurately known L*a*b* values, which can be converted to RGB using various algorithms. For these tests I followed the advice from here, imaging the checker card under lighting conditions approximating D50 using a couple of techniques: through a lens, and through the scope. The lens tests are much easier to perform; setting up the scope outside and asking your assistant to hold the checker card still enough around 100m away, while you attempt to focus and get the histogram right, is a whole new level of difficulty. My hope was that by taking an image of the ColorChecker card I would be able to make some simple changes in Registax, Photoshop etc. to get some parameters for colour balancing that I could use as default values.
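For reference, one such conversion (CIE L*a*b* to 8-bit sRGB via XYZ) can be sketched as below. For simplicity this sketch uses the D65 reference white and the standard sRGB matrix; a D50-referenced chart strictly needs a chromatic adaptation step as well, so treat it as an approximation rather than the exact algorithm I used:

```python
def lab_to_srgb(L, a, b):
    """Convert a CIE L*a*b* colour to an 8-bit sRGB tuple (D65 white)."""
    # L*a*b* -> XYZ
    Xn, Yn, Zn = 0.95047, 1.0, 1.08883
    fy = (L + 16) / 116
    fx, fz = fy + a / 500, fy - b / 200
    def finv(t):
        return t ** 3 if t > 6 / 29 else 3 * (6 / 29) ** 2 * (t - 4 / 29)
    X, Y, Z = Xn * finv(fx), Yn * finv(fy), Zn * finv(fz)
    # XYZ -> linear sRGB
    r = 3.2406 * X - 1.5372 * Y - 0.4986 * Z
    g = -0.9689 * X + 1.8758 * Y + 0.0415 * Z
    bl = 0.0557 * X - 0.2040 * Y + 1.0570 * Z
    # gamma-encode and scale to 8 bits
    def enc(c):
        c = min(max(c, 0.0), 1.0)
        c = 12.92 * c if c <= 0.0031308 else 1.055 * c ** (1 / 2.4) - 0.055
        return round(c * 255)
    return enc(r), enc(g), enc(bl)
```

A sanity check: a neutral patch (a* = b* = 0) should come out with equal R, G and B, and L* = 100 should map to pure white.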

 

Firstly, I took images with my Canon 700D using the "Daylight" white balance setting and "Faithful" colour rendition, capturing both raw and large JPG files. The auto exposure setting was a little high, so I reduced the exposure by 1 stop and placed the actual "true" colours as small patches over the top of the real image. The first image below shows that the DSLR is able to produce reasonably accurate colour balance straight out of the box, with only slight differences between the captured and "true" result (the middle box shows the "true" colours). The top section shows the image of the chart using a Canon 28-105mm lens; the bottom image shows the chart through my C9.25" telescope at prime focus. No colour modification was made to these images.

 

Next, I tried the same technique using my ZWO ASI224MC planetary camera. I connected the fisheye lens for the close images, and imaged the card through the C9.25" at the same time as the DSLR. I tried to fill the histogram as high as possible given the wide range of colours and shades on the chart. As captured, the colours from the chart were significantly different from those expected. I played with these images in a number of different ways to try to get the colours closer, from using the auto RGB balance in Registax (which usually does well) to using the "Remove colour cast" (auto white balance) setting in Photoshop, and adjusting the saturation/levels settings to try and get something close. As can be seen in the second image below, nothing seemed to work very well.

 

To be continued...

 

Andrew

 

Firecapture settings for the 224MC were:

 

FireCapture v2.6  Settings
------------------------------------
Camera=ZWO ASI224MC
Filter=L
Profile=Test
Filename=2020-03-07-0432_2-L-Test.avi
Date=20200307
Start=043157.519
Mid=043212.423
End=043227.327
Start(UT)=043157.519
Mid(UT)=043212.423
End(UT)=043227.327
Duration=29.808s
Date_format=yyyyMMdd
Time_format=HHmmss
LT=UT
Frames captured=4000
File type=AVI
Extended AVI mode=true
Compressed AVI=false
Binning=no
ROI=1304x976
ROI(Offset)=0x0
FPS (avg.)=134
Shutter=0.100ms
Gain=37 (6%)
AutoGain=off
HardwareBin=off
FPS=100 (off)
WBlue=95 (off)
Brightness=1 (off)
USBTraffic=90 (off)
SoftwareGain=10 (off)
HighSpeed=on
AutoExposure=off
WRed=52 (off)
AutoHisto=75 (off)
Gamma=50 (off)
Histogramm(min)=0
Histogramm(max)=230
Histogramm=90%
Noise(avg.deviation)=0.78
Limit=4000 Frames
Sensor temperature=44.2°C

Attached Thumbnails

  • Canon 700D colour check through lens and OTA small.jpg
  • ZWO ASI224MC with fisheye lens colour comparisons small.jpg
  • ZWO ASI224MC colour check through OTA small.jpg

  • sharkmelley, Ethan Chappel and aeroman4907 like this

#2 Tulloch

Tulloch

    Fly Me to the Moon

  • *****
  • topic starter
  • Posts: 6,367
  • Joined: 02 Mar 2019
  • Loc: Melbourne, Australia

Posted 14 March 2020 - 11:14 PM

Part 2.

 

OK, so simple colour shifts aren't going to fix the colour balance on the ASI224MC, so I started developing a program that takes the raw colours from the camera, finds an adjustment curve to correct the shift, and then applies that correction to the data to recover the correct colours. For this, I measured the RGB values of each of the panels in PS and used Excel to find a curve of best fit for each colour. This seemed to be going well for the red and green components; however, the blue component didn't converge as well as the other colours. I wrote a program in VB to read in the raw image, apply the correction to each colour component and save the result. This produced an image very similar to the previous best result, but the colours still didn't match accurately enough.
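My actual program is in VB, but the Excel curve-fit step looks like this in Python/NumPy terms (the sample values below are made up purely for illustration):

```python
import numpy as np

# Hypothetical paired samples for one channel: camera-measured patch values
# versus the known "true" ColorChecker values (0-255 scale).
measured = np.array([12, 40, 75, 110, 150, 200, 240], dtype=float)
true_val = np.array([20, 55, 90, 125, 160, 205, 235], dtype=float)

# Least-squares quadratic mapping measured -> true (the "curve of best fit").
coeffs = np.polyfit(measured, true_val, deg=2)

def correct(channel):
    """Apply the fitted correction curve to a channel array."""
    return np.clip(np.polyval(coeffs, channel), 0, 255)

corrected = correct(measured)
```

The same fit is repeated independently for the red, green and blue channels, and the resulting curves are what get applied to the raw image.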

 

I then looked more closely at the "true" vs measured values on a graph. As shown in the figures below, the red and green curves (generally) show a one-to-one correspondence between real and measured values; however, the blue curve has a point of inflection. This appears to mean that it is actually impossible to construct a correction factor for converting the measured result back to the "real" result, as a one-to-many relationship exists, at least at low values of blue intensity.
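The problem can be stated simply: a measured-vs-true curve can only be inverted if it is strictly monotonic. A quick check, with hypothetical values shaped like the blue curve described above:

```python
import numpy as np

# Camera response at increasing "true" input levels; the dip at low
# intensities here is invented, but mimics the blue curve's behaviour.
true_levels   = np.array([0, 32, 64, 96, 128, 160, 192, 224, 255])
measured_blue = np.array([30, 55, 45, 60, 90, 130, 175, 215, 250])

def is_invertible(y):
    """A curve maps back one-to-one only if strictly monotonic;
    any dip makes the inverse one-to-many."""
    d = np.diff(y)
    return bool(np.all(d > 0) or np.all(d < 0))
```

Here `is_invertible(measured_blue)` comes out False, which is the formal version of "no correction factor exists" for that channel.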

 

to be continued...

 

Andrew

Attached Thumbnails

  • ColourChecker ASI224MC test transformation.jpg
  • red curve.JPG
  • green curve.JPG
  • blue curve.JPG

  • Ethan Chappel and Foc like this

#3 Tulloch

Tulloch

    Fly Me to the Moon

  • *****
  • topic starter
  • Posts: 6,367
  • Joined: 02 Mar 2019
  • Loc: Melbourne, Australia

Posted 14 March 2020 - 11:31 PM

Part 3

 

I then thought the Wblue "magnification" setting in FireCapture might be causing this issue. The skies were clear, so I went outside again, set up the ASI224MC with the fisheye lens and ColorChecker chart, and took a number of videos with different values of Wblue, from 50 to 99. After recording (by hand) all the RGB values for the panels at Wblue values of 50, 60, 70, 80, 95 and 99, all I ended up with was a series of curves, each shifted by an amount equal to the Wblue setting (see below).

 

While this may have been a waste of time, one thing that did come out of it was that the various Wblue settings increased the blue intensity linearly while leaving the red and green values untouched. In fact, over all 6 panels, the red and green values changed by only +/- 1 unit, showing remarkable consistency. The RGB values for the grey panels are shown below as Wblue changes; as Wblue increases, the blue level gets closer to the green level until, at Wblue = 99, they are basically equal. For a pure grey colour, all RGB components should be the same. This indicates that the Wred value should be increased from 52 to around 68 (assuming the same linear response) so that the grey panels come out truly "grey".
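That extrapolation is just linear arithmetic. A sketch with hypothetical grey-patch readings (not my measured numbers, which are in the attached table):

```python
# Assuming the white-balance gain acts linearly on the channel level, find
# the Wred setting at which the red level would equal the green level.
def neutral_setting(s1, r1, s2, r2, green):
    """Linear extrapolation from two (setting, red level) samples."""
    slope = (r2 - r1) / (s2 - s1)
    return s1 + (green - r1) / slope

# e.g. if red reads 96 at Wred=50 and 104 at Wred=60, and green sits at 110:
wred = neutral_setting(50, 96, 60, 104, 110)   # -> 67.5, i.e. Wred of about 68
```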

 

Unfortunately the clouds are covering the sun at the moment, so I cannot test this right now. Stay tuned for part 4.

 

Andrew

Attached Thumbnails

  • Wblue settings.JPG
  • Grey RGB values.JPG

Edited by Tulloch, 14 March 2020 - 11:32 PM.

  • Ethan Chappel likes this

#4 Tom Glenn

Tom Glenn

    Soyuz

  • -----
  • Posts: 3,923
  • Joined: 07 Feb 2018
  • Loc: San Diego, CA

Posted 14 March 2020 - 11:33 PM

Andrew, it is probably impossible to do what you are attempting to do with a color camera.  The specific parameters of the individual color filters in the Bayer matrix and the methods of debayering are likely making this impossible to control.  It would be interesting to perform such a test with a monochrome camera and filters, in which you could control the channel mixing.  And even then, by the time you get the data in a format you can observe on your computer, a gamma curve has already been applied to it.  Many of the corrections you are searching for would probably have to be applied while the data is still linear.

 

But then there's the "big picture" question of what's the point, aside from intellectual interest?  Even if you were to perfectly calibrate a camera and filter set to faithfully reproduce colors as perceived on a sunny day (which looks to be impossible), this would only give you the colors of the planet, as perceived under your conditions, fully susceptible to elevation, air pollution, smoke, etc.  This comes back full circle to what I thought you were trying to prevent when you were searching for a standardized color (and your G2V star method).  Which means that perhaps the old "make it look nice" method is not so bad after all?



#5 BQ Octantis

BQ Octantis

    Cosmos

  • *****
  • Posts: 8,051
  • Joined: 29 Apr 2017
  • Loc: Northern Virginia, USA

Posted 15 March 2020 - 12:31 AM

What's really gonna blow your mind is this:

 

The light transmittance function of the atmosphere is dependent on

 

Elevation angle (a.k.a. altitude)

Time of night

Water content

Temperature

Dust

Pollutants

Clouds

Light Pollution

 

So even if you get your camera calibration perfect, you're still going to have to deal with these issues. My simple solution is to take the histogram of my image and match it to the one from my favorite rendition of a planet. For Jupiter, it involves the Cassini color palette and the width of my pinky and thumb nails for the white point and falloff of the histogram peak, respectively…
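That histogram-match step can also be done programmatically. This Python/NumPy sketch is a generic quantile-matching illustration of the idea, rather than my actual eyeballed pinky-and-thumbnail method:

```python
import numpy as np

def match_histogram(source, reference):
    """Remap one channel of `source` (uint8) so its value distribution
    follows that of `reference` (uint8), via quantile matching."""
    s_vals, s_counts = np.unique(source.ravel(), return_counts=True)
    r_vals, r_counts = np.unique(reference.ravel(), return_counts=True)
    s_cdf = np.cumsum(s_counts) / source.size
    r_cdf = np.cumsum(r_counts) / reference.size
    # For each source quantile, look up the reference value at that quantile,
    # then build a 256-entry lookup table for the whole channel.
    mapped = np.interp(s_cdf, r_cdf, r_vals)
    lut = np.interp(np.arange(256), s_vals, mapped)
    return lut[source].astype(np.uint8)

# Matching an image to itself must be the identity mapping:
ramp = np.arange(256, dtype=np.uint8).reshape(16, 16)
matched = match_histogram(ramp, ramp)
```

Run per channel against your favourite reference rendition and the overall tone curve follows it.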

 

BQ

 

P.S. Here's my pet peeve: The rings of Saturn are not white.

 

P.P.S. If you'd only experienced the taunts I've taken from the DSO experts on trying to do color calibration in non-linear color space!


Edited by BQ Octantis, 15 March 2020 - 12:46 AM.

  • Tulloch and WWilliams1977 like this

#6 Tom Glenn

Tom Glenn

    Soyuz

  • -----
  • Posts: 3,923
  • Joined: 07 Feb 2018
  • Loc: San Diego, CA

Posted 15 March 2020 - 12:48 AM

In a discussion in a previous thread with Steve (Aeroman), I had guessed that if a test such as the one Andrew has done were performed, the ASI224 (or any color astronomy camera) would give significantly different colors than terrestrial cameras.  My hunch was that the terrestrial camera would produce much more accurate colors, at least when photographing a color chart during the daytime in sunlight.  My reasoning is that consumer cameras, whether DSLR or mirrorless, have decades of color science built into them.  The "raw" files that come from these cameras have been considerably processed by the camera's internal processor.  If you spend much time on photography forums, there are constantly people arguing about which camera maker (usually Nikon, Canon, and Sony, but also some others) produces the most pleasing colors out of the camera.  It's not a trivial matter.


  • Tulloch likes this

#7 Tulloch

Tulloch

    Fly Me to the Moon

  • *****
  • topic starter
  • Posts: 6,367
  • Joined: 02 Mar 2019
  • Loc: Melbourne, Australia

Posted 15 March 2020 - 12:55 AM

Quoting Tom Glenn (#4): "Andrew, it is probably impossible to do what you are attempting to do with a color camera. [...] Which means that perhaps the old "make it look nice" method is not so bad after all?"

Hi Tom, thanks for your comments. I get it, really I do. But what I'd really like is a consistent colour balance on my ZWO camera. The fact that the Canon DSLR, using its "Daylight" WB setting, produces colours really very close to the "true" ColorChecker colours shows that it should be possible. Obviously Canon have been doing this for a lot longer than me, but what they have achieved is very good; why not for ZWO cameras? Even with mono cameras you still have to mix the colours somehow, so what do you use as a reference?

 

I still haven't given up on the G2V star idea, and I've also been doing some investigations into scattering of the spectrum at different wavelengths as a function of altitude angle, but that's not ready for further discussion just yet...



#8 Tom Glenn

Tom Glenn

    Soyuz

  • -----
  • Posts: 3,923
  • Joined: 07 Feb 2018
  • Loc: San Diego, CA

Posted 15 March 2020 - 12:59 AM

Andrew, I understand (I think!) what you are trying to do.  #1, there seems to be genuine interest in the subject at hand, and with that I can totally agree.  In fact, there need not be any resolution to this matter for us to have benefited from the discussion.  #2, you are trying to get consistent color balance.  I think this is impossible with the method you have outlined here, though.  Your G2V method is probably better suited for that.  As to why your Canon can do a better job, that is essentially my point.  The ZWO cameras are just sensors housed within aluminum bodies.  A camera like your Canon, or my Nikon, has the sensor embedded within a small computer, using proprietary software developed over many years by their engineers.  It's not surprising that the color balance, at least for terrestrial shots, is much better.


  • Tulloch likes this

#9 Tulloch

Tulloch

    Fly Me to the Moon

  • *****
  • topic starter
  • Posts: 6,367
  • Joined: 02 Mar 2019
  • Loc: Melbourne, Australia

Posted 15 March 2020 - 12:59 AM

Quoting BQ Octantis (#5): "What's really gonna blow your mind is this: The light transmittance function of the atmosphere is dependent on elevation angle, time of night, water content, temperature, dust, pollutants, clouds and light pollution. [...]"

Yep, that's why I was so interested in the G2V reference star method as a way of removing all those effects. However, if the colour balance of the camera is wrong to start with, it's all moot anyway!

 

Andrew



#10 Tulloch

Tulloch

    Fly Me to the Moon

  • *****
  • topic starter
  • Posts: 6,367
  • Joined: 02 Mar 2019
  • Loc: Melbourne, Australia

Posted 15 March 2020 - 01:08 AM

Quoting Tom Glenn (#8): "The ZWO cameras are just sensors housed within aluminum bodies. A camera like your Canon, or my Nikon, has the sensor embedded within a small computer, using proprietary software developed over many years by their engineers. [...]"

Hmmm, you're right, the DSLR gets to process the raw data - isn't it possible for FC to output the raw sensor data? Is that the FIT format?



#11 BQ Octantis

BQ Octantis

    Cosmos

  • *****
  • Posts: 8,051
  • Joined: 29 Apr 2017
  • Loc: Northern Virginia, USA

Posted 15 March 2020 - 01:32 AM

Yep, that's why I was so interested in the G2V reference star method as a way of removing all those effects. However, if the colour balance of the camera is wrong to start with, it's all moot anyway!

 

Andrew

The G2V will help with finding a singular white point ([255,255,255]—but only relevant to the part of the sky near the planet), but the trouble with the non-linear data is that there is no way to subtract a linear atmospheric correction value across a histogram of brightness levels that have been scaled nonlinearly (with a gamma typically of 2.2-2.5). Sharkmelley (one of my best DSO critics) recently posted a set of test files to illustrate what happens:

 

https://www.cloudyni...dge-test-files/
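The core of the problem fits in a few lines. This sketch assumes a pure 2.2 gamma (real sRGB uses a piecewise curve), and it is an illustration of the principle rather than anyone's actual pipeline:

```python
GAMMA = 2.2

def decode(v):
    """Gamma-encoded [0,1] value -> linear intensity."""
    return v ** GAMMA

def encode(v):
    """Linear intensity -> gamma-encoded [0,1] value."""
    return max(v, 0.0) ** (1.0 / GAMMA)

def subtract_sky(pixel, sky):
    """Correct subtraction: linearise, subtract the sky level, re-encode."""
    return encode(decode(pixel) - decode(sky))

# Subtracting the same offset directly in gamma space gives a different,
# wrong answer: subtract_sky(0.5, 0.2) is ~0.47, not 0.3.
```

The gap between the two answers is exactly why a flat subtraction across a gamma-scaled histogram can't work.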

 

I've been experimenting on linear data with using any HIP star for color calibration (with Stellarium and Landon Noll's handy color table for all star classes), but the CA out of my camera lenses makes the measurement of the starting star color dubious…

 

BQ

 

P.S. A non-linear subtraction based on pixel value would be quite simple to implement in Matlab, but I'm far too cheap and lazy to go down that rabbit hole! But if you create a cross-platform Photoshop plugin that does this, I'd be happy to pay for it…


Edited by BQ Octantis, 15 March 2020 - 02:00 AM.

  • Tulloch likes this

#12 Tom Glenn

Tom Glenn

    Soyuz

  • -----
  • Posts: 3,923
  • Joined: 07 Feb 2018
  • Loc: San Diego, CA

Posted 15 March 2020 - 01:33 AM

Hmmm, you're right, the DSLR gets to process the raw data - isn't it possible for FC to output the raw sensor data? Is that the FIT format?

I'm not sure.  I just know that the "raw" files that come from my Nikon look like they have had considerable processing done to them by the time I see them. Obviously, whatever raw converter you use (I use Adobe Camera Raw) has default settings that debayer and apply a gamma curve, but there is also some stuff done "in camera".  Just as an obvious example, my Nikon will clip to black in the raw file when I'm shooting the Moon, whereas this never happens with my ASI183mm.  Who knows what happens to the color information in camera, but clearly something does, because if you compare raw files from Nikon, Canon, and Sony you can see differences.  And then certainly if you take a jpeg from the camera you will see even more processing (and actually, for terrestrial photography, the jpegs produced in camera are pretty darn good most of the time!).  Just anecdotally, whenever I take a photo of the Moon with my Nikon, the colors look "true" in the sense that they are accurate for an observer at my location, such as if you were to look through the eyepiece.  However, the same cannot be said for my ASI224, back when I used to use this on the Moon.  There was always an inaccurate yellow or greenish cast to the image.  R6 would balance this out just fine, but as I've stated before, the color balance produced by R6 is not really "correct" either.  And on it goes...



#13 Ethan Chappel

Ethan Chappel

    Vanguard

  • *****
  • Posts: 2,485
  • Joined: 16 Jun 2014
  • Loc: San Antonio area, TX, U.S.A.

Posted 15 March 2020 - 01:51 AM

Honestly, I think the most practical solution may be to observe the planets at transit through an eyepiece and adjust an existing RGB image until it looks closest to what you see through the scope. The adjusted image could be used as a reference in the future.

 

It would be interesting to see how different everyone's results are. Maybe it's an experiment we could all try in a few months even though different monitor calibrations might be an issue.



#14 Kokatha man

Kokatha man

    James Webb Space Telescope

  • *****
  • Posts: 17,687
  • Joined: 13 Sep 2009
  • Loc: "cooker-ta man" downunda...

Posted 15 March 2020 - 03:32 AM

Quoting Ethan Chappel (#13): "Honestly, I think the most practical solution may be to observe the planets at transit through an eyepiece and adjust an existing RGB image until it looks closest to what you see through the scope. [...]"

Well, that would be relevant for you from your perspective Ethan & a reasonable stance...but I suspect that we might have all our threads locked as we argue interminably over just who saw what & whether they adjusted correspondingly: I'll accept Mark's "chocolate-sprinkled custard" comments re Jove, although if he reads this I want him to confirm that he was referring to warm custard where the chocolate melts & changes appearance slightly...& how much egg was in the custard apropos the yellow-ness of said..! :rofl: 


  • gfstallin and Lacaille like this

#15 BQ Octantis

BQ Octantis

    Cosmos

  • *****
  • Posts: 8,051
  • Joined: 29 Apr 2017
  • Loc: Northern Virginia, USA

Posted 15 March 2020 - 06:56 AM

The cool thing about your plots is that you can input them into Photoshop directly with the Curves function for each channel. For a degree 2 polynomial, you just need the zero value, the 255 value and another point…
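Three anchors pin the parabola exactly, which in Python/NumPy terms (with hypothetical anchor values) looks like:

```python
import numpy as np

# Three anchors fix a degree-2 polynomial exactly: the 0 value, the 255
# value, and one mid point (the values here are invented for illustration).
points = [(0, 18), (128, 150), (255, 255)]
x, y = zip(*points)
a, b, c = np.polyfit(x, y, deg=2)   # exact fit: 3 points, 3 coefficients

def curve(v):
    """Evaluate the per-channel correction curve at input level v."""
    return a * v * v + b * v + c
```

Those are the same three numbers you would type into a Photoshop Curves channel.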

 

BQ


Edited by BQ Octantis, 15 March 2020 - 07:19 AM.


#16 Tulloch

Tulloch

    Fly Me to the Moon

  • *****
  • topic starter
  • Posts: 6,367
  • Joined: 02 Mar 2019
  • Loc: Melbourne, Australia

Posted 15 March 2020 - 08:18 AM

Quoting BQ Octantis (#11): "The G2V will help with finding a singular white point [...] but the trouble with the non-linear data is that there is no way to subtract a linear atmospheric correction value across a histogram of brightness levels that have been scaled nonlinearly. [...]"

Thanks BQ, I'll have a look at Sharkmelley's post, thanks for the heads-up.

 

I don't think the G2V star will give me a 255,255,255 white point, but hopefully a neutral grey... Any WB point is a good starting point.

 

The non-linear modification was written in VB (my only language), but it's far too rudimentary to share.

 

Quoting Tom Glenn (#12): "I just know that the "raw" files that come from my Nikon look like they have had considerable processing done to them by the time I see them. [...] the color balance produced by R6 is not really "correct" either."

I think that even if nothing else, getting Wred and Wblue values that produce neutral grey values on the colour chart would be helpful. Of course, going through the atmosphere may change all that. I do note that Christophe Pellier recommends values of Wred = 60 to 70 and Wblue = 99 for the ASI224MC.

 

Quoting BQ Octantis (#15): "The cool thing about your plots is that you can input them into Photoshop directly with the Curves function for each channel. For a degree 2 polynomial, you just need the zero value, the 255 value and another point…"

Hmmm, I hadn't thought of that. Unfortunately, the curves as I showed them are the inverse of what I need: they show the transformation required to get the ASI224MC values from the "true" values. While it's possible to get a one-to-one correspondence for the red and green levels, the blue curve is one-to-many and impossible to convert the other way.

 

Andrew



#17 aeroman4907

aeroman4907

    Surveyor 1

  • -----
  • Posts: 1,559
  • Joined: 23 Nov 2017
  • Loc: Castle Rock, Colorado

Posted 15 March 2020 - 09:36 AM

Hi Andrew,

 

I do applaud your efforts; I also endeavored to use a color checker to assist in my color imaging of the moon.  As Tom alluded to in this thread here https://www.cloudyni...nar-terminator/ , we were discussing color imaging of the moon earlier this week.  Obviously the moon exhibits extremely low saturation in comparison to the planets, so it is not 100% identical to imaging the planets, but I think some parallels can be drawn.

 

I’ll let you (and others) read the referenced post at your leisure if so desired, but I’ll summarize to say that when I calibrated my QHY 183C camera to a color chart, I got skewed color results when imaging the moon.  Further within that same thread, Tom and I discussed using LRO data, free of the earth’s atmospheric effects on color, to come up with a closer rendition of what the moon’s colors actually are.  I could quickly approximate the colors of the moon by adjusting the Temperature and Tint sliders in the Camera Raw Filter of PS.  One could say these adjustments could be roughly applied to planetary images as well.

 

I have an imaging run of Jupiter from last year that used the same color levels as my moon imaging.  The current color level settings on my QHY 183C camera were set on a night when transparency was extremely good and the moon, about 93% illuminated, was nearly overhead.  I set the levels so that the moon appeared essentially greyscale, and I have not moved those settings since.  These same settings were used for the Jupiter imaging, which was obviously not overhead.  I have an image stacked in AS!3 with no other processing whatsoever.  For my demonstration below, I did a very quick deconvolution without much care, just to sharpen the image a little.  The image on the left shows the colors as captured, with no increase in saturation or any adjustments to color balance.  Not perfect colors, but within the ballpark of what might be expected.  The image on the right incorporates the same adjustments I made to my lunar image to make it closely match the LRO color data.  You can clearly see that image has pushed way too much to the yellow/green end of the spectrum.

 

Jupiter-Color.jpg

 

Quoting BQ Octantis (#5): "What's really gonna blow your mind is this: The light transmittance function of the atmosphere is dependent on elevation angle, time of night, water content, temperature, dust, pollutants, clouds and light pollution. [...]"

 

With all these factors BQ mentions, I think constantly adjusting values on the camera would be required.

 

Quoting Tom Glenn (#4): "Andrew, it is probably impossible to do what you are attempting to do with a color camera. [...] perhaps the old "make it look nice" method is not so bad after all?"

I am beginning to agree with Tom's statement at the end that perhaps "make it look nice" is a good way to go.  I feel my current 'calibration' method of imaging the moon overhead on a night with good transparency and getting the image essentially greyscale is a good first step.  I can then figure out the rest of the color balance issues in post.  Just my two cents...


Edited by aeroman4907, 15 March 2020 - 09:39 AM.


#18 BQ Octantis

BQ Octantis

    Cosmos

  • *****
  • Posts: 8,051
  • Joined: 29 Apr 2017
  • Loc: Northern Virginia, USA

Posted 15 March 2020 - 04:16 PM

Hmmm, I hadn't thought of that. Unfortunately the curves as I showed them are the inverse of what I need: they show the transformation required to get the ASI224MC values given the "true" values. While it's possible to get a one-to-one correspondence for the red and green levels, the blue curve is unfortunately one-to-many and impossible to invert.

Try swapping the input with the output.

 

BQ



#19 Tulloch

Tulloch

    Fly Me to the Moon

  • *****
  • topic starter
  • Posts: 6,367
  • Joined: 02 Mar 2019
  • Loc: Melbourne, Australia

Posted 15 March 2020 - 07:10 PM

Hi Andrew,

 

I do applaud your efforts and I also endeavored to use a color checker to assist in my color imaging of the moon.  As Tom alluded to in this thread here https://www.cloudyni...nar-terminator/ , we were discussing color imaging of the moon earlier this week.  Obviously the moon produces images of much lower saturation than the planets, so it is not 100% identical to imaging the planets, but I think some parallels can be drawn.

 

I’ll let you (and others) read the referenced post at your own leisure if so desired, but I’ll summarize to say that when I calibrated my QHY 183C camera to a color chart, I got skewed color results on imaging the moon.  Further within that same thread, Tom and I discussed using LRO data, free of the earth’s atmospheric effects on color, to come up with a closer rendition of what the moon’s colors actually are.  I could quickly approximate the colors of the moon by adjusting the Temperature and Tint sliders in the Camera Raw Filter of PS.  One could say these adjustments could be roughly applied to planetary images as well.

 

I have an imaging run of Jupiter from last year that used the same color levels as my lunar imaging.  The current color level settings on my QHY 183C camera were set on a night when transparency was extremely good and the moon, about 93% illuminated, was nearly overhead.  I set the levels so that the moon essentially appeared greyscale, and I have not moved those settings since.  These same settings were used for the Jupiter imaging, which was obviously not overhead.  I have an image stacked in AS!3 with no other processing whatsoever.  For my demonstration below, I did a very quick deconvolution without much care, just to sharpen the image a little.  The image on the left shows the colors as captured, with no increase in saturation or any adjustments to color balance.  Not perfect colors, but within the ballpark of what might be expected.  The image on the right incorporates the same adjustments I made to my lunar image to make it closely match the LRO color data.  You can clearly see that image has pushed way too far toward the yellow/green end of the spectrum.

 

Attached: Jupiter-Color.jpg

 

 

With all these factors BQ mentions, I think constantly adjusting values on the camera would be required.

 

I am beginning to agree with Tom's statement at the end that perhaps "make it look nice" is a good way to go.  I feel my current 'calibration' method of imaging the moon overhead on a night with good transparency and getting the image essentially greyscale is a good first step.  I can then figure out the rest of the color balance issues in post.  Just my two cents...

Hi Steve, yep I read with interest your discussion on that thread, and threatened to post my own investigations rather than hijack yours lol.gif.

 

With regards to your image of Jupiter, who's to say that's not the correct colour of Jupiter? We have become so used to producing images that "look nice" to us that anything with a yellow tinge looks wrong somehow. Who do we trust? Certainly not Hubble, since the filters used in its camera do not mimic human colour vision. What about Cassini? I have a picture book called (unsurprisingly) "The Planets - Photographs from the archives of NASA", where Jupiter and Saturn are shown in so-called "true-color" from Cassini, and both have a very yellow/brown colour, both in print and on the internet.

 

https://www.nasa.gov...a/pia04866.html

https://solarsystem....esources/17504/

 

I know BQ has a particular problem with showing the rings of Saturn as white (see his post further up). It's probably not surprising that the gas giants are coloured the way they are; after all, they are big balls of hydrogen, helium and other gases like H2S, methane and ammonia. 

 

Are these images pretty? Not particularly.

Are they accurate? Maybe.

Are they more accurate than the ones we currently assume are correct simply because they are prettier? Does anyone even care?

 

For me, I would like the white balance settings on my camera to be accurate, so at least the proportions of red, green and blue should be more or less equal on a grey target; fortunately there are six such patches on the colour chart I tested. It appears that for correct WB on the ASI224MC, the Wblue setting should be 99, and Wred somewhere between 60 and 70 (I reckon 68 looks good, but I'll test it again next time I have a chance).
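That grey-patch balancing step can be reduced to a few lines. A sketch of the idea (the helper name is mine; mapping the resulting gains onto the ASI driver's 0-99 Wred/Wblue scale is camera-specific and not shown):

```python
import numpy as np

def wb_gains_from_greys(grey_patches):
    """Given mean RGB values measured off the neutral patches
    (shape (N, 3)), return per-channel gains that equalise the
    channels, with green held at 1.0 — the same convention as
    the ASI Wred/Wblue controls, which scale red and blue
    relative to green."""
    means = np.asarray(grey_patches, dtype=float).mean(axis=0)
    return means[1] / means  # gains relative to the green channel
```

For example, a grey patch that measured (0.8, 0.4, 0.2) would yield gains of (0.5, 1.0, 2.0), i.e. the red channel needs to be halved and the blue doubled to make the patch neutral.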

 

I still don't like the idea of "choose colours that look nice" - maybe I'm just too naive and stubborn to accept that yet (although maybe I'll mellow in my old age gramps.gif).

 

 

Try swapping the input with the output.

 

BQ

 

Sure, but given a measured blue value of (say) 50, the spread of "real" blue values that correspond to a measured value of 50 is somewhere between 20 and 120 (see image). The other colours have a more or less one-to-one transformation; the blue does not bawling.gif.

 

Andrew

Attached Thumbnails

  • blue values.jpg

Edited by Tulloch, 15 March 2020 - 10:21 PM.


#20 BQ Octantis

BQ Octantis

    Cosmos

  • *****
  • Posts: 8,051
  • Joined: 29 Apr 2017
  • Loc: Northern Virginia, USA

Posted 16 March 2020 - 02:34 AM

Sure, but given a measured blue value of (say) 50, the spread of "real" blue values that correspond to a measured value of 50 is somewhere between 20 and 120 (see image). The other colours have a more or less one-to-one transformation; the blue does not bawling.gif.

Try setting the intercept so that the inverse function doesn't have two solutions on the curve.

 

(I'm trying to remember my function definition from high school calculus…"y=f(x) is a function if and only if for every x in X there is only one y…"? scratchhead2.gif ).
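One way to act on that suggestion numerically (a sketch assuming the response curve is sampled as true/measured pairs; the function names are my own) is to force the measured curve to be monotonic before interpolating the inverse:

```python
import numpy as np

def invert_response(true_vals, measured_vals):
    """Build an approximate inverse of a camera response curve.
    The measured curve is first forced to be non-decreasing with
    np.maximum.accumulate, so every measured level maps back to a
    single 'true' level and np.interp has a unique answer."""
    order = np.argsort(true_vals)
    t = np.asarray(true_vals, dtype=float)[order]
    m = np.asarray(measured_vals, dtype=float)[order]
    m_mono = np.maximum.accumulate(m)  # flatten any back-tracking
    def inverse(measured):
        return np.interp(measured, m_mono, t)
    return inverse
```

Where the original curve doubles back (the one-to-many region of the blue channel), this simply picks one plausible value from the plateau, so it sidesteps the problem rather than solving it.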



#21 BQ Octantis

BQ Octantis

    Cosmos

  • *****
  • Posts: 8,051
  • Joined: 29 Apr 2017
  • Loc: Northern Virginia, USA

Posted 16 March 2020 - 04:13 AM

Did you illuminate your targets correctly? RawTherapee has a page on how to properly illuminate a target to create a color response profile for your camera:

 

https://rawpedia.raw...he_Color_Target

 

"An example of a perfectly executed target shot. The whole ColorChecker Passport is visible, it is centered and takes about 1/3rd of the frame to avoid vignetting issues, it is perpendicular to the camera, there is no glare from the target as the target and the camera are positioned at a sufficient angle relative to the sun, there is no glare from the background, it was taken on an empty road far away from any reflective surfaces, and it is well exposed with the white patch being close to the right edge of the histogram without clipping (measured in RawTherapee using the neutral profile after picking the white balance off a gray patch)."

 

Crikey! And full sun is easy, too! Down in the range between bias and noise is not…

 

BQ



#22 aeroman4907

aeroman4907

    Surveyor 1

  • -----
  • Posts: 1,559
  • Joined: 23 Nov 2017
  • Loc: Castle Rock, Colorado

Posted 16 March 2020 - 07:29 AM

Hi Steve, yep I read with interest your discussion on that thread, and threatened to post my own investigations rather than hijack yours lol.gif.

 

With regards to your image of Jupiter, who's to say that's not the correct colour of Jupiter? We have become so used to producing images that "look nice" to us that anything with a yellow tinge looks wrong somehow. Who do we trust? Certainly not Hubble, since the filters used in its camera do not mimic human colour vision. What about Cassini? I have a picture book called (unsurprisingly) "The Planets - Photographs from the archives of NASA", where Jupiter and Saturn are shown in so-called "true-color" from Cassini, and both have a very yellow/brown colour, both in print and on the internet.

 

https://www.nasa.gov...a/pia04866.html

https://solarsystem....esources/17504/

 

I know BQ has a particular problem with showing the rings of Saturn as white (see his post further up). It's probably not surprising that the gas giants are coloured the way they are; after all, they are big balls of hydrogen, helium and other gases like H2S, methane and ammonia. 

 

Are these images pretty? Not particularly.

Are they accurate? Maybe.

Are they more accurate than the ones we currently assume are correct simply because they are prettier? Does anyone even care?

 

For me, I would like the white balance settings on my camera to be accurate, so at least the proportions of red, green and blue should be more or less equal on a grey target; fortunately there are six such patches on the colour chart I tested. It appears that for correct WB on the ASI224MC, the Wblue setting should be 99, and Wred somewhere between 60 and 70 (I reckon 68 looks good, but I'll test it again next time I have a chance).

With regard to 'true' color, I would say that Jupiter should be a little easier to determine than the moon as to what is a correct color as the saturation levels are naturally higher.  I am not a visual observer, but I believe someone quite familiar with visually observing Jupiter with an adequately sized scope should be able to determine the general colors of Jupiter so long as they are not color blind or have other visual problems.



#23 Tulloch

Tulloch

    Fly Me to the Moon

  • *****
  • topic starter
  • Posts: 6,367
  • Joined: 02 Mar 2019
  • Loc: Melbourne, Australia

Posted 16 March 2020 - 07:35 AM

Thanks BQ, as mentioned above I followed the advice from this website, which gives good guidance on how to mimic a D50 light source, with the sun at approximately 45 degrees altitude, which was around 11am here. I should have placed the target on some black foam, but didn't end up doing so. Garden hoses were of course an optional extra lol.gif. I had to put the computer inside the big black plastic box so I could see the screen to focus.

 

The fact that the Canon 700D colours came out so close to the true colours gave me a good feeling that the technique was valid, but the ASI224MC colour scheme was significantly different.

 

Andrew

Attached Thumbnails

  • IMG_6557 small.JPG

Edited by Tulloch, 16 March 2020 - 07:35 AM.

  • aeroman4907 likes this

#24 sharkmelley

sharkmelley

    Fly Me to the Moon

  • *****
  • Posts: 6,949
  • Joined: 19 Feb 2013
  • Loc: UK

Posted 22 March 2020 - 09:33 PM

When your DSLR creates an in-camera JPG or when you open a DSLR raw file in Photoshop/ACR or RawTherapee the basic sequence of operations performed is the following:

  • Subtract bias
  • Debayer
  • Apply white balance
  • Apply the camera specific Colour Correction Matrix (CCM)
  • Apply the gamma for the working colour space (typically sRGB or AdobeRGB)

When processing your ASI224MC images you are probably omitting the last 2 steps and that's why you are not matching the colour chart colours.
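Those five steps can be sketched end-to-end. This is only an illustration under stated assumptions: an RGGB Bayer pattern, data already scaled to [0, 1], and a crude superpixel debayer rather than the interpolation a real raw converter would use:

```python
import numpy as np

def develop_raw(bayer, bias, wb_gains, ccm):
    """Minimal sketch of the five steps above for an RGGB sensor.
    `bayer` is a (2H, 2W) float frame in [0, 1]; each 2x2 cell is
    collapsed to one RGB pixel (superpixel debayer)."""
    x = np.clip(bayer - bias, 0.0, None)              # 1. subtract bias
    r = x[0::2, 0::2]                                 # 2. debayer (superpixel)
    g = (x[0::2, 1::2] + x[1::2, 0::2]) / 2.0         #    average the two greens
    b = x[1::2, 1::2]
    rgb = np.stack([r, g, b], axis=-1)
    rgb = rgb * np.asarray(wb_gains, dtype=float)     # 3. white balance
    rgb = np.clip(rgb @ np.asarray(ccm).T, 0.0, 1.0)  # 4. colour correction matrix
    return np.where(rgb <= 0.0031308, rgb * 12.92,    # 5. sRGB gamma
                    1.055 * rgb ** (1 / 2.4) - 0.055)
```

The order matters: the white balance and matrix are applied while the data is still linear, and the gamma encode comes last, which is precisely what is missing from a typical ASI224MC workflow.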

 

For some background reading, to familiarise yourself with this stuff, there's a useful series of articles that starts here:  https://www.strollsw...nversion-steps/

One of the articles in that series explains how to calculate the necessary forward matrix:   https://www.strollsw...d-color-matrix/

 

For your Canon 700D the daylight white balance and matrix from camera's colour space to the sRGB colour space is here:  https://www.dxomark....---Measurements

You need to click on the "Color response" tab.

 

Unfortunately you will not find the relevant matrix for the  ASI224MC published anywhere so you will need to calculate it yourself.

 

I strongly suggest you learn how to manually "develop" a Canon 700D raw image before you move on to attempt the same thing with the ASI224MC.  Once you've learned how to do it, you can process both planetary images and images of deep sky objects using the same technique, to produce images in natural colour.

 

Also, you might be quite surprised what Jupiter looks like in natural colour:   https://www.nasa.gov...m-natural-color

It's much less contrasty than the typical amateur image of Jupiter.

 

Mark


Edited by sharkmelley, 22 March 2020 - 10:15 PM.

  • Foc, Tom Glenn and Tulloch like this

#25 Tom Glenn

Tom Glenn

    Soyuz

  • -----
  • Posts: 3,923
  • Joined: 07 Feb 2018
  • Loc: San Diego, CA

Posted 23 March 2020 - 01:58 AM

Mark, that's a good post, and some very interesting reading provided in the links.  It illustrates nicely just how much work is done to a "raw" image by programs such as ACR by the time that you see the image and can start editing.  I wonder, however, what lessons can be learned here and applied to planetary imaging, as performed by amateurs.  We take videos consisting of many thousands of frames, and use the program Autostakkert to align and stack the frames into a "raw" stack.  This stacked image is a 16-bit tif file, and has already been converted to RGB color space with gamma applied.  At no point in time do we (the users) handle any linear data during the process of converting the raw video file into a stack.  And from reading your post, it appears that most of these corrections would have to take place before the application of gamma and conversion into a color space for editing.  I wonder, then: where would these interventions take place?  Presumably the data exists as linear somewhere in the pipeline, such as in the raw video files, which are usually either .avi or .ser format?  The actual stacking procedure is a black box to most users, although Rolf's recent thread about his new program does pull back the curtain considerably.  But in practice, we simply input a video file into the software and get a tif output file to begin editing.  The strange colors we see in these cameras no doubt reflect the lack of the extra calibration steps, but I don't know where you would insert them back into the workflow easily.  
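The point that the colour matrix belongs in linear light can be illustrated with a round trip through the sRGB transfer curve. This is a sketch assuming the stacked tif really is sRGB-encoded (which, as noted, is itself uncertain); the function names are mine:

```python
import numpy as np

def srgb_to_linear(v):
    """Undo the sRGB transfer curve (input in [0, 1])."""
    v = np.asarray(v, dtype=float)
    return np.where(v <= 0.04045, v / 12.92, ((v + 0.055) / 1.055) ** 2.4)

def linear_to_srgb(v):
    """Re-apply the sRGB transfer curve."""
    v = np.asarray(v, dtype=float)
    return np.where(v <= 0.0031308, v * 12.92, 1.055 * v ** (1 / 2.4) - 0.055)

def apply_ccm_in_linear(img_srgb, ccm):
    """Apply a 3x3 colour-correction matrix in linear light
    rather than on the gamma-encoded pixels: decode, correct,
    re-encode."""
    lin = srgb_to_linear(img_srgb)
    corrected = np.clip(lin @ np.asarray(ccm).T, 0.0, 1.0)
    return linear_to_srgb(corrected)
```

In other words, the intervention point Tom is asking about does not have to be inside the stacking black box: if the stack's encoding is known, it can be decoded back to linear, corrected, and re-encoded after the fact.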


Edited by Tom Glenn, 23 March 2020 - 02:01 AM.

  • Tulloch likes this

