color calibration

45 replies to this topic

#1 sciguy125

sciguy125

    Sputnik

  • -----
  • topic starter
  • Posts: 49
  • Joined: 29 Dec 2016

Posted 18 October 2020 - 09:43 PM

I'm using a monochrome camera with RGB filters and I've been trying to come up with a way to have proper color balance without just "guessing". I normally just expose to have all three histograms at about 60%, then tell AutoStakkert to normalize everything to 75%. That's obviously nowhere near correct, so I have to color balance manually. I went back to my terrestrial photography roots and figured out a white balancing technique. I wanted to see if anyone had any comments/suggestions about it.

 

In summary:

1. Capture RGB data from a white light source

2. Do some math to get the histogram peaks aligned

3. Apply the same correction to real planetary data

 

I pointed my camera at a softbox with a good white light, then did some captures through all the filters. Since I was pointed at a white source, the data should be a neutral white/gray (identical histograms for each channel). To make the math easier later, I kept the gain the same and varied the shutter speed. I tried to roughly align the histograms. Using all that data, I calculated the correction factors to have all the histogram peaks in the same place. I found that my manually chosen shutter speeds had the histograms off by a few percent.
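The math in step 2 is just a per-channel scale factor. A minimal sketch with made-up peak and exposure numbers (assuming the sensor is linear at fixed gain):

```python
# Hypothetical measurements from flat captures of a white source:
# per-filter exposure time (ms) and resulting histogram peak (0-255).
measured = {
    "R": {"t_ms": 100.0, "peak": 153.0},
    "G": {"t_ms": 32.0,  "peak": 163.0},
    "B": {"t_ms": 47.0,  "peak": 160.0},
}

# Effective sensitivity: counts per millisecond of exposure (assumes
# the sensor is operating in its linear range at a fixed gain).
sens = {ch: m["peak"] / m["t_ms"] for ch, m in measured.items()}

# Exposure-time correction relative to the least sensitive channel (red):
# how long each filter needs, relative to red, to reach the same peak.
factors = {ch: sens["R"] / s for ch, s in sens.items()}
print({ch: round(f, 2) for ch, f in factors.items()})
```

With these made-up numbers the factors come out near [1.0, 0.30, 0.45]; real values depend on the actual filters and sensor.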

 

For my QHY 178M with ZWO LRGB filters, I found that red was the least sensitive (it required the longest exposure to reach the same histogram). So I normalized green and blue to red. The correction factors I got were about [1.0, 0.30, 0.45]. For example, to match the histogram of a 100ms red exposure, I would need 30ms for green and 45ms for blue.

 

With red being so far off from green and blue, it didn't feel right. However, after looking at the transmission profiles for the filters and the sensitivity profile of the camera, it seems sane. I applied the corrections to some old Mars and Jupiter data and those also look pretty good. Though, I did have to correct for gain differences in the real data. And since it seems to have worked out, I assume I did so correctly.

 

Any thoughts on this? I'd be especially interested if someone else has tried it.


  • Kiwi Paul likes this

#2 Tom Glenn

Tom Glenn

    Soyuz

  • -----
  • Posts: 3,923
  • Joined: 07 Feb 2018
  • Loc: San Diego, CA

Posted 18 October 2020 - 10:36 PM

Andrew (Tulloch) has done a series of extensive investigations of this. The first and last of his posts are below, but Andrew will be able to fill you in on any additional details or better links.

 

https://www.cloudyni...olour-accuracy/

https://www.cloudyni...e-final-report/

 

The main problem is that for the most part, we never seem to be able to get away from a final step in which we have to eyeball it, or resort to "just guessing", as you phrased it. It's surprisingly difficult to even agree on what the correct color of a planet should be, let alone come up with a system to properly calibrate a camera to match it. NASA data is narrowband, and does not match up with the bandpasses of any of our filters. Also, the different atmospheric conditions during each imaging session make visual consistency impossible.

 

In general, when using monochrome sensors and RGB filters, the exposure should be done to maximize the quality of the data, and has nothing to do with color transmission.  This means you would expose each filter to the same histogram value in the raw recording, and then balance the colors manually during processing.  You would not intentionally underexpose one color channel during the capture, because this would unnecessarily reduce the quality of that data.  


  • Kiwi Paul likes this

#3 Kevin Thurman

Kevin Thurman

    Ranger 4

  • *****
  • Posts: 388
  • Joined: 12 Jul 2020

Posted 18 October 2020 - 11:37 PM

"proper" color balance, as I see it, is whatever looks the best, or the most natural. Even if the color calibration done by PI is close, I always end up making small adjustments by eye as part of final touches.



#4 Tulloch

Tulloch

    Fly Me to the Moon

  • *****
  • Posts: 6,367
  • Joined: 02 Mar 2019
  • Loc: Melbourne, Australia

Posted 19 October 2020 - 03:53 AM

While I appreciate Tom's recommendation, I don't have any experience with using monochrome cameras and filters. However, if you can lay your hands on a colour checker chart and photograph it (as I did in the first link in Tom's post), that should get you close. Alternatively, photographing a colourful stripy shirt could be useful; anything that you could use as a standard would be helpful.

 

While a lot of NASA images are created using narrowband filters (Hubble especially), some of the spacecraft we sent to the planets have full spectrum filters in them (for instance the Cassini spacecraft that visited Jupiter and Saturn, see Fig 20 of this reference). 

https://www.research...000000/download

 

Cassini produced images that have quite a distinctive colour cast to them which may not match the "traditional" colours for these planets, but might well be closer to the truth, had you been sitting on its back.

https://www.nasa.gov...a/pia04866.html

https://solarsystem....10-images-2016/

 

Andrew


Edited by Tulloch, 19 October 2020 - 03:54 AM.

  • Kiwi Paul likes this

#5 NightOwl07

NightOwl07

    Explorer 1

  • *****
  • Posts: 68
  • Joined: 12 Jul 2013
  • Loc: Calgary

Posted 19 October 2020 - 12:02 PM

I have thought about this as well recently and it looks like Tulloch's last post beat me to it in a way (way smarter people out there doing this for way longer than I have).

 

As I see it, there are two things to calibrate, the camera and the light source.

 

What the OP has done is a calibration of the camera. Picking the right calibration bulb would be important. I'm personally thinking something around 5900K, to match sunlight as closely as possible, would be ideal. However, I have no way to verify that this is truly the bulb's output, so it'll have to be taken on faith. This is made worse because my eyes are insensitive in the blue-green region; what looks white to me would look blue to others.

 

Step 1 was to take exposures through all 3 filters under the same settings, then compare histogram ranges to come up with their relative ratios. This would represent bias from the imaging train itself and should be constant.

Step 2 would be similar to Andrew's last post, using the planets' respective albedos at each wavelength. Instead of using existing data, I was thinking of taking a calibration set where the exposure and gain settings are identical between the colour channels. Once that is done, the imaging-train ratio from Step 1 can be applied to these "object" calibration frames to correct for the imaging train. The relative histogram ratios that remain can then be calculated and used as a final correction factor for histogram adjustment of the actual dataset, accounting for each planet's natural colour dominance.
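Numerically, the two steps chain like this (all ratios below are made-up placeholders, not measurements):

```python
import numpy as np

# Step 1: imaging-train bias measured from a white source,
# expressed as per-channel histogram ratios relative to green (R, G, B).
train_ratio = np.array([0.65, 1.00, 0.80])

# Step 2: same-settings captures of the planet itself (R, G, B as recorded).
object_ratio = np.array([0.90, 1.00, 0.55])

# Dividing out the train bias leaves the planet's intrinsic colour
# dominance, which becomes the final per-channel correction factor.
intrinsic = object_ratio / train_ratio
print(np.round(intrinsic, 2))
```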

 

This will likely vary slightly from night to night. A different position of the planet or different atmospheric conditions may scatter different wavelengths to different degrees. It would be most representative of the colour balance as seen from that specific location at that time, not necessarily true to the colour of the planets. I'm not sure how much variation this will impart, but if enough calibration data can be gathered across different conditions, one should be able to get pretty close to the actual signal.

 

I'm just thinking out loud here, but it's something I intend to try the next time I get some good weather. If anyone has any thoughts on this process, it'd be very much appreciated.


Edited by NightOwl07, 19 October 2020 - 12:06 PM.


#6 Tulloch

Tulloch

    Fly Me to the Moon

  • *****
  • Posts: 6,367
  • Joined: 02 Mar 2019
  • Loc: Melbourne, Australia

Posted 19 October 2020 - 05:47 PM

Christophe Pellier has been measuring the spectra of the planets for some time, his website could also be helpful to you.

https://www.planetar...imaging.com/en/

 

I also measured the red/green and blue/green ratios of Saturn at a range of elevation angles (30 - 73 degrees) which shows the effects of the atmosphere when imaging the planets.

https://www.cloudyni...rees-elevation/

 

Andrew

 



#7 sciguy125

sciguy125

    Sputnik

  • -----
  • topic starter
  • Posts: 49
  • Joined: 29 Dec 2016

Posted 19 October 2020 - 08:39 PM

Thanks for all the comments.

My primary goal is to get the image to look the same as what someone would see through the viewfinder (or in this case, the eyepiece). After I get some sort of standardized output, I think it'll be a good stepping stone to anything more advanced should I choose to go there. I assume it'll be easier to make other corrections if the inputs to those steps have the same starting point.

The light I used as my white balance target is a 5500K CFL. It came with my softbox that's meant as photography studio lighting, so I assume it's an appropriate source. I ordered a color calibration target, but I'm probably going to have to use actual sunlight for that because of the working distances I'll need. I've seen recommendations from photography sites saying that a "daylight" colored sun is when it's at about 45deg, so I'll probably go with that.

#8 Kiwi Paul

Kiwi Paul

    Vanguard

  • -----
  • Posts: 2,314
  • Joined: 13 Jul 2020
  • Loc: Carterton, New Zealand

Posted 19 October 2020 - 11:10 PM

I have followed this discussion with interest. Andrew and perhaps a few others may recall I have a problem with the distorted colours of the planets due to the achromatic doublet of my 8 inch refractor. I have been thinking about solutions for this problem and feel that there must be a way to map correct colours onto the distorted colours from the telescope. I was playing around with GIMP and followed instructions from a YouTube video about transferring colour shades from one picture to another and successfully did so. So my idea is to take a well balanced spectrum from ‘white’ sunlight and use it as a reference to correct an image from the telescope if I also use the telescope to image the same white source. Can you see anything wrong with this idea? It seems to me it is allied to the discussion above?? I have tried to secure such an image with the telescope but had my target way too close and couldn’t get enough of it in the field of view! Am still trying!
Cheers Paul

#9 Kiwi Paul

Kiwi Paul

    Vanguard

  • -----
  • Posts: 2,314
  • Joined: 13 Jul 2020
  • Loc: Carterton, New Zealand

Posted 19 October 2020 - 11:13 PM

...instead of using the phrase ‘colour shades’ a better term is colour palette.
Paul

#10 Tulloch

Tulloch

    Fly Me to the Moon

  • *****
  • Posts: 6,367
  • Joined: 02 Mar 2019
  • Loc: Melbourne, Australia

Posted 20 October 2020 - 12:28 AM

Thanks for all the comments.

The light I used as my white balance target is a 5500K CFL. It came with my softbox that's meant as photography studio lighting, so I assume it's an appropriate source. I ordered a color calibration target, but I'm probably going to have to use actual sunlight for that because of the working distances I'll need. I've seen recommendations from photography sites saying that a "daylight" colored sun is when it's at about 45deg, so I'll probably go with that.

Yep, that's exactly how I did it. You can see my setup here

https://www.cloudyni...acy/?p=10046396

 

Andrew



#11 sciguy125

sciguy125

    Sputnik

  • -----
  • topic starter
  • Posts: 49
  • Joined: 29 Dec 2016

Posted 20 October 2020 - 09:27 AM

Andrew and perhaps a few others may recall I have a problem with the distorted colours of the planets due to the achromatic doublet of my 8 inch refractor. I have been thinking about solutions for this problem and feel that there must be a way to map correct colours onto the distorted colours from the telescope.


I'm not sure I fully understand the issue, but if it really is a color palette shift, then it makes sense that you could correct a white target then apply the same correction to other images.
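A white-target correction like that could be as simple as per-channel gains; a sketch with made-up numbers:

```python
import numpy as np

# Hypothetical mean R, G, B of a white target as recorded by the telescope.
white_measured = np.array([0.72, 0.55, 0.40])

# Per-channel gains that make the white target come out neutral grey.
gains = white_measured.max() / white_measured

# Apply the same gains to any other image from the same optical train.
# Here, a grey patch at half the white target's brightness:
img = np.array([[[0.36, 0.275, 0.20]]])
corrected = img * gains   # all three channels end up equal (neutral)
```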

However, it sort of sounds like chromatic aberration. I've never looked into the mechanics of how to correct it, but I've seen it as a feature in image processing software.

#12 Kiwi Paul

Kiwi Paul

    Vanguard

  • -----
  • Posts: 2,314
  • Joined: 13 Jul 2020
  • Loc: Carterton, New Zealand

Posted 20 October 2020 - 01:52 PM

Yes, the problem is chromatic aberration. So I thought that if the lens distorts colours in certain ways then one ought to be able to remap the correct colours to the distorted ones since I assume they will reoccur each time in the same way for the same colours??
Cheers Paul
  • Jkaiser3000 likes this

#13 sciguy125

sciguy125

    Sputnik

  • -----
  • topic starter
  • Posts: 49
  • Joined: 29 Dec 2016

Posted 20 October 2020 - 07:21 PM

Yes, the problem is chromatic aberration. So I thought that if the lens distorts colours in certain ways then one ought to be able to remap the correct colours to the distorted ones since I assume they will reoccur each time in the same way for the same colours??
Cheers Paul

My only experience with chromatic aberration is in terrestrial photography. Even then, my lenses aren't that bad, so I just ignore it. It mostly presents itself as color fringing on edges and general "blurriness". Since astrophotography targets are effectively point sources, I don't really have a feel for how it manifests.

 

You might want to look into Lensfun. It's an open source library/database for correcting lens artifacts. One of the things it can correct is chromatic aberration. There should be guides for profiling lenses. That might give you some ideas.



#14 Kiwi Paul

Kiwi Paul

    Vanguard

  • -----
  • Posts: 2,314
  • Joined: 13 Jul 2020
  • Loc: Carterton, New Zealand

Posted 21 October 2020 - 12:44 AM

My only experience with chromatic aberration is in terrestrial photography. Even then, my lenses aren't that bad, so I just ignore it. It mostly presents itself as color fringing on edges and general "blurriness". Since astrophotography targets are effectively point sources, I don't really have a feel for how it manifests.

 

You might want to look into Lensfun. It's an open source library/database for correcting lens artifacts. One of the things it can correct is chromatic aberration. There should be guides for profiling lenses. That might give you some ideas.

Many thanks for the reference.

Paul



#15 sciguy125

sciguy125

    Sputnik

  • -----
  • topic starter
  • Posts: 49
  • Joined: 29 Dec 2016

Posted 23 October 2020 - 08:19 PM

Inspired by Andrew's work as well as some articles I found, I generated an ICC profile for my camera/filters.

 

https://ninedegreesb...ra-profile.html
http://nic.ucsf.edu/...ration-results/
http://www.argyllcms.../Scenarios.html

 

I'm happy with the results, but I'm not sure if it's worth the effort. I haven't figured out how to easily squeeze it into my current workflow. I need to think about it a little more. Maintaining proper color management after applying the correction profile is really tricky. But it's really easy to do a white balance right at the end of the workflow (before any non-linear adjustments, that is). So I might just try to extract white balance info from my profile and work with that.

 

1. Stack individual channels
2. Apply exposure compensation to each channel so that it matches the color profile (or just capture the original data with correct exposures)
3. Combine into single RGB image
4. Apply color profile
5. Sharpen
6. Lower white point to the edge of the histogram
7. Find the edge of the target's limb and use the area just outside of it to set the black point (I used WinJUPOS to determine where the edge should be)
8. Enjoy your new planetary image that didn't have any subjective color adjustments during processing
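Steps 6 and 7 above amount to a linear levels stretch. A toy sketch (synthetic disc and made-up values, not real data):

```python
import numpy as np

# Toy linear RGB image: a bright "disc" on a faintly non-zero background,
# standing in for a stacked planetary image.
img = np.full((64, 64, 3), 0.02)                   # sky background
yy, xx = np.mgrid[:64, :64]
disc = (yy - 32) ** 2 + (xx - 32) ** 2 < 20 ** 2
img[disc] = 0.6

# Step 6: lower the white point to the bright edge of the histogram.
white = img.max()

# Step 7: set the black point from the sky just outside the limb.
# (Here, any off-disc pixel; with real data you would sample a ring just
# beyond the limb position reported by WinJUPOS.)
black = img[~disc].mean()

stretched = np.clip((img - black) / (white - black), 0.0, 1.0)
```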

 

I still need to figure out where derotation fits into that flow. I usually sharpen the individual channels before combining them, then derotate after combining.

Attached Thumbnails

  • mosaic.jpg

Edited by sciguy125, 23 October 2020 - 08:20 PM.

  • Tulloch likes this

#16 Tulloch

Tulloch

    Fly Me to the Moon

  • *****
  • Posts: 6,367
  • Joined: 02 Mar 2019
  • Loc: Melbourne, Australia

Posted 23 October 2020 - 10:16 PM

Hi there, this looks really interesting.

 

I had a look at your references, but couldn't work out a couple of things. How exactly did you create your ICC profile? What are you using as your 18% grey card for white balance?

 

Andrew



#17 sciguy125

sciguy125

    Sputnik

  • -----
  • topic starter
  • Posts: 49
  • Joined: 29 Dec 2016

Posted 24 October 2020 - 12:30 AM

I got an XRite ColorChecker Passport to use as my color target. I brought it outside to use the sun for illumination and imaged it in the usual way: ~60% histogram for each channel, captured 2s for each channel, stacked 10 frames in AutoStakkert, then put them together into a single RGB file. Of course, I had to use my C6 with a focal reducer to get a reasonable working distance. Using a different scope and a focal reducer instead of a barlow shouldn't matter for this though. Stacking shouldn't be needed either. I just wanted to generate the image using the same workflow.

 

After I had the tif, I treated it the same as one that would come out of a RAW developer and fed it into Argyll to generate the ICC profile.

 

One big issue is that each channel had a different exposure. A color camera captures all three channels at the same time, so you have identical exposures for all of them. However, changing the exposures of a given channel changes the color balance. So the generated profile is only valid for the same exposure ratios. My color target was captured at about 1ms for red, 0.5 for green, and 0.6 for blue. So my profile is only valid for 1:0.5:0.6 exposures. I'm not going to change my protocol from the usual 60% histogram for each channel, so I'll have to correct them during processing before applying the profile.
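The correction before applying the profile is just a per-channel rescale. A sketch using the 1:0.5:0.6 profile ratios above (the "used" exposure times and pixel values are made up):

```python
import numpy as np

# Hypothetical linear channel stacks, each exposed to ~60% histogram.
r = np.full((8, 8), 0.60)
g = np.full((8, 8), 0.60)
b = np.full((8, 8), 0.60)

# Exposure times actually used for the capture (ms) ...
t_used = {"R": 1.5, "G": 1.5, "B": 1.5}
# ... and the ratios the ICC profile was built for (1 : 0.5 : 0.6).
t_profile = {"R": 1.0, "G": 0.5, "B": 0.6}

# Rescale each channel so the data looks as if it had been captured at the
# profile's exposure ratios; the profile is then valid for it again.
def match_profile(chan, ch):
    return chan * (t_profile[ch] / t_profile["R"]) * (t_used["R"] / t_used[ch])

r2, g2, b2 = match_profile(r, "R"), match_profile(g, "G"), match_profile(b, "B")
```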



#18 Tulloch

Tulloch

    Fly Me to the Moon

  • *****
  • Posts: 6,367
  • Joined: 02 Mar 2019
  • Loc: Melbourne, Australia

Posted 24 October 2020 - 01:33 AM

Thanks for that, I'll have to do some more reading on how to use Argyll with my photos - I assume I just feed it the images I took of the colour checker chart using the "scanin" and "colprof" utilities? It's been a while since I've used a command line interface, have to brush up on my old skills :)

 

Andrew



#19 sciguy125

sciguy125

    Sputnik

  • -----
  • topic starter
  • Posts: 49
  • Joined: 29 Dec 2016

Posted 24 October 2020 - 01:02 PM

I assume I just feed it the images I took of the colour checker chart using the "scanin" and "colprof" utilities?

That's correct. scanin reads the chart and generates a file with the measured color of all the patches. The resulting file gets fed into colprof to generate the actual profile.


  • Tulloch likes this

#20 Tulloch

Tulloch

    Fly Me to the Moon

  • *****
  • Posts: 6,367
  • Joined: 02 Mar 2019
  • Loc: Melbourne, Australia

Posted 28 October 2020 - 01:39 AM

OK, so I found a Windows-based program called CoCa that uses Argyll and walks you through the process (which is rather nice).

http://www.dohm.com.au/coca/

 

I was able to use it on one of my colour chart images (with max histogram 92 for the white patch), and generated an icm file. I then took this into GIMP and applied the colour profile which worked pretty well. 

 

The first image below shows the uncorrected stacked image on the left, while the same image with the colour profile applied is on the right. The RHS image also shows the expected colours of the patches if the icm profiling were "perfect". It's not a bad comparison, not as good as my best one, but at least it's easy to apply. The r,g,b values of the transformed colours vs the expected ones are plotted in the graph on the right, showing good correlation for values above 50 or so.

 

So far, so good, but the kicker comes when trying to apply this profile to my images of the planets. As shown by the second image, the brightness of Jupiter on this occasion is blown out, probably since the EV of the two images (planet and colour checker chart) is not identical and I wasn't able to set the "white balance" exposure for the planet with an 18% grey card in the same way that the colour checker chart is. Even using the AS!3 tool "Normalise" to reduce the maximum exposure to 20% of maximum was not enough to correct this imbalance (see third image below).

 

Now I could just play with the levels command in Photoshop/GIMP to make the images look "normal", but that kinda defeats the whole purpose of the exercise.

 

Any tips on how to proceed here? SciGuy? Tom? SharkMelley?

 

Andrew 

Attached Thumbnails

  • 2020-03-26-0332_5-L-Test_surface_AS_P50_l6_ap242 ps_convert rotated comparison with profile and graph.png
  • 2020-09-28-1047_4-L-Jup_normalise-off_AS_F5000_l6_ap69_Driz30 convertcolourprofile ps1sm100.png
  • 2020-09-28-1047_4-L-Jup_normalise-20_AS_F5000_l6_ap69_Driz30 convertcolourprofile ps1sm100.png

Edited by Tulloch, 28 October 2020 - 01:41 AM.

  • Kiwi Paul likes this

#21 Tom Glenn

Tom Glenn

    Soyuz

  • -----
  • Posts: 3,923
  • Joined: 07 Feb 2018
  • Loc: San Diego, CA

Posted 28 October 2020 - 03:43 AM

Andrew, at some point I'll have to go back, reread all your other posts, and then start attempting this myself, because right now I don't have enough first hand experience to troubleshoot this. Other than to say, from all of our previous conversations, this problem always ends up being more complicated than it initially appeared! In your Jupiter image, when you say it was "blown out", do you just mean that overall it looks bright? Because it doesn't appear that anything is clipped. It looks to me very similar to how planetary images appear when I've opened raw images in either RawTherapee or Photoshop while using a custom setting that sets gamma=1, which means no gamma correction, but displays the image "correctly" by not allowing the monitor display gamma to inappropriately darken it. So in this way, I think your image looks "correct".

I talked about this in my post months ago about gamma, but when planetary images are displayed, they have not been gamma transformed, yet the monitor assumes they have, so they get darkened. However, in almost all cases, this result looks good, because the raw exposure does look quite washed out. But who's to say what's correct? What strikes me about your image is that Jupiter has low contrast, which I actually think is correct! Many amateur planetary images display abnormal contrast, far beyond what's really there. This is true even in versions that look quite nice (and not overprocessed per se), and I think that to some degree, people have become accustomed to seeing images like that, and are not aware of the slightly more bland reality.

But you can always change the gamma in processing to achieve the look you want. I was under the impression that you were interested in accurate colors, and so if you are happy with the color balance, then altering the final display gamma (or a slight tone curve to add some contrast) shouldn't deviate too much from that, would it?


Edited by Tom Glenn, 28 October 2020 - 03:45 AM.

  • sharkmelley, Tulloch and Kiwi Paul like this

#22 Tulloch

Tulloch

    Fly Me to the Moon

  • *****
  • Posts: 6,367
  • Joined: 02 Mar 2019
  • Loc: Melbourne, Australia

Posted 28 October 2020 - 06:38 AM

Thanks Tom, you're probably right, I'm still very much a novice when it comes to applying colour profiles and the whole linear vs gamma contrast thing. Especially when the sRGB gamma profile is automatically applied to a linear image for the computer display without changing the rgb values of the original, I don't know if the gamma is applied or not, or what is going on.

 

I tried using RawTherapee, but it's even more of a mystery to me than GIMP. I don't know if I have applied the correction properly, or if I need to somehow take a calibration image of a grey card, or what that even looks like. Even so, I took the image above, applied a gamma correction of 0.45 to undo the gamma 2.2 from the display, then sharpened and stretched it a bit to give the images below. The resulting colour cast is certainly ... different.

 

Andrew

 

Attached Thumbnails

  • 2020-09-28-1047_4-L-Jup_normalise-off_AS_F5000_l6_ap69_Driz30 convertcolourprofile ps1sm100 gamma0p45 sharpened comparison.png

  • Kiwi Paul likes this

#23 Tom Glenn

Tom Glenn

    Soyuz

  • -----
  • Posts: 3,923
  • Joined: 07 Feb 2018
  • Loc: San Diego, CA

Posted 28 October 2020 - 05:50 PM

Thanks Tom, you're probably right, I'm still very much a novice when it comes to applying colour profiles and the whole linear vs gamma contrast thing. Especially when the sRGB gamma profile is automatically applied to a linear image for the computer display without changing the rgb values of the original, I don't know if the gamma is applied or not, or what is going on.

 

I tried using RawTherapee, but it's even more of a mystery to me than GIMP. I don't know if I have applied the correction properly, or if I need to somehow take a calibration image of a grey card, or what that even looks like. Even so, I took the image above, applied a gamma correction of 0.45 to undo the gamma 2.2 from the display, then sharpened and stretched it a bit to give the images below. The resulting colour cast is certainly ... different.

 

Andrew

It's something of a mystery to me too, Andrew. Much of the gamma handling is display dependent, and also software dependent, and sometimes applying the same gamma alterations in different editing software produces a different looking result. And all gamma is somewhat artificial anyway on modern LCD screens. Back in the day of CRT displays, the electron gun itself had energy losses that followed an exponential curve, causing the output images to be much darker (and nonlinear) compared to the input image. So it was discovered that if you altered the input image by applying a non-linear curve in the opposite direction to the CRT's natural curve, the two curves offset each other and the output image looked correct when displayed.

Today, LCD screens don't need this, but gamma is still used for a variety of reasons (including compatibility with older data, but also others: more visible shades of gray can be fit into lower bit depth monitors this way). But different formats use different gamma settings: broadcast television, for example, uses different settings from cinematic film, because each makes assumptions about the final display and the environment in which the product will be viewed. Ultimately, however, there is always an editing step in which somebody has to decide if the output "looks right" on the final display medium. So, despite your best intentions to find something formulaic, I think we will always run into some confusing issues here that will limit the absolute truth of any outcome.


Edited by Tom Glenn, 28 October 2020 - 05:51 PM.

  • Tulloch likes this

#24 Kiwi Paul

Kiwi Paul

    Vanguard

  • -----
  • Posts: 2,314
  • Joined: 13 Jul 2020
  • Loc: Carterton, New Zealand

Posted 28 October 2020 - 06:00 PM

Andrew, I have been mucking around much like you. I managed to image a reference palette of spectrum colours, then took an image of the same target with my 80mm achromatic refractor. In GIMP I created a colour palette of the reference spectrum and mapped this palette onto the refractor image (which had shifted colours), and it actually brought the shifted colours back towards the ideal palette. The idea is to work towards a solution for my 8 inch achromat. So now I know how to map a colour scheme onto another image.

So I thought, why not create a 'correct' colour palette from a Jupiter image taken with my C8 (correct colours) and map this onto a Jupiter image from the 8 inch? Well, I was able to get a result that shifted much of the purple colours of the belts to weaker, more neutral colours. However it is not right, just getting closer to the 'truth'. There didn't seem to be anywhere near the colour range I would expect. I need to do more controlled experiments.

One thing that really surprised me in the true Jupiter colour palette was how washed out the GRS colours were. Whereas in the image to the eye it has a definite red/orange tone, in the palette it looked more like a light brown (from memory). Anyway, I am away from home for a week and so will have to continue this when I get home. I switched to the 80mm to experiment on because I couldn't locate my colour target far enough away from the big scope! Also I needed to use a much smaller colour target.
Cheers Paul
  • Tulloch likes this

#25 sciguy125

sciguy125

    Sputnik

  • -----
  • topic starter
  • Posts: 49
  • Joined: 29 Dec 2016

Posted 28 October 2020 - 09:08 PM

First, Tom's correct about Jupiter being low contrast. If you go through the JunoCam gallery, there are a couple of people processing the images to be scientifically accurate. I came across some of their discussions on another forum while doing other research. I didn't understand everything they were talking about, but they seem to know what they're doing. I got the impression that they had color calibration info for the camera. Here's one example of a true color image of Jupiter:

 

https://www.missionj...cessing?id=8894

 

Andrew's samples are about the same as what I get when I color correct my own Jupiter data. Though, I've really only been playing with Mars so far.

 

What I've found is that the trickiest thing in all of this is gamma correction. Image processing software assumes that a file is gamma-corrected sRGB unless told otherwise (either by the user or by the file's metadata). Since the raw data we're starting with is linear, feeding it directly into any image processing/rendering software causes it to appear dark. As Tom has pointed out, this tends to actually look better for planets. It's important that the software be correctly informed about whether the file has been gamma corrected or not, because it affects the math. There's also a distinction between "setting as linear" and "converting to linear". In one case, the data isn't changed but the software interprets it as linear; in the other, the software modifies the data before changing its interpretation.
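For reference, this is the standard sRGB transfer function that software assumes has already been applied when it displays an image; a sketch in numpy:

```python
import numpy as np

# Linear -> sRGB encode: the piecewise curve software assumes has already
# been applied. Raw stacked data is linear, so displaying it without this
# encode makes it look dark.
def linear_to_srgb(x):
    x = np.clip(x, 0.0, 1.0)
    return np.where(x <= 0.0031308, 12.92 * x, 1.055 * x ** (1 / 2.4) - 0.055)

# The inverse: sRGB -> linear decode.
def srgb_to_linear(x):
    x = np.clip(x, 0.0, 1.0)
    return np.where(x <= 0.04045, x / 12.92, ((x + 0.055) / 1.055) ** 2.4)

# A mid-grey linear value is pushed well up the tonal scale by the encode,
# which is why un-encoded linear data looks dark on screen.
mid = linear_to_srgb(np.array(0.18))
```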

 

I was running my initial experiments with GIMP, but I moved to command line work with ImageMagick. I know exactly what ImageMagick is doing because I have to tell it what to do step by step. I explicitly tell it that the input data is linear RGB, apply the profile, then convert it to gamma corrected sRGB (without an embedded profile). That gives me a color corrected file that I can feed into other software without problems. My Mars samples above were done that way.

 

Since my last post, I've learned an interesting trick: the final histogram stretch (which boosts contrast) should be done in HSV space. That allows you to stretch the brightness without affecting the hues. I found that doing the stretch in RGB caused significant color shifts when I significantly raised the black point. I've managed to wrap my head around why a stretch in HSV works, but I don't completely understand why it doesn't work in RGB.

 

To do an HSV stretch in GIMP, Colors->Components->Decompose and choose HSV as the color model. Turn off the hue and saturation layers so you can see the value layer. The correct layer is obvious. The value layer looks like a nice grayscale version of the RGB image while the other two look like garbage. Stretch the value layer using either levels or curves. To go back to RGB, Colors->Components->Compose and set the color model to HSV.
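The same HSV-value stretch can be done directly on arrays; a sketch with a toy image and made-up stretch points:

```python
import numpy as np

# Stretch only the V (value) channel and rescale R,G,B proportionally,
# which raises contrast without shifting hue or saturation.
img = np.array([[[0.30, 0.20, 0.10],
                 [0.80, 0.60, 0.40]]])          # linear RGB, shape (1, 2, 3)

v = img.max(axis=-1, keepdims=True)             # HSV value = max(R, G, B)

black, white = 0.10, 0.90                       # levels-style stretch on V
v_new = np.clip((v - black) / (white - black), 0.0, 1.0)

# Scaling each pixel by v_new/v keeps the R:G:B ratios (hence hue) intact.
stretched = img * np.where(v > 0, v_new / np.maximum(v, 1e-12), 0.0)
```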



