
DSLR Processing - The Missing Matrix


#26 sharkmelley

sharkmelley

    Fly Me to the Moon

  • *****
  • topic starter
  • Posts: 6,380
  • Joined: 19 Feb 2013
  • Loc: UK

Posted 17 March 2016 - 04:38 PM

Look, can I ask a stupid question?

 

I've noticed my JPGs can be enhanced better than my Canon CR2s (Canon 450D). I have seen this...way too much.  But where can I apply these RGB offsets?  In DSS? Photoshop?  Sorry for the ignorance!  I'm struggling with processing!

 

Brendan

 

Raw converters will do it automatically anyway, e.g. if you open the raw file in Photoshop.  There is currently no way to apply this matrix in DSS, IRIS, PixInsight, etc. without writing scripts or doing pixel arithmetic.
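For anyone who wants to try it before the tools support it natively, here is a minimal sketch of that pixel arithmetic in Python/NumPy (my own illustration, not code from any of those packages; the matrix values below are placeholders, not a real camera's data):

```python
import numpy as np

# Placeholder camera-to-sRGB colour matrix (each row sums to 1.0 so that
# white stays white).  Substitute your own camera's published matrix.
M = np.array([[ 1.9, -0.7, -0.2],
              [-0.2,  1.6, -0.4],
              [ 0.0, -0.6,  1.6]])

def apply_colour_matrix(img, matrix=M):
    """img: float array of shape (H, W, 3), linear and white-balanced.
    Each output pixel is matrix @ [R, G, B] of the input pixel."""
    h, w, _ = img.shape
    return (img.reshape(-1, 3) @ matrix.T).reshape(h, w, 3)
```

In PixInsight the same thing can be written as three PixelMath expressions, one per channel, e.g. `1.9*$T[0] - 0.7*$T[1] - 0.2*$T[2]` for the red channel.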

 

Mark



#27 polaris-14

polaris-14

    Lift Off

  • -----
  • Posts: 16
  • Joined: 15 Aug 2016

Posted 29 January 2017 - 07:04 PM

Hi Mark,

 

My apologies for reviving this old topic, but I am wondering if this color matrix is somehow related to the debayering process? I am having a hard time balancing color obtained through my Nikon D750 and I wonder if this could be the solution to my problem. My workflow is that I debayer after I pre-process my light frames. All this happens in Nebulosity 3 (I am using a Mac). I usually end up with an image that is mostly green. In the debayering panel in Nebulosity, we can fill in our own color matrix. Right now, it defaults to the identity matrix. I am wondering if this is the right way to do it.



#28 sharkmelley

sharkmelley

    Fly Me to the Moon

  • *****
  • topic starter
  • Posts: 6,380
  • Joined: 19 Feb 2013
  • Loc: UK

Posted 30 January 2017 - 05:17 AM

Your problem is simply a question of white balance. Generally speaking, everything will look very green until white balance is applied.

The colour matrix is an additional level of complication that can be safely ignored for astrophotography in general unless you want a colour calibrated workflow.

Mark

#29 polaris-14

polaris-14

    Lift Off

  • -----
  • Posts: 16
  • Joined: 15 Aug 2016

Posted 30 January 2017 - 06:00 AM

So if the color matrix is not applied during the debayering process, how is it applied then? I am sorry if it is rather obvious from the above; I am still trying to get my post-processing workflow (Photoshop) sorted out. What I have so far is extremely unrepeatable for the color.

#30 sharkmelley

sharkmelley

    Fly Me to the Moon

  • *****
  • topic starter
  • Posts: 6,380
  • Joined: 19 Feb 2013
  • Loc: UK

Posted 30 January 2017 - 02:40 PM

I'm not familiar with Nebulosity but you need to get two things right:

1) Select the right Bayer matrix for debayering i.e. GBRG or GRBG or BGGR or RGGB

2) Define the white balance multipliers appropriate to your camera, e.g. R:G:B = 1.5:1.0:1.7 (these are NOT the D750's actual values, just an illustration)
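To illustrate step 2, the multipliers are just per-channel scalings of the linear, debayered data (a sketch using the example ratios above, which again are NOT real D750 values):

```python
import numpy as np

def white_balance(img, multipliers=(1.5, 1.0, 1.7)):
    """Scale each channel of a linear RGB image (H, W, 3) by its multiplier."""
    return img * np.asarray(multipliers)
```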

In any case, why don't you start a new thread for your question rather than re-opening an old thread on an unrelated topic, because folk won't think of looking here ;)



#31 sharkmelley

sharkmelley

    Fly Me to the Moon

  • *****
  • topic starter
  • Posts: 6,380
  • Joined: 19 Feb 2013
  • Loc: UK

Posted 06 September 2017 - 12:38 AM

There is one more piece of the puzzle that I now realise needs to be put into place.  In a standard raw converter (the software that takes a raw file and turns it into a JPG or displays it on the screen) there are the following main steps:

 

1) Debayer

2) White balance

3) Set black point and white point

4) Apply camera specific colour matrix

5) Apply gamma (or sRGB tone curve) so the image displays properly
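Put together, the chain might look like the following schematic sketch in Python/NumPy (placeholder numbers throughout; step 1, the debayer, is assumed to have been done already by your stacking software):

```python
import numpy as np

WB = np.array([1.5, 1.0, 1.7])            # example multipliers, not real data
M  = np.array([[ 1.9, -0.7, -0.2],        # placeholder colour matrix
               [-0.2,  1.6, -0.4],
               [ 0.0, -0.6,  1.6]])

def srgb_curve(x):
    """Step 5: the standard sRGB tone curve (linear toe, then a 1/2.4 power)."""
    return np.where(x <= 0.0031308, 12.92 * x,
                    1.055 * np.power(x, 1 / 2.4) - 0.055)

def develop(rgb, black=0.0, white=1.0):
    """Steps 2-5 applied to an already-debayered linear image (step 1 done)."""
    img = rgb * WB                                       # 2) white balance
    img = (img - black) / (white - black)                # 3) black/white point
    img = np.clip(img, 0.0, 1.0)
    h, w, _ = img.shape
    img = (img.reshape(-1, 3) @ M.T).reshape(h, w, 3)    # 4) colour matrix
    return srgb_curve(np.clip(img, 0.0, 1.0))            # 5) tone curve
```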

 

One of the things Jon noted is that the colours look too saturated after applying the colour matrix.  I've noticed the same thing.  I now realise this is because the gamma has not been applied.  But although the gamma curve is important to make your average photo display properly, the gamma curve is not an ideal intensity transformation for astro images because they have such a high dynamic range.  This is a weakness of the Tony Hallas/Roger Clark approach to processing, because steps 1-5 are automatically applied by their raw converter and it kills star colour.

 

The effect of applying the gamma (or sRGB tone curve) is twofold:

1) It changes the RGB ratios within each pixel making them less saturated

2) It applies an intensity transformation across the whole image
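A quick worked example of point 1 (made-up pixel values): raising a pixel to a 1/2.2 gamma compresses its channel ratios, which is exactly the desaturation being described:

```python
import numpy as np

pixel = np.array([0.9, 0.3, 0.1])   # a strongly coloured pixel, ratios 9:3:1
print(pixel ** (1 / 2.2))           # ~[0.95, 0.58, 0.35], ratios only ~2.7:1.6:1
```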

 

I now think there is a way to apply the "gamma" change to the RGB ratios so the colours display correctly without too much saturation, without applying the "gamma" intensity transformation.  The intensity transformation could then be applied using a colour preserving stretch such as the Arcsinh Stretch.
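A rough sketch of that idea (my own reading, not Mark's actual ArcsinhStretch code): compute one stretch factor per pixel from a luminance estimate and apply it to all three channels, so the RGB ratios are untouched:

```python
import numpy as np

def arcsinh_stretch(img, k=100.0, eps=1e-12):
    """Colour-preserving stretch of a linear (H, W, 3) image in [0, 1]."""
    lum = img.mean(axis=-1, keepdims=True)                # crude luminance proxy
    scale = np.arcsinh(k * lum) / (np.arcsinh(k) * (lum + eps))
    return np.clip(img * scale, 0.0, 1.0)                 # same scale per channel
```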

 

Mark


  • Jon Rista, RareBird and tkottary like this

#32 bobzeq25

bobzeq25

    ISS

  • *****
  • Posts: 31,826
  • Joined: 27 Oct 2014

Posted 06 September 2017 - 09:29 AM

I find it interesting when people are trying to be so precise about this.  Because of things like the nature of our eyes (major thing) and things like the variability of our displays (minor thing), the use of color in AP is always going to be somewhat subjective.  (Note the "things like" phrase, there's much more.)  As has often been pointed out, a "true" portrayal of dim objects would always be grey.

 

Note that this discussion is littered with phrases like "too saturated".  Isn't that in the eye of the beholder?

 

If the method is intended to get color you personally prefer, fine.  Or if the goal is noise reduction.  But if you're striving for something everyone will agree is "real", I think that's an impossible goal.

 

I personally throw science overboard here, use simple standard methods like ColorCalibration in PixInsight (which can be applied in a variety of ways) to get me in the ballpark, maybe SCNR (PixInsight, the Color tool in StarTools has something similar) to reduce green, and do final tweaking to my taste.


Edited by bobzeq25, 06 September 2017 - 09:31 AM.


#33 sharkmelley

sharkmelley

    Fly Me to the Moon

  • *****
  • topic starter
  • Posts: 6,380
  • Joined: 19 Feb 2013
  • Loc: UK

Posted 07 September 2017 - 01:33 AM

I understand the point you are making and of course it is true that it is the decision of each individual how much saturation to apply to their images.

 

On the other hand there is a sense in which saturation can be measured objectively.  I can apply the same set of image development steps to an astro-image, a picture of my wife walking the dog in the woods and a ColorChecker test chart. If this set of steps results in a test chart that is over-saturated and a dog that is over-saturated then we can probably conclude that the astro-image is also over-saturated.

 

Where I'm going with this is that I am interested to see on my monitor images of common astro-objects, "correctly" colour balanced and with "neutral" saturation.  This would hopefully represent how the human eye would perceive the same view if it were bright enough to trigger our colour receptors.

 

To be honest I think the result will look fairly lifeless and not to my taste but I'm interested to find out, all the same. 

 

At present I'm also playing with PixInsight's new PhotometricColorCalibration process which plate solves an image, downloads the star colours from an online database then solves for the black level and white balance that gives the best fit.  It is a tool with huge potential and it works amazingly well.  However it is currently giving me a colour balance that is much more green than the colour balance obtained using the ColorChecker test chart.  I'm trying to resolve this apparent discrepancy.

 

Mark


Edited by sharkmelley, 07 September 2017 - 01:35 AM.

  • Jon Rista and RareBird like this

#34 bobzeq25

bobzeq25

    ISS

  • *****
  • Posts: 31,826
  • Joined: 27 Oct 2014

Posted 07 September 2017 - 08:36 AM

Have you looked at the effect of running SCNR on that too green image?  For me, it works great in that situation.

 

One difference between terrestrial and AP is that we stretch our images.   That process inevitably reduces saturation.  I suppose one could try to figure out how much and dial in the "proper" correction.

 

An interesting philosophical topic is whether astronomical subjects are inherently less or more saturated than terrestrial.  I'd think an emission nebula would be quite saturated, it's fundamentally the same thing as a Day Glo poster under UV light.

 

That's where I went with my icon, an image which has gotten comments of "too red", and "really captures the glow".

 

In the words of a noted astrophotographer.

 

https://www.youtube....h?v=rFugRFKqjFg

 

<grin>


Edited by bobzeq25, 07 September 2017 - 08:43 AM.


#35 Jon Rista

Jon Rista

    ISS

  • *****
  • Posts: 25,372
  • Joined: 10 Jan 2014
  • Loc: Colorado

Posted 07 September 2017 - 10:38 AM

The new PCC in PixInsight is trying to calibrate RGB filters based on photometry data acquired with a standard UBVRI set. The bandpasses of the former do not entirely match the bandpasses of the latter, and I think the discrepancy results in the green shift we see when calibrating color with PCC. I keep meaning to write Juan and ask him about that, and if there is any standardized PixelMath formula we might be able to run (i.e. another matrix) that would recalculate the color to account for the discrepancy in the reference photometry vs. more common LRGB filter sets. 


Edited by Jon Rista, 07 September 2017 - 10:38 AM.

  • sharkmelley likes this

#36 sharkmelley

sharkmelley

    Fly Me to the Moon

  • *****
  • topic starter
  • Posts: 6,380
  • Joined: 19 Feb 2013
  • Loc: UK

Posted 07 September 2017 - 11:05 AM

The new PCC in PixInsight is trying to calibrate RGB filters based on photometry data acquired with a standard UBVRI set. The bandpasses of the former do not entirely match the bandpasses of the latter, and I think the discrepancy results in the green shift we see when calibrating color with PCC. I keep meaning to write Juan and ask him about that, and if there is any standardized PixelMath formula we might be able to run (i.e. another matrix) that would recalculate the color to account for the discrepancy in the reference photometry vs. more common LRGB filter sets. 

I think you are probably right.  PCC is trying to fit quantities such as (Johnson B) - (Johnson V) and (Sloan r'- Johnson V) from stars in the astronomical database to the equivalent quantities seen by our own RGB camera filters.  So if the bandpasses are different then yes I suppose it could result in the green cast we are seeing.  I need to think about that in a bit more detail.

 

I did ask the question earlier, on the PI Forum:

https://pixinsight.c...php?topic=11494

 

Mark


Edited by sharkmelley, 07 September 2017 - 11:07 AM.


#37 t_image

t_image

    Gemini

  • -----
  • Posts: 3,499
  • Joined: 22 Jul 2015

Posted 07 September 2017 - 10:59 PM

I find it interesting when people are trying to be so precise about this.  Because of things like the nature of our eyes (major thing) and things like the variability of our displays (minor thing), the use of color in AP is always going to be somewhat subjective.

[SNIP]

I find it interesting that there can't be a discussion on CN about color without someone needing to share ideological notions of why the use of color in AP has to be subjective.

 

The current questioner said "I am still trying to get my post processing workflow (photoshop) sorted out. What I have so far is extremely unrepeatable for the color. "

 

"But if you're striving for something everyone will agree is "real", I think that's an impossible goal.

Bob, you're free to share your opinion,

but out of the blue projecting some notion of "real" as the final objective for us, and then constructing the notion of a "true" portrayal to fit your lack of concern for color precision other than what appeals to your taste, doesn't really seem useful here.

A "true portrayal" is a red herring. there is no 1:1 scale, no 3-D, etc. in ANY 2D image. That's a given....Not a useful point IMO.

 

IMO someone who lacks care in color precision in recording what is up there in the heavens,

is close in matter to someone who lacks care in shape of objects,

and might as well apply Photoshop shape distortions to astronomical objects just like the magazines do to the curves of models' bodies to fit "one's taste."

Why care about one and not the other? Why not add a horn to Barnard 33's horsey and make it a unicorn?

 

There are tools that exist and processes that allow more consistency in handling color than I think you understand.

The motion picture industry is moving towards a landmark standardization with ACES (Academy Color Encoding System),

and video/motion picture has utilized methods and tools (waveforms, vectorscopes, etc.), color calibration tools, color charts, color spaces, gamuts, and display gammas for a long while now.

 

Technology is moving towards High Dynamic Range displays necessarily as more and more phone displays require use in direct sunlight.

Many experts see it as more revolutionary than what higher resolution brought. As displays develop a wider range of absolute luminance,

the ability to display a larger volume of colorfulness (chroma) will come about.
Soon enough, having a "calibrated display" might be the norm. Adding a display-facing sensor and compensating for ambient light to auto-calibrate isn't such a challenge.

Images whose color is not handled well now,

and those that never develop a skill for handling color consistently,

might just be left by the wayside a few years down the road.

Like it or not, what we do now with digital photography processing is done on what are the offspring of TV sets,

where video set the paradigm of how color is handled....
And the trend towards accuracy, precision, repeatability and consistency in workflows with color and tonal range is moving at light speed over there in that realm.

.

The technology (sensors, filters to help register color, methods to encode the color/tone data, tools to process, methods to transform the data to displays, displays)

is still in its infancy.

 

But I don't imagine astrophotography will move closer towards "abstract art" circles as we move ahead....
People can spread their color-nihilistic ideologies on CN all they want,

but the future is coming.....

.....And a discipline based on precision and accuracy in measurements and passion for describing what is up there correctly is right next door to AP.

Some call it astronomy.



#38 t_image

t_image

    Gemini

  • -----
  • Posts: 3,499
  • Joined: 22 Jul 2015

Posted 07 September 2017 - 11:56 PM

There is one more piece of the puzzle that I now realise needs to be put into place.  In a standard raw converter (the software that takes a raw file and turns it into a JPG or displays it on the screen) there are the following main steps:

1) Debayer

2) White balance

3) Set black point and white point

4) Apply camera specific colour matrix

5) Apply gamma (or sRGB tone curve) so the image displays properly

[SNIP]

Mark

So what you describe here is why it would be so nice if PixInsight adopted the ACES system of working with color.

1. The camera manufacturer, or color scientists, figure out an IDT (Input Device Transform) for a specific camera (based on its specs, color filter array, etc.) to get it into the linear space ACES deals with.

2. Then one could process the data at a linear, floating point precision level in a place where nothing is lost (color, tones, resolution).

3. A favored LMT (Look Modification Transform) could be applied by the artist.

4. The ODT (Output Device Transform) would be applied based on the delivery display device.
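As a toy sketch of that chain (nothing here is a real ACES transform; the matrix and curves are placeholders just to show the shape of the pipeline):

```python
import numpy as np

IDT = np.eye(3)                      # placeholder camera-to-ACES matrix

def lmt(img):                        # placeholder "look": mild exposure lift
    return img * 1.2

def odt(img):                        # placeholder output transform: 2.4 gamma
    return np.clip(img, 0.0, 1.0) ** (1 / 2.4)

def aces_chain(camera_linear):
    h, w, _ = camera_linear.shape
    aces = (camera_linear.reshape(-1, 3) @ IDT.T).reshape(h, w, 3)  # 1) IDT
    return odt(lmt(aces))                                           # 3) LMT, 4) ODT
```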

All of this can be done currently with the freeware version of DaVinci Resolve by Blackmagic Design. It's a video tool that can easily handle RAW still files (maybe DNG).....

In fact, the open source OpenEXR file format for images would work so well with our hobby, including being a container for future-proofing the data in both colorspace and dynamic range of ACES2065 (more colors than we can see and 30 stops of dynamic range, resolution independent).

 

https://mixinglight....lor-management/


  • sharkmelley likes this

#39 bobzeq25

bobzeq25

    ISS

  • *****
  • Posts: 31,826
  • Joined: 27 Oct 2014

Posted 08 September 2017 - 12:08 AM

Disagree with #37 above.  Astrophotography is nothing at all like terrestrial.  Once you start stretching images, "real" color is out the window. 

 

That view is not my invention, I'm hardly alone.  From "Lessons from the Masters".

 

3rd chapter "Intensifying Color"

 

"All white light color images of deep space objects are metaphors that exaggerate the intensity of the subject's true hues.  So what is the correct color saturation for an astronomical image?  As much as you want so long as you avoid introducing color noise.  There is no "correct" amount of color saturation for astronomical images - it's a discretionary decision."

 

11th Chapter (all have different authors) "Deep Sky Imaging Workflow 3"

 

"When I first started color processing ten years ago I was certain that one day the "Color police" would raid my dome and confiscate the equipment.  I played around with G2v balancing but no longer do so.  These days I just use my judgement."

 

Note also that LRGB imaging necessarily distorts color (mathematics), which is why Juan Conejero, the developer of PixInsight, thinks it's inferior for color to RGB, which, he admits, takes a lot more time.

 

Light pollution filters have similar (but worse) issues with color.  Some of the color data is just gone.  Compensating for that is necessarily subjective.

 

Color in astrophotography is subjective, and will remain so for the foreseeable future.


Edited by bobzeq25, 08 September 2017 - 12:21 AM.

  • poobie likes this

#40 Traveler

Traveler

    Aurora

  • *****
  • Posts: 4,523
  • Joined: 19 Aug 2007
  • Loc: The Netherlands

Posted 08 September 2017 - 12:15 AM

I am trying to follow this thread and have a couple of practical questions.

 

- Say I have found the matrix values of my camera (Nikon D5500) and I want to do a matrix color transformation; when do I have to do that, before (with all my individual RAW lights) or after stacking?

 

- What tool can I use to do such a matrix transformation? Is it possible to do this with Adobe Photoshop CS2, Nikon ViewNX or others (I don't use PixInsight)?

 

- Can someone show the detailed steps needed to perform such a matrix color transformation?

 

Talking about color ethics is interesting but I feel it is more suitable for a separate thread imo.


  • bobzeq25 likes this

#41 sharkmelley

sharkmelley

    Fly Me to the Moon

  • *****
  • topic starter
  • Posts: 6,380
  • Joined: 19 Feb 2013
  • Loc: UK

Posted 08 September 2017 - 12:51 AM

Disagree with #37 above.  Astrophotography is nothing at all like terrestrial.  Once you start stretching images, "real" color is out the window. 

 

[SNIP]

 

Color in astrophotography is subjective, and will remain so for the foreseeable future.

No!  "Real" colour is not necessarily out of the window once you stretch.  That is the whole point of my colour preserving Arcsinh Stretch module which you can find elsewhere on Cloudy Nights and will hopefully soon be released as part of the PixInsight distribution thanks to Juan's support.  It makes it possible to perform a crazy amount of stretching without compromising colour.

 

The Roger Clark and Tony Hallas style of processing boasts that it uses a colour calibrated workflow because the raw converter does it automatically for them.  The whole point of what I'm trying to do here is to identify the steps required to make a "traditional" RGB workflow (i.e. lights, flats, bias etc.) into a colour calibrated RGB workflow.  I believe I'm now very close to this goal.  And it will do this without performing the dubious step of subtracting light pollution from non-linear data which is what typically takes place in the Clark/Hallas approach.

 

We can debate the philosophy of which colour balance and saturation is best for displaying astro-images elsewhere. 

 

Mark


  • poobie and bobzeq25 like this

#42 sharkmelley

sharkmelley

    Fly Me to the Moon

  • *****
  • topic starter
  • Posts: 6,380
  • Joined: 19 Feb 2013
  • Loc: UK

Posted 08 September 2017 - 01:05 AM

 

[SNIP] So if the bandpasses are different then yes I suppose it could result in the green cast we are seeing.  I need to think about that in a bit more detail.

 

 

I've now thought about it in detail and run some tests on PCC.  The issue is not caused by differing filter bandpasses.  If I'm supplying a properly daylight white balanced image to PCC then the population of G2V stars will be white in that image.  So looking at the graphs produced by PCC, on the B-V graph at the point on the x-axis where G2V stars appear (i.e. B-V is around 0.6) the camera B-G will be precisely zero because the star will be measured as white.  And when finally it comes to reading the G2V adjustment from the linearly regressed curve it will also be zero.
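A toy version of that fit (synthetic numbers, not PCC's internals): regress the camera's B-G colour index against catalogue B-V, then read the regression at the G2V point, B-V ≈ 0.6:

```python
import numpy as np

# Made-up star sample: catalogue Johnson B-V vs. camera B-G colour index
bv = np.array([0.0, 0.3, 0.6, 0.9, 1.2])
bg = np.array([0.45, 0.22, 0.00, -0.21, -0.44])   # daylight-balanced camera

slope, intercept = np.polyfit(bv, bg, 1)          # linear regression
print(slope * 0.6 + intercept)                    # ~0: G2V stars come out white
```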

 

I'm now convinced that PCC is working correctly but for some reason the measurement of stars in the image has too little green.  Maybe there is some operation in my processing workflow that does not properly preserve flux in stars.  Anyway I'll park this for later because it's not relevant to the current discussion.

 

Mark



#43 Jon Rista

Jon Rista

    ISS

  • *****
  • Posts: 25,372
  • Joined: 10 Jan 2014
  • Loc: Colorado

Posted 08 September 2017 - 05:27 PM

 

 

[SNIP]
 
I'm now convinced that PCC is working correctly but for some reason the measurement of stars in the image has too little green.  Maybe there is some operation in my processing workflow that does not properly preserve flux in stars.  Anyway I'll park this for later because it's not relevant to the current discussion.
 
Mark

 


I have been playing around as well. I am starting to think that the star saturation limit is actually CRITICALLY important. It defaults to 0.95, but I think that much lower values may be necessary to properly calibrate most images. This is also PARTICULARLY true with DSLR data. It seems that DSLR data is loaded unscaled into 16-bit numeric space. Since most DSLRs are 14-bit, this results in the clipping point being well below 65535 DN. So a star clipping threshold in PCC of 0.95 is not going to exclude ANY saturated or near-saturated stars from calibration, at all. You will want to measure what your clipping point is, in all three color channels, and pick the lowest value from those measurements. That may be as little as 0.2 for some cameras and channels!! And you will want to use a value even lower than that to get better results, as PCC does some kind of dynamic weighting, rather than outright exclusion of clipped or nearly saturated stars.

I am still experimenting, and have not found the right settings for my older DSLR data yet. I still get a bit of a green cast. Maybe that is just due to the nature of the Bayer array, I am not sure. I actually want to ask Juan about it, as I figure the demosaicing process should be accounting for the fact that there are twice as many green pixels.
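Spelling out the clipping arithmetic (measure your own data rather than trusting these numbers):

```python
clip_dn  = 2**14 - 1            # 16383: where 14-bit DSLR data saturates
range_dn = 2**16 - 1            # 65535: the 16-bit space the data sits in
print(clip_dn / range_dn)       # ~0.25, far below PCC's default 0.95 threshold
```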



#44 sharkmelley

sharkmelley

    Fly Me to the Moon

  • *****
  • topic starter
  • Posts: 6,380
  • Joined: 19 Feb 2013
  • Loc: UK

Posted 09 September 2017 - 02:27 AM

I am starting to think that the star saturation limit is actually CRITICALLY important. [SNIP]

 

Yes, the saturation level is critical for PCC - you need to set it at the right level. 

That's also the point of step 3 in my sequence above:

"3) Set black point and white point"

Without that step you typically see pink centres in the saturated stars.

 

Demosaicing will always account for the fact that there are twice as many green pixels.  There's no issue there. It simply "fills the gaps" by interpolation which doesn't affect the white balance.

 

Mark


Edited by sharkmelley, 09 September 2017 - 02:28 AM.


#45 chancy

chancy

    Lift Off

  • *****
  • Posts: 20
  • Joined: 10 Jun 2015
  • Loc: Singapore

Posted 11 September 2017 - 09:11 AM

Not sure if this is relevant in any way, but there are tools to help calibrate colours for DSLRs, e.g. the SpyderCHECKR. I have not used this before so just wondering if it may be helpful.

 

According to what I understand from the product video, basically a photo is taken of this colour chart and then, using the Spyder reference colours, a profile is generated to match the colours taken with the DSLR to the reference colours.

 

If we calibrate in sunlight as a reference, then apply the profile to the astrophotos, will these then be colour correct? If so, it may also help with OSC colour calibration, as OSCs (especially the CMOS ones) do not have such a matrix or colour balance built into the software on board, and I think no raw converter presently has any such built-in matrices either.

 

http://www.photorevi...spydercheckr-24



#46 KLWalsh

KLWalsh

    Surveyor 1

  • *****
  • Posts: 1,696
  • Joined: 19 Feb 2014
  • Loc: North Georgia, USA

Posted 11 September 2017 - 06:43 PM

Interesting subject, fascinating analysis, and thought-provoking discussion.
I work with CIE formulas all the time with AMLCDs and backlights for cockpits. Some of the color analysis work is straightforward, but detailed color matching can be a real challenge.

The one comment I have to all of this is that color is designed for a 'Two Degree' observer. I.e., the CIE color formulas work for defining colors and matching colors when the object being viewed is uniformly illuminated at a moderate room-level brightness over a region that subtends two degrees L-R and U-D. There are NO astronomical objects in the night sky that meet these criteria.
So while it's an interesting exercise to try to create the 'real' color of an object, ultimately the final color is a mix of subjective preference on top of quasi-real data.

Edited by KLWalsh, 11 September 2017 - 06:44 PM.


#47 sharkmelley

sharkmelley

    Fly Me to the Moon

  • *****
  • topic starter
  • Posts: 6,380
  • Joined: 19 Feb 2013
  • Loc: UK

Posted 12 September 2017 - 12:02 AM

The one comment I have to all of this is that color is designed for a 'Two Degree' observer. I.e., the CIE color formula work for defining colors and matching colors when the object being viewed is uniformly illuminated at a moderate room-level brightness over a region that subtends two degrees L-R and U-D. There are NO astronomical objects in the night sky that meet these criteria.
So while it's an interesting exercise to try to create the 'real' color of an object, ultimately the final color is a mix of subjective preference on top of quasi-real data.

I'm not quite clear what you are saying.  The sun is an astronomical object.  Does the sun meet your criteria (maybe assuming we can attenuate its brightness so it doesn't burn the retina)? Are you saying that we are unable to determine the colour of the sun?  By extension are you therefore saying that we are unable to determine the colour of stars in the night sky?  Are you implying that the efforts of professional astronomers to catalogue star colours are flawed?

 

Mark


Edited by sharkmelley, 12 September 2017 - 12:12 AM.

  • Jon Rista likes this

#48 t_image

t_image

    Gemini

  • -----
  • Posts: 3,499
  • Joined: 22 Jul 2015

Posted 12 September 2017 - 03:36 AM

[SNIP]

The one comment I have to all of this is that color is designed for a 'Two Degree' observer. I.e., the CIE color formula work for defining colors and matching colors when the object being viewed is uniformly illuminated at a moderate room-level brightness over a region that subtends two degrees L-R and U-D. There are NO astronomical objects in the night sky that meet these criteria.
So while it's an interesting exercise to try to create the 'real' color of an object, ultimately the final color is a mix of subjective preference on top of quasi-real data.

I'm curious what you do with a self-illuminated button on a cockpit console in the dark.

Surely color correction models also apply to transmitted light sources and not just a scene with only reflected light as you narrowly define in the statement above....

 

And do you mean two whole angular degrees? So if I take a photo of a terrestrial object that doesn't subtend two degrees, there is no point in color correction?

How does the "two degree" assumption that the 1931 CIE model was based on relate to the ability of a sensor to register color? That's a non sequitur.

Please research why CIE added that to the definition.

Additionally, one can change the focus of a star to increase FWHM to register more surface area, yes? (of course assuming CA and other optical issues are corrected).....

 

Additionally, to say that the 1931 CIE model makes color correction impossible with stars, and that this therefore makes things subjective, fails in logic: it's a false choice.

Even if your premise were correct, color measurement and transformations to display it can be done objectively.

Spectrometers can objectively measure wavelength emitted by a radiant source.

Funny enough, astronomers do this with stars.....That's real data, yes? The subjective preference of the operator has no bearing on the results.

Displays can be color calibrated by electronic tools, yes?

A sensor with calibrated filters is doing the same thing as a spectrometer in a limited way, yes?

 

Again, I don't understand why ideological arguments (filled with fallacies and conclusions that don't follow from true premises) always have to be offered on CN threads every time someone wants to get precise about color.

For the record, arguing the existence of some lack of degree of precision doesn't negate what Mark is attempting to do.

That's tantamount to saying nothing is worth doing because nothing is 100% perfect. Give it a rest.


  • sharkmelley and Jon Rista like this

#49 KLWalsh

KLWalsh

    Surveyor 1

  • *****
  • Posts: 1,696
  • Joined: 19 Feb 2014
  • Loc: North Georgia, USA

Posted 12 September 2017 - 08:29 PM

Wow, had no idea I was going to annoy people with my comments.
I merely stated a fact. The CIE color analysis is designed around a 2 degree observer. That's the name of the color functions. You can measure anything you like with a colorimeter or spectroradiometer and apply the CIE X,Y, and Z functions and obtain an x,y chromaticity, etc. My point is that when you do that with an object that falls outside the CIE definitions (and people do it all the time), the color you describe is not rigorously correct. I.e., another person with equivalent equipment might end up with a different color description, and neither can be technically described as more accurate than the other.

Where I work I have a NIST traceable lamp that is calibrated yearly at a NIST certified lab for luminance and spectral intensity from the UV through the VIS and into the IR range. I then use that as a cal standard for spectroradiometers and colorimeters.
How many people calibrate their telescope and camera systems in a similar manner? Only a handful, I bet. But that doesn't mean that trying to achieve 'real color' is a waste of time. It just means that in the final analysis there is some subjectivity involved.
There is no reason to get upset about this.

All of this has nothing to do with the Johnson B-V color of a star. That's a thoroughly documented, rigorous system that works for its intended purpose when its own set of rules is applied.

Edited by KLWalsh, 12 September 2017 - 08:31 PM.


#50 t_image

t_image

    Gemini

  • -----
  • Posts: 3,499
  • Joined: 22 Jul 2015

Posted 19 September 2017 - 04:19 AM

KLWalsh,

No worries, not annoyed from my end.

Thanks for the reply. I appreciate the further information you provided.

I am still curious concerning your point here:

......and apply the CIE X,Y, and Z functions and obtain an x,y chromaticity, etc. My point is that when you do that with an object that falls outside the CIE definitions (and people do it all the time), the color you describe is not rigorously correct. I.e., another person with equivalent equipment might end up with a different color description, and neither can be technically described as more accurate than the other.

The CIE model you refer to is based on correlation with human vision, yes? Two degrees was set as the 1931 standard for the initial experiments because it is the angular size of the human fovea?

Regardless, cameras and equipment can be calibrated to handle color information in ways that can be standardized through a workflow,

as ACES is moving towards.

https://en.wikipedia...Encoding_System

Even if the 2 degrees is a definition standard or a human visual limitation,

it doesn't necessarily equate to an inability for sensors/storage/displays to have a "common color description" utilizing scene-referred measurements and transforms that can be matched.....

However if one demonstrated that equipment cannot handle a consistent measurement of something smaller than 2 degrees, that would be a different story.....

 

I mean, it shouldn't be about what color people see, should it? What about that dress? Black and blue or white and gold???

{just realized I could change my perception of the famous dress color just by adjusting the backlight brightness of my laptop display...}



