CNers have asked about a donation box for Cloudy Nights over the years, so here you go. Donation is not required by any means, so please enjoy your stay.


DSLR Processing - The Missing Matrix

86 replies to this topic

#76 Alen K

Alen K

    Vanguard

  • *****
  • Posts: 2,013
  • Joined: 25 Nov 2009

Posted 10 April 2021 - 01:49 PM

My thinking on this has evolved quite considerably since this thread started. 

 

With linear data everything is actually very easy if you are working with a linear profile.  With a linear profile the display device is driven so that the brightness of a pixel on the screen is directly proportional to the data value, and the image therefore appears correct to the eye.  Photoshop and Affinity Photo both use linear profiles in 32-bit mode, e.g. sRGB Linear or AdobeRGB Linear.  PixInsight can also be set to use a linear profile in its ColorManagement options (as long as you have an appropriate linear ICC profile file available). 

 

With a linear profile, as soon as you have white balanced the data and applied the colour correction matrix appropriate for the camera and the working colour space then the colours you see on your display device are the correct colour-managed colours.  You can carry on working with the data in the linear domain and the colours still display correctly as long as non-linear stretching is avoided. Colour preserving stretches such as ArcsinhStretch can be used to change the brightness of the data without affecting hue or saturation.
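To illustrate the ratio-preserving idea, here is a minimal numpy sketch of an arcsinh-style stretch. This is a generic illustration, not PixInsight's actual ArcsinhStretch implementation; the simple-mean luminance is a simplifying assumption:

```python
import numpy as np

def arcsinh_stretch(rgb, stretch=100.0):
    """Brighten linear RGB while preserving R:G:B ratios (hue and saturation).

    The same scale factor, derived from per-pixel luminance, multiplies all
    three channels, so the channel ratios are left untouched.
    """
    rgb = np.asarray(rgb, dtype=np.float64)
    lum = rgb.mean(axis=-1, keepdims=True)   # simple mean as stand-in luminance
    eps = 1e-12                              # avoid division by zero in blackness
    scale = np.arcsinh(stretch * lum) / (np.arcsinh(stretch) * (lum + eps))
    return rgb * scale

pixel = np.array([[0.01, 0.02, 0.04]])       # a faint pixel with ratios 1:2:4
out = arcsinh_stretch(pixel)
print(out / out[0, 0])                       # ratios survive the stretch: [1. 2. 4.]
```

Because the divisor is normalised by arcsinh(stretch), a pixel already at full brightness (luminance 1.0) maps back to 1.0, so only the faint data is lifted.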

 

The transformation to a standard non-linear ICC Profile (such as sRGB or AdobeRGB) can be done as a final step in the processing and will apply the relevant colour space gamma without changing the appearance of the image in any way.  After this transformation the data are non-linear, of course.
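That final gamma step is just the standard sRGB transfer function (IEC 61966-2-1). A minimal sketch, as a generic illustration rather than any particular package's code:

```python
def srgb_encode(v):
    """Apply the standard sRGB transfer function to a linear value in [0, 1]."""
    if v <= 0.0031308:
        return 12.92 * v                      # linear toe for very dark values
    return 1.055 * v ** (1 / 2.4) - 0.055     # power-law segment, overall gamma ~2.2

# The curve lifts mid-tones strongly: linear 18% grey encodes to roughly 0.46.
encoded = srgb_encode(0.18)
```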

 

My current processing workflow in PixInsight uses an AdobeRGB Linear profile but, as I said earlier, the 32-bit modes in both Photoshop and Affinity Photo can be used instead. Sometime I'll get around to providing detailed instructions for those interested.

So, it is still important for the processing software to apply the colour correction matrix appropriate for the camera (DSLR or mirrorless). What astro-processing software today still does not do this? Does PI do it? Does APP do it? And out of interest, did older apps like ImagesPlus do it? (Since it is now free, I have wondered.)

 

BTW, count me as another individual interested in further comments and instructions you may have about Affinity Photo, including any comments or analysis of James Ritson's macro for implementing a colour-preserving stretch, which he describes as "similar" to arcsinh stretch. And maybe if it isn't close enough you might be persuaded to figure out how to do it. :D


  • galacticinsomnia likes this

#77 galacticinsomnia

galacticinsomnia

    Gemini

  • *****
  • Posts: 3,286
  • Joined: 14 Aug 2020
  • Loc: Pacific Northwest - Oregon

Posted 10 April 2021 - 09:15 PM

I have the software and have dabbled.  Will see what I can do...

Clear Skies !

 



#78 sharkmelley

sharkmelley

    Fly Me to the Moon

  • *****
  • topic starter
  • Posts: 7,435
  • Joined: 19 Feb 2013
  • Loc: UK

Posted 11 April 2021 - 02:29 AM

So, it is still important for the processing software to apply the colour correction matrix appropriate for the camera (DSLR or mirrorless). What astro-processing software today still does not do this? Does PI do it? Does APP do it? And out of interest, did older apps like ImagesPlus do it? (Since it is now free, I have wondered.)

There is no astro-processing software I am aware of that applies the CCM (colour correction matrix).  The CCM is applied only by raw converters such as those in Photoshop, Lightroom, RawTherapee and Affinity Photo. 

 

One exception is StarTools, which does have the option to apply the CCM for some cameras, but despite this it has proved impossible for me to obtain the correct colours because applying the CCM significantly distorted the white balance for some reason. Ivo, the author, once provided an example where he had to do something very weird to the colour balance to get around the problem:

https://www.cloudyni...ing/?p=10827652

 

Mark


Edited by sharkmelley, 11 April 2021 - 02:58 AM.


#79 Alen K

Alen K

    Vanguard

  • *****
  • Posts: 2,013
  • Joined: 25 Nov 2009

Posted 11 April 2021 - 02:46 PM

There is no astro-processing software I am aware of that applies the CCM (colour correction matrix).  The CCM is applied only by raw converters such as those in Photoshop, Lightroom, RawTherapee and Affinity Photo. 

May I ask then how you are dealing with the issue in your own processing?

 

You convinced me of the importance of the CCM from your first post starting this thread and it continues to surprise me that such a seemingly simple thing to implement (relatively speaking) continues to find no support in dedicated astrophoto processing programs. Given that Affinity Photo does support it and now supports calibrated stacking for astrophotos, it does seem to elevate the significance of that program. But besides the 32-bit floating point output bug, I'll bet it still doesn't do everything you need, such as Bayer stacking. 



#80 sharkmelley

sharkmelley

    Fly Me to the Moon

  • *****
  • topic starter
  • Posts: 7,435
  • Joined: 19 Feb 2013
  • Loc: UK

Posted 11 April 2021 - 06:12 PM

May I ask then how you are dealing with the issue in your own processing?

 

You convinced me of the importance of the CCM from your first post starting this thread and it continues to surprise me that such a seemingly simple thing to implement (relatively speaking) continues to find no support in dedicated astrophoto processing programs. Given that Affinity Photo does support it and now supports calibrated stacking for astrophotos, it does seem to elevate the significance of that program. But besides the 32-bit floating point output bug, I'll bet it still doesn't do everything you need, such as Bayer stacking. 

Anything can be done in PixInsight even if it is not directly supported!  I use PixelMath to apply the CCM (colour correction matrix).  

For example:  https://www.cloudyni...file/?p=8217576
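For readers without PixInsight, the operation itself is just a 3x3 matrix multiply on white-balanced linear RGB. A numpy sketch with a made-up matrix (the values below are illustrative only, not a real camera's CCM; each row sums to 1 so white stays white):

```python
import numpy as np

# Hypothetical CCM; each row sums to 1.0 so a white-balanced white is preserved.
CCM = np.array([
    [ 1.6, -0.4, -0.2],
    [-0.3,  1.5, -0.2],
    [ 0.0, -0.6,  1.6],
])

def apply_ccm(rgb, ccm):
    """Apply a colour correction matrix to linear RGB data of shape (..., 3)."""
    return np.asarray(rgb) @ ccm.T

white = np.array([1.0, 1.0, 1.0])
print(apply_ccm(white, CCM))       # white is unchanged: [1. 1. 1.]

# The off-diagonal negatives boost saturation: a dull red becomes a purer red.
print(apply_ccm(np.array([0.5, 0.2, 0.1]), CCM))
```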

 

Although Affinity Photo will apply the CCM when it opens a raw file for normal terrestrial photography, it doesn't appear to do so in Astrophotography Stack mode, which is a great shame. 

 

Mark


Edited by sharkmelley, 11 April 2021 - 06:12 PM.


#81 Alen K

Alen K

    Vanguard

  • *****
  • Posts: 2,013
  • Joined: 25 Nov 2009

Posted 11 April 2021 - 10:51 PM

Anything can be done in PixInsight even if it is not directly supported!  I use PixelMath to apply the CCM (colour correction matrix).  

For example:  https://www.cloudyni...file/?p=8217576

 

Although Affinity Photo will apply the CCM when it opens a raw file for normal terrestrial photography, it doesn't appear to do so in Astrophotography Stack mode, which is a great shame. 

Ouch. That sounds like a dumb programming decision. After calibration, why would they NOT simply use the same conversion from raw they use in the develop persona before aligning and stacking? Time for another "bug" report. 

 

Does Affinity Photo have anything like PixelMath? At first blush, it seemed to me that the procedural texture filter could possibly be used in a similar manner since it allows user-defined equations.

https://affinity.hel...cedural Texture



#82 Ivo Jager

Ivo Jager

    Vendor ( Star Tools )

  • *****
  • Vendors
  • Posts: 572
  • Joined: 19 Mar 2011
  • Loc: Melbourne, Australia

Posted 12 April 2021 - 01:10 AM

Color correction matrices are rather useless for AP, which is one of the reasons you will not find them in AP software or supplied by OSC manufacturers. The sacrilege is actually much worse - did you know mono CCD astrophotographers don't bother capturing violet (not purple - violet) with their LRGB filters, for example? No one bats an eyelid at that. And, really, it is for good reasons.

 

The matrices are typically derived from sampling a color chart (for example a Macbeth chart) under very specific terrestrial lighting conditions. The method of derivation is an ill-posed problem to begin with (meaning that there is no one specific solution, much like deconvolution, but rather a continuum of plausible solutions); that is, there are many different ways of constructing the matrix that all yield different, but plausible, results.

 

There are two main assumptions about the matrices that do not match outer space. One is the lighting condition (typically a D65 illuminant with matching power spectrum; midday daylight in Western/Northern Europe); the other is that everything in the scene is reflecting that illuminant.

 

These two assumptions obviously do not apply in outer space. There is no one illuminant with one specific power spectrum. Instead there are many different illuminants (for example stars, emissions, and their dust-filtered variants) with different power spectra and color temperatures. A lot of things are not reflective but emissive (emission nebulosity, stars, etc.), which is obviously not measured with a simple Macbeth chart.

 

There is also the matter of the stretch used for the rendering medium, which we AP-ers don't usually set to the target medium's specified stretch (~2.2 gamma for an sRGB monitor, for example) but much higher/different, just to make the faint detail visible; this too obviously impacts color.

 

That's the skinny. Color is a very, very deep rabbit hole full of weirdness.

 

That is not to say that anything goes when it comes to color; preserving color that has documentary value is actually one of the most overlooked aspects of AP, though fortunately we are starting to see more M42 cores in green (due to O-III dominance), more yellow galaxy cores (older stars, less star formation), and fewer red HII areas (yes Ha is red, but the other Balmer lines are blue and so is the reflection of the OB-class stars that power their emissions).

 

I implemented CCMs for some DSLR models in StarTools more as a curiosity and tech demo, rather than something that I would recommend (the tech demo bit being that it is possible in StarTools to recover the coloring and still re-weight the green channel for improved luminance SNR, thanks to ST's unique signal back and forward propagating engine). But even I would be the first to admit they have no bearing whatsoever on a "more correct" color rendering.

 

Hope this helps!



#83 sharkmelley

sharkmelley

    Fly Me to the Moon

  • *****
  • topic starter
  • Posts: 7,435
  • Joined: 19 Feb 2013
  • Loc: UK

Posted 12 April 2021 - 01:42 AM

Ouch. That sounds like a dumb programming decision. After calibration, why would they NOT simply use the same conversion from raw they use in the develop persona before aligning and stacking? Time for another "bug" report. 

 

Does Affinity Photo have anything like PixelMath? At first blush, it seemed to me that the procedural texture filter could possibly be used in a similar manner since it allows user-defined equations.

https://affinity.hel...cedural Texture

Yes, maybe I should report it as a bug or as a desired feature.  It doesn't make much sense to me the way it has been done.

 

Procedural texture filters are certainly one answer.  The image that comes out of the Astrophotography Stack mode is 32-bit linear with a linear profile applied.  Applying the CCM as a procedural texture filter should fix the problem and make the true colours appear.

 

Mark



#84 sharkmelley

sharkmelley

    Fly Me to the Moon

  • *****
  • topic starter
  • Posts: 7,435
  • Joined: 19 Feb 2013
  • Loc: UK

Posted 12 April 2021 - 02:12 AM

The matrices are typically derived from sampling a color chart (for example a Macbeth chart) under very specific terrestrial lighting conditions. The method of derivation is an ill-posed problem to begin with (meaning that there is no one specific solution, much like deconvolution, but rather a continuum of plausible solutions); that is, there are many different ways of constructing the matrix that all yield different, but plausible, results.

 

There are two main assumptions about the matrices that do not match outer space. One is the lighting condition (typically a D65 illuminant with matching power spectrum; midday daylight in Western/Northern Europe); the other is that everything in the scene is reflecting that illuminant.

 

These two assumptions obviously do not apply in outer space. There is no one illuminant with one specific power spectrum. Instead there are many different illuminants (for example stars, emissions, and their dust-filtered variants) with different power spectra and color temperatures. A lot of things are not reflective but emissive (emission nebulosity, stars, etc.), which is obviously not measured with a simple Macbeth chart.

 

There is also the matter of the stretch used for the rendering medium, which we AP-ers don't usually set to the target medium's specified stretch (~2.2 gamma for an sRGB monitor, for example) but much higher/different, just to make the faint detail visible; this too obviously impacts color.

 

That's the skinny. Color is a very, very deep rabbit hole full of weirdness.

Yes, of course it's true that the CCMs (colour correction matrices) are only approximations and are typically derived from calibrating a colour chart lit with known illuminants.  But it's incorrect to imply that it doesn't work well for emissive light sources.  Consider that terrestrial photographers are quite comfortable with capturing city nightscapes with their wide range of pretty light sources - no D65 illuminant there!  The same argument applies to images of the cosmos.

 

Also, there is no need for data stretching to affect colour - that's the point of colour preserving stretches such as arcsinh stretch.  But to be displayed correctly the data need the relevant colour space gamma applied for the profile attached to the data.  Linear data needs a linear profile (which Photoshop and Affinity Photo do automatically behind the scenes in 32-bit mode) whilst sRGB and AdobeRGB need their respective gamma curves applied.  This is where the "traditional" astro-processing workflow typically goes wrong - it displays linear data with a (non-linear) sRGB profile implicitly attached.  This is one of the reasons that a crazy amount of stretching is typically required to make the image look at all reasonable.
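To put a number on that mismatch: a display chain that assumes sRGB-encoded input effectively applies the sRGB decoding curve to whatever it is given, so linear data is darkened by a gamma that was never applied. A sketch using the standard sRGB formula (a generic illustration, not any package's actual code):

```python
def srgb_decode(v):
    """Invert the sRGB transfer function: encoded value -> linear light."""
    if v <= 0.04045:
        return v / 12.92
    return ((v + 0.055) / 1.055) ** 2.4

# Linear data of 0.2, wrongly treated as sRGB-encoded, is displayed at the
# light level an encoded 0.2 represents - roughly six times dimmer.
shown = srgb_decode(0.2)
print(shown)   # ~0.033 instead of 0.2
```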

 

Mark


Edited by sharkmelley, 12 April 2021 - 02:25 AM.


#85 Ivo Jager

Ivo Jager

    Vendor ( Star Tools )

  • *****
  • Vendors
  • Posts: 572
  • Joined: 19 Mar 2011
  • Loc: Melbourne, Australia

Posted 12 April 2021 - 05:15 AM

Yes, of course it's true that the CCMs (colour correction matrices) are only approximations and are typically derived from calibrating a colour chart lit with known illuminants.  But it's incorrect to imply that it doesn't work well for emissive light sources.  Consider that terrestrial photographers are quite comfortable with capturing city nightscapes with their wide range of pretty light sources - no D65 illuminant there!  The same argument applies to images of the cosmos.

 

Also, there is no need for data stretching to affect colour - that's the point of colour preserving stretches such as arcsinh stretch.  But to be displayed correctly the data need the relevant colour space gamma applied for the profile attached to the data.  Linear data needs a linear profile (which Photoshop and Affinity Photo do automatically behind the scenes in 32-bit mode) whilst sRGB and AdobeRGB need their respective gamma curves applied.  This is where the "traditional" astro-processing workflow typically goes wrong - it displays linear data with a (non-linear) sRGB profile implicitly attached.  This is one of the reasons that a crazy amount of stretching is typically required to make the image look at all reasonable.

 

Mark

Lots to unpack there Mark, which is why I said color is an incredibly deep rabbit hole.

 

But it's incorrect to imply that it doesn't work well for emissive light sources.

All light is, of course, emitted. What I'm getting at is that a Macbeth chart will not reflect any wavelength that is not contained in the illuminant's power spectrum. For example, under a power spectrum that consists of just one or two peaks (say H-alpha at 656nm and H-beta at 486nm), the chart simply cannot reflect anything else (in other words the reflected light would be bi-chromatic), as the radiant energy is confined to those two peaks.

 

 

Consider that terrestrial photographers are quite comfortable with capturing city nightscapes with their wide range of pretty light sources - no D65 illuminant there!  The same argument applies to images of the cosmos.

Indeed, night shots with different sorts of lighting in them (say fluorescent, LED and incandescent) look very odd (though pretty) and nothing like what the human eye would perceive.

It's not that you can't capture city nightscapes; it's just that the colors are off if you color correct them with the wrong white point or a matrix that does not accurately match the illuminant (hence, at the very least, the few different lighting presets on most consumer cameras, which twist the D65 illuminant-based matrix to behave like a black-body radiator at different temperatures). You will find that most basic nightscape photography tutorials start with setting a custom white balance to compensate for the yellower power spectrum, which is also what our eyes/brain would do.

 

Also, there is no need for data stretching to affect colour

As you very likely know, color is made up of hue, saturation and brightness. You cannot have color without brightness; it is an integral part of how we perceive color. Affect brightness (by stretching) and you still affect how a color is perceived, even if you keep the RGB ratios constant. That is, this happens regardless of color-ratio-preserving stretches (see the "brown" video I linked to earlier). Color ratio preservation is neat and useful, but does not take into account perceived color change. It would be more useful to do this in a color space that is more psychovisually uniform/mappable, like CIELAB space, rather than RGB space. This is in fact what StarTools does, but even that, of course, has its limits.

 

But to be displayed correctly the data need the relevant colour space gamma applied for the profile attached to the data

Unless I am misunderstanding you, RAW RGB tristimulus camera data does not come with a color space or profile; it is colorspace-less (or, looked at differently, "camera-specific"). It merely consists of digital representations of successful photon-to-electron conversions for red, (typically 2x) green and blue filtered pixels.

 

There is no stretch applied when converting camera-space RGB to XYZ. A white balance is often applied to the RAW RGB, but that is a linear multiply (i.e. a diagonal matrix with just one gain per channel and the rest set to 0), not a non-linear stretch. Once in XYZ space, the matrix is applied. Only when the XYZ values are converted to the target color space is a stretch applied.
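As a tiny numerical illustration of that point (the numbers are made up): the white balance is just one gain per channel, i.e. a diagonal matrix, with nothing non-linear involved:

```python
import numpy as np

raw = np.array([0.10, 0.25, 0.12])   # camera-space RGB of a patch that should be neutral
gains = 0.25 / raw                   # per-channel gains chosen to equalise the patch
balanced = np.diag(gains) @ raw      # a pure linear multiply - no stretch anywhere
print(balanced)                      # all three channels now equal: neutral grey
```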

 

Just so we're hopefully on the same page, this is what happens in an "ideal" RAW converter (as implemented by dcraw):

 

  Camera space RAW RGB -> white balance -> XYZ -> camera matrix -> corrected XYZ -> target color space

 

so for sRGB, that would be;

 

  Camera space RAW RGB -> white balance -> XYZ -> camera matrix -> corrected XYZ -> rgb (linear) -> gamma correction -> RGB (non-linear)

 

That is, the only time a stretch is/should be applied is at the last step, when the target color space demands it.

 

All that is unless you are talking about an optional/proprietary extra tone-mapping step "for flair", but that obviously makes the data non-linear in an unpredictable way and voids it for use by algorithms like deconvolution, gradient removal, etc. (i.e. it is not universally reversible by a simple color space conversion, nor does dcraw, for example, apply such a step).

 

A great "little" slide deck by M.S. Brown from the National University of Singapore that brings it all together is here. It is so incredibly comprehensive, yet simple, that if I ever have the pleasure of meeting Mr. Brown, his drinks will be on me.

 

Clear skies!



#86 sharkmelley

sharkmelley

    Fly Me to the Moon

  • *****
  • topic starter
  • Posts: 7,435
  • Joined: 19 Feb 2013
  • Loc: UK

Posted 12 April 2021 - 06:44 AM

Hi Ivo,

 

I think a colour managed workflow is a useful and interesting goal though we all recognise it has limitations.

 

Regarding the sequence:

 

Camera space RAW RGB -> white balance -> XYZ -> camera matrix -> corrected XYZ -> target color space

 

You may be right that this is the sequence of steps in the dcraw implementation but it's potentially confusing when written down like that.  The whole point of the colour matrix is to transform from the colour space of the camera to XYZ:

 

Camera space RAW RGB -> white balance -> camera matrix ->  XYZ -> target color space

 

There are of course other matrices available.  The one in the equation above is known as the forward matrix but the ones on the DXO site bypass XYZ entirely:

 

Camera space RAW RGB -> white balance ->  DXO camera matrix ->  sRGB (linear)
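These pipelines are all compositions of linear maps, so a combined DXO-style matrix is simply the product of the standard XYZ-to-linear-sRGB matrix and the camera's forward matrix. A numpy sketch (the forward matrix below is made up for illustration; the sRGB/D65 matrix is the standard one):

```python
import numpy as np

XYZ_TO_SRGB = np.array([                  # standard linear sRGB <- XYZ (D65)
    [ 3.2406, -1.5372, -0.4986],
    [-0.9689,  1.8758,  0.0415],
    [ 0.0557, -0.2040,  1.0570],
])

FORWARD = np.array([                      # hypothetical camera RGB -> XYZ matrix
    [0.6, 0.2, 0.2],
    [0.3, 0.6, 0.1],
    [0.1, 0.1, 0.8],
])

cam_rgb = np.array([0.2, 0.3, 0.1])       # white-balanced linear camera RGB

# Route 1: forward matrix to XYZ, then the standard matrix to linear sRGB.
via_xyz = XYZ_TO_SRGB @ (FORWARD @ cam_rgb)

# Route 2: one combined DXO-style matrix going straight to linear sRGB.
direct = (XYZ_TO_SRGB @ FORWARD) @ cam_rgb

print(np.allclose(via_xyz, direct))       # True - the two routes are identical
```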

 

Mark


  • Ivo Jager likes this

#87 Ivo Jager

Ivo Jager

    Vendor ( Star Tools )

  • *****
  • Vendors
  • Posts: 572
  • Joined: 19 Mar 2011
  • Loc: Melbourne, Australia

Posted 12 April 2021 - 08:02 AM

It's potentially confusing when written down like that.  The whole point of the colour matrix is to transform from the colour space of the camera

You're right of course; simplified like that, it is more elegant and easier to understand.


  • sharkmelley likes this

