A question about setting the grey/midpoint using the "Levels" tool

6 replies to this topic

#1 descott12

descott12

    Apollo

  • -----
  • topic starter
  • Posts: 1248
  • Joined: 28 Sep 2018
  • Loc: Charlotte, NC

Posted 19 June 2019 - 07:43 PM

Hello all,
I am pretty new to image processing and I am using the Levels tool in Gimp. I understand what it means to set the black or white level, but it is not completely clear what actually happens to the pixel values when you change the grey/midpoint value. The values range from 10.0 on the left, to 1.0 in the center, to 0.1 all the way to the right. Moving to the left clearly brightens and moving to the right darkens. But my question is: what actually happens to a pixel with an RGB value of, say, 200,100,150 when you change the midpoint?

 

Thanks in advance to anyone that can explain this in terms of the mathematics.

 



#2 Jon Rista

Jon Rista

    ISS

  • *****
  • Posts: 23803
  • Joined: 10 Jan 2014
  • Loc: Colorado

Posted 19 June 2019 - 11:16 PM

I cannot tell you the exact formula used by Gimp, but a "Levels" tool is usually some kind of midtone transfer function (MTF). In PixInsight, HistogramTransformation is basically the levels tool. Its formula is as follows:

 

           --
           | 0      ; x = 0
           | 0.5    ; x = m
MTF(x,m) = | 1      ; x = 1
           | R(x,m) ; otherwise
           --

 

R(x,m) = [(m - 1)*x] / [(2*m - 1)*x - m]
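
If it helps to see it as code, here is a minimal Python sketch of that formula (it mirrors the documented PixInsight math; Gimp almost certainly uses its own implementation):

def mtf(x, m):
    # Midtone transfer function, as documented for PixInsight's
    # HistogramTransformation. x and m are floats in the range [0, 1].
    if x == 0.0:
        return 0.0
    if x == m:
        return 0.5
    if x == 1.0:
        return 1.0
    # R(x, m) for everything else
    return ((m - 1.0) * x) / ((2.0 * m - 1.0) * x - m)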

 

 

Gimp's formula is most likely not the same here, but it is probably something similar. By default in PixInsight, the midpoint is m = 0.5. If you shift the midpoint to the left, say to 0.25, then a pixel with the value 200 (BTW, all pixel values are floats in the range 0.0 - 1.0 in PixInsight...if we assume 200 is on a 16-bit scale, that would be 0.00305) would run through the function R(x,m):

 

R(x,m) = [(0.25 - 1)*0.00305] / [(2*0.25 - 1)*0.00305 - 0.25]
       = [-0.75*0.00305] / [-0.5*0.00305 - 0.25]
       = -0.0022875 / -0.251525
       = 0.0091

 

So your original value of 0.00305 (200 DN out of 65535 DN) grew to 0.0091. On a 16-bit scale, that is 596 DN, so you almost tripled the value of this 200 DN pixel (which, in your example, would just be the red channel). Now, let's say you shift the midpoint down to 0.01:

 

R(x,m) = [(0.01 - 1)*0.00305] / [(2*0.01 - 1)*0.00305 - 0.01]
       = [-0.99*0.00305] / [-0.98*0.00305 - 0.01]
       = -0.0030195 / -0.012989
       = 0.2325

 

This is a much more significant change. Your 200 DN red value has now increased to 15237 DN.
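
You can verify those two numbers yourself with a few lines of Python using the same R(x,m); converting back to 16-bit DN is just a matter of multiplying by 65535:

def r(x, m):
    # R(x, m) from the MTF above
    return ((m - 1.0) * x) / ((2.0 * m - 1.0) * x - m)

x = 200 / 65535                     # 200 DN on a 16-bit scale
print(round(r(x, 0.25) * 65535))    # ~596 DN, the first example
print(round(r(x, 0.01) * 65535))    # ~15240 DN, the second example (small rounding differences aside)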

 

Again, this is how PixInsight does it. I only know this because it is documented (which is not all that common for other software). :p

 

https://pixinsight.c...sformation.html


  • t_image likes this

#3 t_image

t_image

    Gemini

  • -----
  • Posts: 3287
  • Joined: 22 Jul 2015

Posted 19 June 2019 - 11:21 PM

Welcome to processing!

I don't own Gimp but I'm a tiny bit aware of color tools.

My first question for you is have you tried to experiment and see what happens to the values numerically?

 

Most color tools in different programs are pretty much a black box as far as the specific math operations go.

So you are, in a way, asking someone else to do the specific work for you,

and as far as editing programs go, I'd think those who have done such intense color investigations (like the extensive work of Mark Shelley here on CN) are using software with a price tag, like the industry-standard Adobe Photoshop.

 

You may benefit from your personal experimenting.

Photoshop has a color picker and info box that allows you to hover over a color and see the RGB values in the bit depth of your choice.

I'd be surprised if Gimp didn't have such a tool.

 

I created a set of RGB/W chips that are labeled 0-255 and cover the full range of values here:

https://imgur.com/a/e5C4X

 

This allows you to easily hover over a chip, see its initial (labeled) value, and then test what your changes have done......

 

You can download and practice color correction/grading on and see how your tools work.

 

Notice that if you hover over the labeled chips, they will initially have the labeled value -- red 200 will be 200,0,0, white 200 will be 200,200,200, blue 200 will be 0,0,200, etc....

 

 

Note that in Gimp you should have access to a whole host of color tools, where you can also isolate changes to specific channels, or apply selection masks or color ranges to restrict changes to specific parts of the image, etc.....

In your levels adjustment you may have a drop-down that you can toggle between RGB and each color channel separately, and there may be a checkbox about mixing/preserving luminosity so that all the channels are affected together.

 

Unfortunately, only better video editing software has the benefit of waveform, RGB parade, and vectorscope displays that give you precise graphical feedback on the changes without having to trust your eyes, the calibration of your display, or the RGB info tool....At least the histogram you do have gives you a rough idea.....

 

You can easily create color chips for yourself with, say, an RGB value of 200,100,150, apply the change, then hover over and look at the value to see what happened (a quick sketch for doing this in code follows).....
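
For example, if you'd rather generate a test chip programmatically, a few lines of Python with the Pillow library (just one option; any image library will do) can make one:

from PIL import Image

# A flat 200 x 200 px chip filled with the RGB value 200,100,150 to experiment on.
chip = Image.new("RGB", (200, 200), (200, 100, 150))
chip.save("chip_200_100_150.png")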

 

You will be better for it by experimenting.

 

However generally:

  • The black point (or shadows, or lift): moving it in toward the midtones pulls more pixels down to 0 lum ('brightness' - not the best technical term). This means you clip and lose information in the shadows. Lifting the shadows instead will make previously hidden noise visible in the image, but calibration frames, stacking, and hot-pixel removal should eliminate most of that....You might want the faint details that may live down here....
  • Moving the white point (or highlights, or brights) toward white will push more pixels toward the 255 value (or 65535 in 16 bits per color)....This means that if you do max pixels out, you clip the highlights, lose detail in the highlights, and also lose all color information in stars (since R, G, and B all at 255 = pure white).....
  • Moving the midpoint (or midtones, or gamma) one way will raise some dark pixels into the midtones and/or move midtone pixels into the highlights; moved the other way, it will move some highlight pixels down to the midtones or midtone pixels down to the shadows.... The three points work off each other, so the range is bookended by the black and white points (see the sketch after this list).
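
Gimp doesn't document the exact math here (as far as I know), but a very common way a levels tool is implemented is: normalize by the black and white points, then apply the midpoint as a gamma exponent. A rough Python sketch of that general idea (my guess at the shape of it, not Gimp's actual source):

def levels(value, black=0, white=255, gamma=1.0):
    # value, black and white are on a 0-255 scale; gamma is the midpoint slider
    # (the Gimp slider described above shows about 10.0 at the far left down to 0.1 at the far right).
    x = (value - black) / float(white - black)
    x = min(max(x, 0.0), 1.0)            # clip below the black point / above the white point
    return round(255 * x ** (1.0 / gamma))

print(levels(100))                       # gamma 1.0: unchanged, 100
print(levels(100, gamma=2.0))            # midpoint moved left: brightens to ~160
print(levels(100, gamma=0.5))            # midpoint moved right: darkens to ~39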

Note that, coming from linear data that has been stacked/processed, your image will have more dynamic range than could possibly be displayed on current technology,

or be contained in most image files in a typical color gamut (sRGB/Adobe RGB).

Your sensor captures a huge range from darks to brights. So in color correction, and especially in a levels adjustment,

you are deciding where that information should be spread out.

You can either:

  • crush pixels down to black,
  • squeeze them into a more limited range of grays/mid-saturated colors,
  • or push them to max out at pure white.

It's a process of trade-offs and compromises.....but it can also work to the editor's advantage.

Best to crush noise to the black floor, keep some fun faint stuff off the black floor,

spread the grays as wide as possible depending on your subject, (and note you don't have to equally spread the midtones out),

and keep the highlights from clipping to pure white unless you want a particular region showing that way (without any color).....

 

Note there is complexity to how all this works,

since the math behind the values is a bit odd.

Due to the particular response curve of the human eye across brights/mids/shadows, and across color, the relationship isn't linear.

The 'just-noticeable difference' between values differs across the range, although I believe the typical reported RGB numbers already have that response calibration applied for JND....

Color science is a deep rabbit hole........

 

The colors on screen are based on the ratios of the RGB 'brightness' values (i.e. 0-255)....So a levels adjustment in RGB/luminosity-mix style will adjust the brightness and make colors brighter/darker that way.

Working in separate channels will offset the individual R, G, or B values separately (if the luminosity mix is unchecked?), which will change the balance and the RGB mix, and thus change some colors as well. Some tools allow you to pull one color up/down and it will have a black-box effect on the other two primaries....

Levels works with brightness.

HSL is hue, saturation, lightness. H (hue) is the spectrum color point (is it purple or orange or yellow, etc.), saturation is how rich the mix of that spectrum color is (dull = more grayish vs. vibrant), and L is lightness or brightness.
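
If you want to see those HSL numbers for a given pixel, Python's standard colorsys module (just one easy option) will convert RGB for you -- note it uses HLS ordering and 0-1 floats:

import colorsys

# The 200,100,150 example pixel, scaled to 0-1 floats.
r, g, b = 200 / 255, 100 / 255, 150 / 255
h, l, s = colorsys.rgb_to_hls(r, g, b)
print(round(h * 360), round(s * 100), round(l * 100))   # hue in degrees, saturation and lightness in percent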

Say you have a green cast (nothing is green in space except comets [unless doing the Hubble palette with narrowband]); then you pull the green down in the mix while keeping the reds and blues up.....Most times you can add an offset of the opposing color to remove a cast, but there are better, more advanced tools for things like this as well....
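
As a crude illustration of pulling one channel down, here is a small numpy sketch (my own example, not any particular tool's method) that scales just the green channel; real cast-removal tools are much smarter, but the principle is similar:

import numpy as np

def reduce_green(img, factor=0.9):
    # img is an H x W x 3 uint8 array; scale only the green channel down.
    out = img.astype(np.float32)
    out[..., 1] *= factor
    return np.clip(np.round(out), 0, 255).astype(np.uint8)

# A flat greenish test patch.
patch = np.zeros((10, 10, 3), dtype=np.uint8)
patch[...] = (120, 180, 130)
print(reduce_green(patch)[0, 0])   # green drops from 180 to 162; red and blue untouched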

 

Hope this helps!



#4 Jon Rista

Jon Rista

    ISS

  • *****
  • Posts: 23803
  • Joined: 10 Jan 2014
  • Loc: Colorado

Posted 19 June 2019 - 11:32 PM

Oh, a couple other examples. The change in value when adjusting the midpoint is non-linear. A pixel value, x, close to 1.0 will not change much if you shift the midpoint, m, down (to the left):

 

R(x,m) = [(0.1 - 1)*0.95] / [(2*0.1 - 1)*0.95 - 0.1]
       = [-0.9*0.95] / [-0.8*0.95 - 0.1]
       = -0.855 / -0.86
       = 0.9942

 

So a very bright value, 62258 DN, only increases to about 65155 DN, a relative gain of less than 5%. Compare that to a value of 200 DN:

 

R(x,m) = [(0.1 - 1)*0.00305] / [(2*0.1 - 1)*0.00305 - 0.1]
       = [-0.9*0.00305] / [-0.8*0.00305 - 0.1]
       = -0.002745 / -0.10244
       = 0.0268

 

That lifts the 200 DN value to about 1756 DN, nearly nine times what it was. Hence, when you shift the midpoint down (left), you compress values, x, above the midpoint, m, more and more the farther above m they are. Such is how you transform a linear image into a non-linear image. Note that shifting only the black or white point will keep the signal linear. You may clip signal if you shift the white and black points in too far, but they will keep the distribution of the signal between those two points linear (although the signal will still be stretched). It takes shifting the midpoint to make the signal non-linear with the levels tool.
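
To see that compression numerically, you can run a few values through the same R(x,m) (PixInsight's documented formula again, used here purely to illustrate the behavior):

def r(x, m):
    return ((m - 1.0) * x) / ((2.0 * m - 1.0) * x - m)

m = 0.1
for x in (0.00305, 0.25, 0.5, 0.95):
    print(x, "->", round(r(x, m), 4))
# faint values are multiplied many times over, while values near 1.0 barely move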


Edited by Jon Rista, 19 June 2019 - 11:36 PM.


#5 t-ara-fan

t-ara-fan

    Viking 1

  • -----
  • Posts: 758
  • Joined: 20 Sep 2017
  • Loc: 50° 13' N

Posted 20 June 2019 - 12:08 AM

Somewhat off topic, have you ever tried  stretching/processing with rnc-color-stretch?  

 

It is pretty amazing, taking your stacked image and stretching it while maintaining natural color.  I used it for this pic, one step processing after a few minutes of trials to decide how far to stretch.  I like it because it takes a scientific approach to accurate color, rather than just moving sliders around until "it looks nice".



#6 descott12

descott12

    Apollo

  • -----
  • topic starter
  • Posts: 1248
  • Joined: 28 Sep 2018
  • Loc: Charlotte, NC

Posted 20 June 2019 - 06:05 AM

I cannot tell you the exact formula used by Gimp, but a "Levels" tool is usually some kind of midtone transfer function (MTF). In PixInsight, HistogramTransformation is basically the levels tool. Its formula is as follows:

 

           --
           | 0      ; x = 0
           | 0.5    ; x = m
MTF(x,m) = | 1      ; x = 1
           | R(x,m) ; otherwise
           --

 

R(x,m) = [(m - 1)*x] / [(2*m - 1)*x - m]

 

.....

Jon, you are awesome, thanks a lot! This is exactly what I needed, and I wasn't sure it existed anywhere.



#7 descott12

descott12

    Apollo

  • -----
  • topic starter
  • Posts: 1248
  • Joined: 28 Sep 2018
  • Loc: Charlotte, NC

Posted 20 June 2019 - 06:07 AM

Welcome to processing!

I don't own Gimp but I'm a tiny bit aware of color tools.

My first question for you is have you tried to experiment and see what happens to the values numerically?

 

......

 

Hope this helps!

Yes, this helps a ton! I started to experiment in Gimp to see how a specific pixel value would change and quickly realized that it was not a straightforward transformation. Thanks again.


Edited by descott12, 20 June 2019 - 06:08 AM.


