Pix color puzzlement

15 replies to this topic

#1 arrowspace90

arrowspace90

    Viking 1

  • -----
  • topic starter
  • Posts: 606
  • Joined: 15 May 2009
  • Loc: United States

Posted 20 October 2020 - 09:56 AM

Yeah, I'm one of those guys trying to learn PixInsight, pretty much from scratch (meaning I have no previous PS skills at all).

I have for some months now been getting images using the nearly automatic Star Tools, sometimes with decent results (not always).  I felt like I needed to understand more about the tools of post-processing, so I am putting in some tedious effort to learn PI.

 

I used the same M33 data to produce an image in both ST and PI (using the ASI533 OSC camera).  Well, to me, the PI image is perhaps a bit less noisy.  That's a very good thing!  But I think, hands down, anyone would say that ST got the colors and PI left them behind someplace.

I followed all the beginner steps.  I used the Curves to brighten/darken and again for "Saturation".  But when I would pull the curve for saturation, pretty much nothing happened.  I know that ace processors get beautiful color with PI.  Even if I knew what I did wrong, perhaps I lack the skill to make it happen.

 

But does anyone have an opinion as to where this novice missed the color boat?  I would hardly be surprised if, given the vast complexity of PI, different users had different opinions.  Regardless, I will keep studying and hopefully at some point produce a better outcome.

Attached Thumbnails

  • M-33 Pix.jpg
  • ST M33.jpg

Edited by arrowspace90, 20 October 2020 - 10:05 AM.


#2 H-Alfa

H-Alfa

    Viking 1

  • -----
  • Posts: 738
  • Joined: 21 Sep 2006
  • Loc: Spain

Posted 20 October 2020 - 10:18 AM

Can you tell us your step-by-step workflow for the PI version? It would help to analyze the result.

My first bet would be that there's too much luminance.

Sent from my Mi 9 Lite via Tapatalk

Edited by H-Alfa, 20 October 2020 - 10:19 AM.

  • arrowspace90 likes this

#3 sharkmelley

sharkmelley

    Soyuz

  • *****
  • Posts: 3,874
  • Joined: 19 Feb 2013
  • Loc: UK

Posted 20 October 2020 - 10:46 AM

StarTools applies a colour preserving stretch to the data i.e. a stretch that does not dilute the colour in the way that the Curves process does.

 

The simplest similar process in PixInsight is ArcsinhStretch:

https://pixinsight.c...inhStretch.html

 

Mark
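
For readers who want to see the idea behind a colour-preserving stretch in code, here is a minimal numpy sketch of the principle. It is an illustration of the concept, not PixInsight's actual ArcsinhStretch implementation, and the simple per-pixel mean used as the luminance estimate is an assumption: the stretch factor is computed from the pixel's luminance and then applied identically to R, G and B, so the channel ratios that carry hue and saturation survive the stretch.

import numpy as np

def arcsinh_stretch(rgb, stretch=50.0, black_point=0.0):
    # Colour-preserving arcsinh stretch (conceptual sketch, not PI's exact code).
    # rgb: float array of shape (..., 3) holding linear data in [0, 1].
    rgb = np.clip(rgb - black_point, 0.0, None)                  # remove the pedestal
    lum = np.maximum(rgb.mean(axis=-1, keepdims=True), 1e-12)    # simple luminance estimate
    factor = np.arcsinh(stretch * lum) / (np.arcsinh(stretch) * lum)
    return np.clip(rgb * factor, 0.0, 1.0)                       # same factor on every channel

By contrast, pulling a Curves/HistogramTransformation stretch through each channel independently compresses the differences between the channels, which is exactly the colour dilution described above.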


  • arrowspace90 likes this

#4 arrowspace90

arrowspace90

    Viking 1

  • -----
  • topic starter
  • Posts: 606
  • Joined: 15 May 2009
  • Loc: United States

Posted 20 October 2020 - 10:54 AM

StarTools applies a colour preserving stretch to the data i.e. a stretch that does not dilute the colour in the way that the Curves process does.

 

The simplest similar process in PixInsight is ArcsinhStretch:

https://pixinsight.c...inhStretch.html

 

Mark

There's so much I don't know at all.  My absolute beginner tutorials didn't mention that one.  Thank you.  
And of course, for the time being, I will continue to process (ha, practice in the case of PI) with both programs, hoping to eventually get up to snuff.  I do also think I got my PI image too bright; I could probably fix that one.  You're supposed to use all these small previews, but at times they kept me from seeing the Big Picture.



#5 arrowspace90

arrowspace90

    Viking 1

  • -----
  • topic starter
  • Posts: 606
  • Joined: 15 May 2009
  • Loc: United States

Posted 20 October 2020 - 11:06 AM

Can you tell us your step-by-step workflow for the PI version? It would help to analyze the result.

My first bet would be that there's too much luminance.

Sent from my Mi 9 Lite via Tapatalk

Well, looking over my tutorial notes...

I stacked in APP, which I understand a lot of people do.  And from that point:

Crop

ABE/DBE

Color Calibration

Histogram Transformation

Multiscale Linear Transformation

Histogram Trans (again)

HDR

Local Histogram Equalization

Morphological Transformation Tool

Curves

Hist Trans again

 

Remember, these are mostly Greek to me; I might have left one out here, looking at my notes.



#6 H-Alfa

H-Alfa

    Viking 1

  • -----
  • Posts: 738
  • Joined: 21 Sep 2006
  • Loc: Spain

Posted 20 October 2020 - 12:11 PM

OK. Of these processes, LHE, HDR and Curves (tweaking the L too much) can be harmful to saturation. I suggest you take a look at them.

I suggest you use projects instead of working "only" with images. A project saves all the processing history included with the image, so you can go back, tweak, and reapply...

Sent from my Mi 9 Lite via Tapatalk

#7 arrowspace90

arrowspace90

    Viking 1

  • -----
  • topic starter
  • Posts: 606
  • Joined: 15 May 2009
  • Loc: United States

Posted 20 October 2020 - 02:09 PM

OK. Of these processes, LHE, HDR and Curves (tweaking the L too much) can be harmful to saturation. I suggest you take a look at them.

I suggest you use projects instead of working "only" with images. A project saves all the processing history included with the image, so you can go back, tweak, and reapply...

Sent from my Mi 9 Lite via Tapatalk

This is good to know!  Unfortunately, for the moment, that workflow is all I know.  I had to start at rock-bottom basics to begin to be familiar with the software.  I am now going to go back and re-try the Adam Block tutorials, having a slight sense of the PI framework.  Hopefully I will develop better workflows (I don't know how to work with projects or that cool-sounding ArcsinhStretch).  PI can be quite dizzying to the uninitiated.



#8 Stelios

Stelios

    Voyager 1

  • *****
  • Moderators
  • Posts: 10,449
  • Joined: 04 Oct 2003
  • Loc: West Hills, CA

Posted 20 October 2020 - 06:22 PM

One thing for dead sure is that you should try the Photometric Color Calibration (PCC) rather than CC. So much simpler. No BN (background neutralization) needed.

 

A key lesson from Adam is that if your Lum has areas over 0.8 in brightness, those areas will *not* accept color from the Chrominance. Using the right HDRMT (check his lessons on that, they are *excellent*) will help you smack Lum down. You can always enhance it after LRGB combine by ET (exponential transformation) not to mention CS (color saturation). 
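
To make the "bright luminance will not accept colour" point concrete, here is a small toy calculation. It is a deliberately simplified RGB model, not the CIE lightness/chroma math that LRGBCombination actually uses, and the pixel values and mean-based luminance are illustrative assumptions: as the target luminance climbs, the channels are pushed into clipping, the R:G:B ratios flatten, and the saturation falls, with no room left for colour at all at pure white.

import numpy as np

def apply_luminance(rgb, target_lum):
    # Rescale a pixel so its mean matches target_lum (toy stand-in for LRGB combination).
    rgb = np.asarray(rgb, dtype=float)
    return np.clip(rgb * (target_lum / rgb.mean()), 0.0, 1.0)

def saturation(rgb):
    # HSV-style saturation: (max - min) / max.
    return (rgb.max() - rgb.min()) / rgb.max()

colour = np.array([0.30, 0.15, 0.10])      # a fairly saturated warm pixel, saturation ~0.67
for target in (0.5, 0.8, 0.95):
    out = apply_luminance(colour, target)
    print(target, out.round(3), round(saturation(out), 3))
# At 0.5 the ratios (and the ~0.67 saturation) survive; at 0.8 and 0.95 the red
# channel clips at 1.0 and the saturation drops, heading towards zero as the
# luminance approaches pure white.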



#9 sharkmelley

sharkmelley

    Soyuz

  • *****
  • Posts: 3,874
  • Joined: 19 Feb 2013
  • Loc: UK

Posted 20 October 2020 - 10:35 PM

One thing for dead sure is that you should try the Photometric Color Calibration (PCC) rather than CC. So much simpler. No BN (background neutralization) needed.

 

A key lesson from Adam is that if your Lum has areas over 0.8 in brightness, those areas will *not* accept color from the Chrominance. Using the right HDRMT (check his lessons on that, they are *excellent*) will help you smack Lum down. You can always enhance it after LRGB combine by ET (exponential transformation) not to mention CS (color saturation). 

LRGB combine?  You realise the OP is using an OSC camera?

 

Mark 



#10 ngc1535

ngc1535

    Vendor - Caelum Observatory

  • -----
  • Vendors
  • Posts: 68
  • Joined: 18 Sep 2008

Posted 21 October 2020 - 12:44 AM

LRGB combine?  You realise the OP is using an OSC camera?

 

Mark 

That seems like unfair snark, Mark.

 

As you know, PixInsight processes operate on the Lightness/Luminance channel even though the image is in its combined RGB(L) form.

The generalization that Stelios mentioned is still good. 

Another way an image can lose saturation is through a process that effectively equalizes the brightness of the color channels (which is why, for example, you might want to apply such a process only to the Lightness instead of to all of the color channels).

 

Thus, some image processors choose to extract a separate luminance image from the RGB and work on it independently (sometimes referred to as a "synthetic luminance"). This offers some benefits for ease of processing, and the luminance is later recombined with the RGB image through LRGBCombination.
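
As a rough illustration of that round trip (extract a synthetic luminance, process it on its own, then recombine), here is a short numpy sketch. It is only a stand-in for PixInsight's ChannelExtraction and LRGBCombination: the Rec.709 weights, the ratio-preserving recombination, and the placeholder gamma "processing" step are assumptions for the example, not PI's actual internals.

import numpy as np

WEIGHTS = np.array([0.2126, 0.7152, 0.0722])   # Rec.709 luminance weights (assumed)

def extract_luminance(rgb):
    # Build a "synthetic luminance" image from RGB/OSC data.
    return rgb @ WEIGHTS

def recombine(rgb, new_lum):
    # Re-impose a separately processed luminance while keeping the R:G:B ratios
    # (crude stand-in for LRGBCombination, which really works in a CIE lightness/chroma space).
    old_lum = np.maximum(rgb @ WEIGHTS, 1e-12)
    return np.clip(rgb * (new_lum / old_lum)[..., None], 0.0, 1.0)

rgb = np.random.rand(64, 64, 3) * 0.2          # placeholder linear colour data
lum = extract_luminance(rgb)                   # work on this image independently...
lum_processed = np.sqrt(lum)                   # ...e.g. a stand-in stretch/denoise on L only
result = recombine(rgb, lum_processed)         # colour from one path, detail from the other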

 

The OP did not specifically say what method was being used (but based on the above, it is likely this was not done). There are certainly processes listed above that would make the color hard to maintain, especially if not applied only to the lightness of the image. One thing the OP did not include in the list, which is also a very likely reason the color is lost, is the method of noise reduction. To my eye (and of course everyone takes *great* joy in pointing out my mistakes as well!), it looks like the OP employed something like TGVDenoise, which if not done properly will certainly desaturate the image on top of everything else.

 

-adam



#11 sharkmelley

sharkmelley

    Soyuz

  • *****
  • Posts: 3,874
  • Joined: 19 Feb 2013
  • Loc: UK

Posted 21 October 2020 - 01:10 AM

That seems like unfair snark, Mark.

No snark intended, either to Stelios or yourself.  Sorry if it came across that way.

 

It's just that the suggestion to use LRGB combination might be confusing to a user whose workflow (see above) is not explicitly separating out luminance.

 

Mark



#12 Ivo Jager

Ivo Jager

    Vendor ( Star Tools )

  • *****
  • Vendors
  • Posts: 380
  • Joined: 19 Mar 2011
  • Loc: Melbourne, Australia

Posted 21 October 2020 - 07:25 AM

In traditional applications (i.e. not ST), you will want to do your color balancing on your linear data, i.e. before you stretch (visual stretch excepted, of course).

You can process (stretch) your data afterwards. This will squash/stretch the coloring along with it, which, depending on your preferences, may be objectionable.
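
A tiny worked example of that squashing effect (the pixel values and the gamma curve are arbitrary, chosen only to stand in for any non-linear per-channel stretch): applying the stretch to each channel independently compresses the differences between the channels, so the colour is diluted.

import numpy as np

pixel = np.array([0.30, 0.15, 0.10])     # linear R, G, B
stretched = pixel ** 0.4                 # toy non-linear stretch applied per channel

print(pixel[0] / pixel[2])               # R/B ratio before the stretch: 3.0
print(stretched[0] / stretched[2])       # R/B ratio after the stretch: ~1.55

Carrying the colour forward from the linear data, or stretching only the luminance as in the two-workflow approach described just below, avoids this ratio compression.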

 

The PI equivalent of a typical ST workflow is actually two separate workflows: one for your luminance data and one for your color data.

In PI you would combine the two when you're done processing both; color from the one, detail from the other.

 

StarTools applies a colour preserving stretch to the data i.e. a stretch that does not dilute the colour in the way that the Curves process does.

 

The simplest similar process in PixInsight is ArcsinhStretch:

https://pixinsight.c...inhStretch.html

 

Mark

StarTools does not perform any color preserving stretches (it doesn't need to); luminance and color are processed completely separately. Even if you choose not to process color and luminance separately, color is still forward propagated from the linear data - it does not use the stretched/squashed RGB values from any other processing steps you may have applied to the image.

 

@arrowspace90, it sounds like you might find this long-running PI<->ST translation table useful, which lists some equivalent processes and/or how things in ST work differently to PI and vice-versa.


  • arrowspace90 and sharkmelley like this

#13 arrowspace90

arrowspace90

    Viking 1

  • -----
  • topic starter
  • Posts: 606
  • Joined: 15 May 2009
  • Loc: United States

Posted 21 October 2020 - 09:56 AM

In traditional applications (i.e. not ST), you will want to do your color balancing on your linear data, i.e. before you stretch (visual stretch excepted, of course).

You can process (stretch) your data afterwards. This will squash/stretch the coloring along with it, which, depending on your preferences, may be objectionable.

 

The PI equivalent of a typical ST workflow is actually two separate workflows: one for your luminance data and one for your color data.

In PI you would combine the two when you're done processing both; color from the one, detail from the other.

 

StarTools does not perform any color preserving stretches (it doesn't need to); luminance and color are processed completely separately. Even if you choose not to process color and luminance separately, color is still forward propagated from the linear data - it does not use the stretched/squashed RGB values from any other processing steps you may have applied to the image.

 

@arrowspace90, it sounds like you might find this long-running PI<->ST translation table useful, which lists some equivalent processes and/or how things in ST work differently to PI and vice-versa.

Mr. Jager, you are one of the smartest people I know of in the AP arena.  I only decided to learn PI after you said people should know its principles of processing.  I was very impressed by the color that ST pulled out of its hat for my M33.  I will obviously continue to use it, and I will definitely check out the provided link.  If I knew what I was doing, I would be dangerous!  Thank you for commenting; ha, I'm not worthy!



#14 arrowspace90

arrowspace90

    Viking 1

  • -----
  • topic starter
  • Posts: 606
  • Joined: 15 May 2009
  • Loc: United States

Posted 21 October 2020 - 10:00 AM

One thing for dead sure is that you should try the Photometric Color Calibration (PCC) rather than CC. So much simpler. No BN (background neutralization) needed.

 

A key lesson from Adam is that if your Lum has areas over 0.8 in brightness, those areas will *not* accept color from the Chrominance. Using the right HDRMT (check his lessons on that, they are *excellent*) will help you smack Lum down. You can always enhance it after LRGB combine by ET (exponential transformation) not to mention CS (color saturation). 

Yes sir, as a matter of fact, I did that, though I didn't say it in the workflow.  The beginner tutorial showed both ways, and I tried both ways.  The photometric method takes some of my ability to make mistakes out of the process!



#15 Ivo Jager

Ivo Jager

    Vendor ( Star Tools )

  • *****
  • Vendors
  • Posts: 380
  • Joined: 19 Mar 2011
  • Loc: Melbourne, Australia

Posted 21 October 2020 - 07:56 PM

Mr. Jager, you are one of the smartest people I know of in the AP arena.

The "smarts" are quite specific to astronomical signal processing; as the subject moves further and further away from a computer with compiler, the smarts drop precipitously. lol.gif

 

 

I only decided to learn PI after you said people should know its principles of processing.

Absolutely. It's a fantastic way to get hands-on with the naked, basic algorithms that underpin ST without the added "complexity" (from a signal processing point of view, not from an interface/workflow point of view!) of Tracking propagating signal back and forth.

 

One important note, however: when delving into signal processing in earnest, you will learn the most if you use the cleanest possible datasets you can find. This way, you eliminate many variables that may (or may not) influence how an algorithm or module works in isolation.

 

I'm worried that you may not get the most out of your PI trial with your current dataset. For example, something like deconvolution will - depending on its sophistication - fail miserably on noisy datasets.

 

 

 

If I knew what I was doing, I would be dangerous!

The more you can make this exercise/evaluation about what is fundamentally happening to your signal the better.

For example, if you are not yet across what a Point Spread Function is, or what happens when you non-linearly stretch your signal, or why color balancing is not appropriate on stretched data, then I'm worried you will not get too much out of your 45-day trial for the purpose of learning about those important aspects. You'd just be learning how to use new software, with new modules with new names, used in new (probably longer) workflows. The fundamental teachings of what these modules do to your signal at what point in time, and how they affect other modules at subsequent points in time, will be lost. And that would be a great shame!


Edited by Ivo Jager, 21 October 2020 - 07:57 PM.


#16 arrowspace90

arrowspace90

    Viking 1

  • -----
  • topic starter
  • Posts: 606
  • Joined: 15 May 2009
  • Loc: United States

Posted 22 October 2020 - 08:21 AM

“I'm worried that you may not get the most out of your PI trial with your current dataset. For example, something like deconvolution will - depending on its sophistication - fail miserably on noisy datasets.”

 

Well, ya got me there.  I live in a Bortle 6 1/2 suburb, and Mt Olympus is a long drive with the equipment.

Actually, the APP quality scores on the M33 frames were in the 600/700 range, something I unfortunately rarely see.  400/300s are more common.  It’s also hot and humid in the summer/fall.  Conditions many of us are stuck with.

And, c’est la vie, I do now own PI.

I'm becoming accustomed to the boulder rolling back down the mountain every day, sometimes over me.  So far, I keep pushing it back up.

 

Later:  And Mr. Jager doesn't mention flats in conjunction with noise, but I can safely assume this is typically the first thing people point to for problems getting clean data.

Well, if I posted a screenshot of my histogram from taking the LED panel/tee shirt flats, the peak of the light curve would be just a hair to the left of the center of the range between the black point and white point.  It comes nowhere near clipping either end.  I read that this should work.


Edited by arrowspace90, 22 October 2020 - 09:36 AM.


