
Why our color is not "realistic" Part 1 - Theory

52 replies to this topic

#1 bobzeq25

bobzeq25

    Hubble

  • *****
  • topic starter
  • Posts: 17,392
  • Joined: 27 Oct 2014

Posted 03 December 2019 - 11:30 AM

Beginning imagers often wonder whether the color in our astrophotos is "realistic".  The answer is that it's not, for any number of reasons.  A fundamental one is that our eyes don't see color at all well in dim settings.  Getting closer to the target in a spaceship doesn't fix that: the overall brightness may increase, but the target gets larger, and its surface brightness does not go up as fast.  Our images are _not_ like getting close in a spaceship.

 

This will be unavoidably long and technical, and it only scratches the surface.  Do read the first 6 (short) paragraphs, down to "here we go".  After that, when your eyes (those imperfect tools) start to glaze over, just skip down to the last 8 (even shorter) paragraphs, beginning with "welcome back".

 

Definitions.  "Better" color and images are ones that look better to you and/or others.  "Natural" color is color you and/or others find matches your mental concept of what astro color _should_ look like.  That's a subjective thing; color in our astrophotography is subjective in all kinds of ways, some covered here, some not.  "Clever" means a method that produces a decent result, although not a perfect one.

 

There are three main reasons our color can never be perfectly "realistic".  The first is maybe the biggest: the filters we use to sort out color destroy a very large amount of color information.  How much depends on the filters (we'll spend some time on that), but it's always most of it.  The methods we use to try to restore it are inherently flawed; with so much information missing, any reconstruction of reality simply has to be an approximation.  What's gone is gone.  Restoring it, in some ways, resembles building a perpetual motion machine.  Entropy is involved.

 

The second is that both our eyes and the display methods we use are flawed.  And they get a lot of use in the third area.

 

Third, the processing tools we use to try (often successfully) to bridge the gaps caused by the first two are always imprecise; they could not be otherwise.  And in adjusting the processing we rely on our subjective, variable eyes and our imprecise display devices.

 

Here we go.  Filters.  Best case: a mono camera and good RGB interference filters.  Even then, a vast amount of information is utterly destroyed.  Take a deep red signal and one that is not so "red", say, radiation at 690 nm and at 600 nm.  Pass that radiation through a "red" filter and both produce _exactly_ the same electrons.  Tremendous information loss.
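
A toy numerical sketch of that information loss (the function name, passband, and photon counts here are invented for illustration; real filters also have sloped response curves and sensor QE on top of this):

```python
# Toy model: a broadband "red" filter integrates every wavelength in its
# passband into a single electron count, so 690 nm and 600 nm sources
# become indistinguishable after the filter.
def red_filter_signal(photons_by_wavelength, passband=(580, 700)):
    """Sum photon counts whose wavelength (nm) falls inside the passband."""
    lo, hi = passband
    return sum(n for wl, n in photons_by_wavelength.items() if lo <= wl <= hi)

deep_red   = {690: 1000}   # monochromatic 690 nm source
orange_red = {600: 1000}   # monochromatic 600 nm source

# Both sources yield exactly the same signal; wavelength identity is gone.
assert red_filter_signal(deep_red) == red_filter_signal(orange_red) == 1000
```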

 

Worst case.  Narrowband filters, like the set Ha, O(III) and S(II).  Almost the entire spectrum has been lost.  Both Ha and S(II) are red, which causes problems.  As a result, imagers often totally give up on any attempt at natural color.  Instead they use schemes like the Hubble palette to make images that are both pretty and tell a decent story about structure.  They appropriately label those "false color".

 

In between: LRGB imaging.  This is a clever trick that uses the imperfect nature of our eyes.  We see detail far better in black and white than in color.  By spending some time taking black and white (luminance) subs, we can produce better images in less time.  That doesn't come for free.  Adding L to RGB data inevitably dilutes color.  We can process it to restore the color decently, but never perfectly, because of the _very_ non-linear nature of our data.  There is an interesting thread on the PI forum about that.  A consensus (including Juan Conejero, the guy behind PixInsight, who knows a great deal about color) is that, particularly in light-polluted skies, LRGB produces better images faster, at the cost of less natural color.  An interesting minor point is that he supports binning color (a technique that goes in and out of fashion), on the basis that once you accept the tradeoff, binning color doesn't make it much worse.  So, once you've decided to do LRGB, you might as well save even more imaging time by binning color.  I tend to agree.
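
A minimal sketch of the dilution effect, using a deliberately simplified 50/50 blend and a simple mean as the luminance, as a stand-in for the real (tool-specific) LRGB combine:

```python
# Toy model of why blending a luminance (L) channel into RGB dilutes color:
# pulling every channel toward a common gray value squeezes max and min
# together, which is exactly a drop in saturation.
def blend_l_into_rgb(rgb, weight=0.5):
    """Hypothetical 50/50 L+RGB blend; real LRGB combines differ by tool."""
    L = sum(rgb) / 3.0                      # simple mean as the L channel
    return tuple((1 - weight) * c + weight * L for c in rgb)

def saturation(rgb):
    """HSV-style saturation: (max - min) / max."""
    return (max(rgb) - min(rgb)) / max(rgb)

before = (0.8, 0.2, 0.2)                    # a fairly saturated red pixel
after = blend_l_into_rgb(before)
assert saturation(after) < saturation(before)   # color is diluted
```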

 

Using one shot color through a standard Bayer matrix deserves special attention with regard to destruction of information.  The underlying fact is that the Bayer matrix was, and is, designed to sell more cameras by both being cheap, and producing pictures that people subjectively prefer.  Do those things well, and you sell more cameras.

 

Reality, dearie, has nothing to do with it.  <grin>

 

Two things that are no problem for doing that with terrestrial pictures are significantly bad for generating good color data (to the extent that any RGB filter can) in astrophotography.  50% of the pixels are devoted to green.  Given the very low signal-to-noise ratio of our data, that's a problem.  Tools to reduce excess green are _very_ commonly used in processing.  That green is not real; it's an artifact emblematic of our imprecision.  Electrons generated there are wasted.
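
The 50% figure comes straight from the repeating Bayer tile; a two-line check:

```python
# The standard RGGB Bayer tile repeats across the sensor:
# half of all photosites sit behind a green filter.
bayer_tile = [["R", "G"],
              ["G", "B"]]

flat = [c for row in bayer_tile for c in row]
assert flat.count("G") / len(flat) == 0.5   # 50% of pixels are green
assert flat.count("R") == flat.count("B") == 1
```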

 

Needless to say, the green reduction tools are also imprecise.  PI has a number of parameters that can be adjusted in SCNR; my PI course from the incomparable Vicent Peris discussed why you might prefer one value or another.
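
For the curious, the idea behind SCNR's documented "average neutral" option can be sketched in a few lines (the pixel values here are illustrative, and real SCNR offers several protection methods beyond this one):

```python
# Sketch of SCNR-style green reduction ("average neutral" variant):
# clamp green to the mean of red and blue, so green can never dominate.
# The `amount` parameter blends between original and corrected green.
def scnr_average_neutral(r, g, b, amount=1.0):
    g_corrected = min(g, (r + b) / 2.0)
    return g * (1.0 - amount) + g_corrected * amount

# A green-dominant pixel gets pulled back toward neutral:
assert scnr_average_neutral(0.25, 0.8, 0.25) == 0.25
# Pixels where green is not dominant are untouched:
assert scnr_average_neutral(0.5, 0.25, 0.5) == 0.25
```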

 

While any camera can produce some excess green, DSLRs are on the "worse" end.  And the colored-glass filters are _way_ sloppy in parsing color.  Now not only has information about various shades of red been lost; the R, G, and B passbands overlap to a degree that blurs even the separation between the three.  People going to mono plus RGB are often surprised to find color processing is now easier.

 

Still in between, but tending toward narrowband: the very popular duo- and tri-band filters recently released.  They pass little information, but more than narrowband.  Moving back toward LRGB in terms of color data: broadband light pollution filters.  A mixed bag.  The more powerful they are at rejecting light pollution, the more color information they destroy.  The CLS is a good example; posts about color issues with a CLS are common.  The less aggressive IDAS is often preferred by imagers who use broadband LP filters, for that reason.

 

I just try to stay away from broadband LP filters altogether, because I don't like the color issues, and I think other alternatives for dealing with light pollution are superior.  It's not an outlier position.

 

The process by which we separate OSC data from a Bayer matrix filter into RGB channels is called debayering.  I imagine the data is quite relieved to be debayered <smile>.  But you don't want to do that naively; using only the measured pixels would reduce resolution, because we'd have specific red pixels, say, with gaps between them.  Instead there are a variety of interpolation schemes for assigning, say, a red value to a green pixel.  The clever trick works well for resolution, and passably well for color (the eyes are helpful there), although it's imperfect.  The best debayering technique is yet another debate.
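
A stripped-down illustration of that interpolation, using bilinear averaging (the simplest scheme; the mosaic values are made up):

```python
# Minimal sketch of the interpolation idea behind debayering: a green
# photosite has no red measurement, so a red value is estimated from
# neighboring red photosites.
# Raw mosaic (RGGB pattern), one number per photosite:
raw = [
    [10, 50, 12, 52],   # R  G  R  G
    [48, 90, 46, 88],   # G  B  G  B
    [14, 54, 16, 56],   # R  G  R  G
    [44, 86, 42, 84],   # G  B  G  B
]

# Estimate red at the green photosite (row 0, col 1): average the red
# neighbors to its left and right.
red_at_green = (raw[0][0] + raw[0][2]) / 2.0
assert red_at_green == 11.0

# Estimate red at the blue photosite (row 1, col 1): average the four
# diagonal red neighbors.
red_at_blue = (raw[0][0] + raw[0][2] + raw[2][0] + raw[2][2]) / 4.0
assert red_at_blue == 13.0
```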

 

So, we have darn imperfect information about the color of the light actually emitted by the target.  The above sounds pretty bad.  How can we possibly cope?

 

Generally a clever trick is used.   All the various colors can be represented to our eyes as some numerical combination of R, G, and B.  The way that's done is known as a "color space".  There are a number of those, and people hotly debate which are more "realistic".  A "white reference" is defined (more about the variability there later).  A calculation is made as to what factors to apply to the R, G, and B levels to make the white reference white.  Then those factors are applied throughout the image.  We assume that will make everything realistic.
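
The white-reference scaling step can be sketched like this (function names and measured values are hypothetical; real tools work in a defined color space and deal with background subtraction too):

```python
# Sketch of white-reference calibration: find per-channel factors that make
# the chosen white reference neutral, then apply them to the whole image.
def white_balance_factors(white_ref_rgb):
    """Scale each channel so the white reference comes out R = G = B."""
    target = max(white_ref_rgb)
    return tuple(target / c for c in white_ref_rgb)

def apply_factors(rgb, factors):
    return tuple(c * f for c, f in zip(rgb, factors))

white_ref = (0.5, 1.0, 0.25)        # R, G, B as measured in the image
factors = white_balance_factors(white_ref)
assert factors == (2.0, 1.0, 4.0)

# The reference itself becomes neutral; every other pixel gets the same
# factors applied, which is exactly the assumption that can break down.
assert apply_factors(white_ref, factors) == (1.0, 1.0, 1.0)
```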

 

That assumption is not terrible, but it's inherently flawed.  Various things mess up the process.  The factors will differ according to the specific circumstances in various parts of the image.   How much is a truly impossible calculation, so we can't really "fix" this.

 

Here's what PixInsight has to say about color calibration.  (I trust people agree they know something about this stuff?)

 

"Our approach originates from the fact that —in our opinion— the concept of real color makes no sense in deep-sky astrophotography. Real color doesn't exist in the deep sky because, on one hand, the objects represented in a deep-sky image are far beyond the capabilities of the human vision system, and on the other hand, the physical nature, properties and conditions of the deep-sky objects are very different from those of the subjects that can be acquired under normal daylight conditions."

 

And:

 

"The image(s) must be accurately calibrated <before color calibration>. In particular, illumination must be uniform for the whole corrected image and, if different images are used to define the background and/or white references, those images must also have uniform illumination throughout the entire field. This means that flat fielding must be correctly applied as part of the image calibration process, and any residual additive gradients must also be removed before attempting to perform a valid color calibration."

 

So, in order to do a "valid" color calibration you have to do perfect flats and perfect gradient reduction.  Volunteers?  <smile>

 

Side note.  White balance in terrestrial work is a variation on the theme.  Each white balance in the camera is a preprogrammed set of RGB ratios judged (approximately) to give proper color representation when the subject is lit by various forms of light, such as fluorescent tubes.  The reason white balance is not terribly useful in our work (a notable exception will be discussed later) is that there are better ways to color calibrate (i.e. set the ratios).

 

Choosing a white reference involves some subjectivity (more on that in a bit.)  In processing we sometimes change the color space we use (or our processing program does it behind the scenes), and those transformations introduce uncertainty in color.

 

Again, all this sounds bad.  The reason it's not that bad is that we can (and do) tweak the tools and their parameters.  The tweaks are not mathematically based on science and reality; they are based on making the images look better and more natural to our imperfect, individually variable eyes, working with an imperfect monitor.  So the process works.  Or at least it can work; skill (dare I say artistic skill? <smile>) counts for a lot.

 

The result is that you can make nice images with any of this equipment and any of these techniques.

 

A good example of tweaking the parameters is how we choose a white reference.  These vary widely.  PixInsight, a program intended to be as scientifically rigorous as possible, lists many of them in the well known PhotometricColorCalibration process.  Which one to use is a subjective choice, often debated by the very best imagers.

 

All this is intended to make our images congenial to our output devices, and our eyes.  Monitors vary wildly in their ability to reproduce color spaces.  The buzzword is "gamut".  Ink jet printers often use a CMYK color space, which is radically different from RGB.  Making printed images look good involves tweaking the parameters in processing a whole lot.
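
To see just how different the printer's space is, here is the standard textbook RGB-to-CMYK conversion (a naive formula for illustration; real printer pipelines use ICC profiles and measured ink behavior):

```python
# Textbook RGB -> CMYK conversion: printers build color from subtractive
# inks plus a separate black (K) channel, a radically different basis
# from the monitor's additive RGB.
def rgb_to_cmyk(r, g, b):
    """r, g, b in [0, 1]. Returns (c, m, y, k), also in [0, 1]."""
    k = 1.0 - max(r, g, b)
    if k == 1.0:                    # pure black: avoid division by zero
        return (0.0, 0.0, 0.0, 1.0)
    c = (1.0 - r - k) / (1.0 - k)
    m = (1.0 - g - k) / (1.0 - k)
    y = (1.0 - b - k) / (1.0 - k)
    return (c, m, y, k)

# Pure red on the monitor becomes a magenta + yellow ink mixture, no cyan:
assert rgb_to_cmyk(1.0, 0.0, 0.0) == (0.0, 1.0, 1.0, 0.0)
```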

 

And it all makes a hash of "reality".  (Not that we could ever claim to be "real" after destroying so much information).  The more you learn about this (this post just skims the surface, there's a "Color in Astrophotography" book waiting to be written), the more you realize how futile it is to chase reality.

 

At some point people will understandably ask: don't professional astronomers use color to get information about objects?  They do indeed; the field is called spectroscopy.  I've been a spectroscopist (terrestrial); my PhD thesis is on the subject.  It's informative to know what spectroscopists _don't_ do.  They don't use filters that inevitably destroy large amounts of information.  They use a spectrometer that parses color into quite small slices; often the narrower the bandpass of the spectrometer, the better.  They hoard information instead of destroying it.  They don't use their eyes, or even make pictures; they just use the numeric values of the data.  They don't do the things that make the concept of "reality" in our images dubious.

 

Three sidebars about DSLRs (and OSC cameras in general) deserve special note.  sharkmelley has discussed using a 3x3 matrix to better calibrate DSLR data.  While I don't consider it "scientific", I do think it's one way to make DSLR images better and more natural.  I don't know of a source for the matrices for astro cameras, but maybe the ones for DSLRs using the same chip would work.  Jerry Lodriguss sometimes images with a broadband light pollution filter on top of a Bayer matrix filter.  I shudder at that thought; to me it's like having a headache _and_ an upset stomach.  <smile>  But he has utilized a "custom white balance" procedure that is not hard to do, and can give quite good and natural images in that quite hostile environment.  I used to look at it with dubious eyes; now (having more knowledge about color) I think he has a point.
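
The 3x3 matrix idea itself is simple to sketch.  The matrix values below are invented for illustration (real correction matrices are measured for a specific sensor and illuminant):

```python
# Sketch of applying a 3x3 color-correction matrix to a camera's raw RGB:
# each output channel is a weighted mix of all three input channels, which
# lets the correction undo some of the overlap between the color filters.
def apply_color_matrix(rgb, m):
    return tuple(sum(m[row][col] * rgb[col] for col in range(3))
                 for row in range(3))

identity = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
# Hypothetical matrix boosting red and pulling out some green bleed:
example  = [[1.6, -0.4, -0.2],
            [-0.3, 1.5, -0.2],
            [-0.1, -0.4, 1.5]]

# The identity matrix leaves colors untouched:
assert apply_color_matrix((0.5, 0.4, 0.3), identity) == (0.5, 0.4, 0.3)
```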

 

Thing is, I doubt even 1% of the people here have tried either technique.

 

What people _are_ doing is putting dual- or tri-band filters on top of Bayer matrix filters.  These are becoming quite popular (although it should be noted that they only really work on emission nebulae).  But they make a lot of sense.  The filters have _very_ restrictive bandpasses.  Not narrowband-restrictive, but close.  If you block off that much of the spectrum, the issues of Bayer matrix filters become a whole lot less important.  Stacking filters is less efficient and requires more total imaging time, but the resulting images can be quite nice.

 

Welcome back.  I'll repeat: the more you learn about this, the more you realize how futile it is to chase reality.  Here are two very relevant quotes (paraphrased) from the superb "Lessons from the Masters: Current <2013> Concepts in Astronomical Image Processing".

 

"Color saturation is an invention of our images.  So saturate however much you like."

 

"When I started out I color processed obsessively.  I calibrated carefully to G2v stars <as a white balance>.  I was nervous, certain that one day the Color Police would break down my door, and confiscate the equipment.  These days I do what I like."

 

The phrase "pretty pictures" is often used in a derogatory sense.  I don't see it that way at all.  Color processing is always imprecise, so why not make the results pretty?  Or better.  Or more natural.  Whatever your priorities are.

 

It certainly is possible to make colors so unnatural that the resulting images could be called cartoons.  I don't like cartoons.  But I note that images from people who are successfully selling them on the Internet often trend in that direction.  At the very least, saturation will be strong.  Unless you differentiate yourself, how can you sell to the general market?

 

You generally want to make better images.  It is very useful to study methods, to solicit advice.  But know that you'll always get people who like their color better than yours.  Take criticisms of your color processing (especially those citing "reality") with an attitude of empowerment.  If you agree, and want to change what you do or did, fine.  If you don't, fine.  The Color Police can (and will) criticize your images, but they're not going to confiscate your equipment.  <smile>

 

The bad news here is that our color is not realistic.  The good news is, once you embrace that fact, you're free to make better or more natural images.  Or (oh, the horror! <smile>) to make pretty pictures.

 

Advice to beginners.  Do not tilt at the windmill of reality, or worry about it.  Better, more natural, or prettier is hard enough (which to emphasize, and how much, is a personal choice); don't waste time on the impossible.


Edited by bobzeq25, 03 December 2019 - 11:41 AM.

  • WebFoot, schmeah, H-Alfa and 11 others like this

#2 Chuck Hards

Chuck Hards

    You don't know Swift from Astrola

  • *****
  • Administrators
  • Posts: 25,817
  • Joined: 03 May 2010

Posted 03 December 2019 - 11:36 AM

I read once that if you suddenly found yourself on a planet in the middle of the Orion nebula, it would still appear rather dim and grey to the naked eye.  Covering the entire sky, but still dim and grey.  You might see some faint coloration in the denser (brighter) parts.

 

The great fallacy of time exposures and selective sensitivity.  It's not a Hubble universe out there, to the naked eye.


  • bobzeq25 likes this

#3 Madratter

Madratter

    Voyager 1

  • *****
  • Posts: 10,967
  • Joined: 14 Jan 2013

Posted 03 December 2019 - 11:49 AM

I'm already on record as agreeing with your major premise even if I might argue about some minor point.

 

Even using cameras with a Bayer matrix for terrestrial photography, you get some really wacky colors. They generally do all right on skin tones because they are tuned for that. But if you do any flower photography and pay attention to the results vs. the original, you get some rather interesting color shifts. LRGB filters would do no better.

 

I do have a nit about binning the color information. That is rather camera specific. For example on the KAF 8300 you will end up with rather unfortunate artifacts on bright stars any time you bin. It is because of this that I avoid it. In general the color information does change rather more slowly and you don't lose a lot of information from binning from that respect. And I'm sure that is what was really being talked about. But you do need to consider the specific chip and its foibles.



#4 bobzeq25

bobzeq25

    Hubble

  • *****
  • topic starter
  • Posts: 17,392
  • Joined: 27 Oct 2014

Posted 03 December 2019 - 11:51 AM

I'm already on record as agreeing with your major premise even if I might argue about some minor point.

 

Even using cameras with a Bayer matrix for terrestrial photography, you get some really wacky colors. They generally do all right on skin tones because they are tuned for that. But if you do any flower photography and pay attention to the results vs. the original, you get some rather interesting color shifts. LRGB filters would do no better.

 

I do have a nit about binning the color information. That is rather camera specific. For example on the KAF 8300 you will end up with rather unfortunate artifacts on bright stars any time you bin. It is because of this that I avoid it. In general the color information does change rather more slowly and you don't lose a lot of information from binning from that respect. And I'm sure that is what was really being talked about. But you do need to consider the specific chip and its foibles.

I totally agree with all of that.  I've glossed over a lot.

 

You'll get more air time in Part 2 - Practice.  <smile>


Edited by bobzeq25, 03 December 2019 - 11:51 AM.


#5 Starsareus

Starsareus

    Vostok 1

  • *****
  • Posts: 145
  • Joined: 27 Jul 2008

Posted 03 December 2019 - 12:29 PM

Well said bobzeq25. The "funny" part to me is our eyes don't actually "See" anything. They merely send electrical impulses to our "eyeless" brain to create an image. Do all brains looking at the same object "actually see" the same thing??


  • elmiko and bobzeq25 like this

#6 Madratter

Madratter

    Voyager 1

  • *****
  • Posts: 10,967
  • Joined: 14 Jan 2013

Posted 03 December 2019 - 12:36 PM

Well said bobzeq25. The "funny" part to me is our eyes don't actually "See" anything. They merely send electrical impulses to our "eyeless" brain to create an image. Do all brains looking at the same object "actually see" the same thing??

In the case of people who are color blind, clearly not.

 

In the general case, it is hard to separate what is "seen" from what is socialized. I do know that for me "school bus yellow" is very definitely an orange color, not yellow. I also know that my wife and I completely disagree on the boundaries between green and blue.



#7 BenKolt

BenKolt

    Vanguard

  • *****
  • Posts: 2,016
  • Joined: 13 Mar 2013

Posted 03 December 2019 - 01:00 PM

Bob:

 

Thank you for posting this.  I stuck with it the whole way through!  I plan to read through it again a bit more slowly this evening when I have more time, but at first pass I find myself in general agreement with you on most, if not all points.

 

My perception of astrophoto color processing has evolved through the same path you outline in your post.  I obsessed over it far too much at the beginning and insisted on obtaining "real" colors, particularly with my galaxy images.  (I remember despairing at not finding a single G2V star in the frame.  Why can't there be more of those!) But over time I realized my human eyes were incapable of perceiving that kind of bright color in those galaxies anyway.  I still over obsess on many aspects of my images, but no longer in chasing a "real" color balance.

 

I do have a nit about binning the color information. That is rather camera specific. For example on the KAF 8300 you will end up with rather unfortunate artifacts on bright stars any time you bin. It is because of this that I avoid it. In general the color information does change rather more slowly and you don't lose a lot of information from binning from that respect. And I'm sure that is what was really being talked about. But you do need to consider the specific chip and its foibles.

 

Madratter:

 

From my light polluted suburban skies I practice mostly LRGB imaging, and for the most part I am a proponent of binning the color frames in order to obtain "suitable" SNR more quickly, thus devoting more time to obtaining the L channel detail.  I'm more likely to bin 2x2 at smaller image scales (longer FL, smaller FOV), less so at larger image scales.  I've arrived at this practice mostly from the standpoint of time management and efficiency.  I have two cameras, a KAF-16200 and a KAF-8300, both usually employed at the same time on separate imaging rigs.

 

Could you please give more details on the artifacts you encounter with 2x2 binning the color frames with the KAF-8300?  It may well be that these artifacts are either not present in my images or are less obvious than the other noise issues that plague me such as light pollution, limited integration time, etc.  But I'd appreciate it if you could elaborate on this point.  Thank you!

 

Best Regards,

Ben

 

 



#8 AhBok

AhBok

    Vanguard

  • *****
  • Posts: 2,393
  • Joined: 02 Dec 2010
  • Loc: Lakeland, TN

Posted 03 December 2019 - 02:01 PM

I am old fashioned. During the day I view a panorama and take a snapshot with my OSC. The colors in the image appear the same as what my eyes see. I figure my OSC is not too encumbered by my camera’s Bayer matrix. At night, I take an image with my OSC and compare the colors with excellent photos taken years ago with film. The colors look similar and there were no filters involved. I figure my OSC colors must be close to realistic. Arguments like whether or not the colors one sees are different than others, etc., I mostly ignore, but I do like to stay abreast of my ignorance. How is my simple logic wrong? That is, why do images from film, OSC cameras and my eyes appear very similar?
  • jdupton, sharkmelley, bobzeq25 and 1 other like this

#9 Gipht

Gipht

    Vanguard

  • *****
  • Posts: 2,011
  • Joined: 12 Nov 2016
  • Loc: Prescott Valley, AZ.

Posted 03 December 2019 - 02:07 PM

With our hobby, as complicated as it is, we all evolve at  least in recognizing the complexity of what we are doing.  When a newcomer publishes a picture,  I am often amazed at how good the results are.  How could this happen, such a good result?  Don't they know how hard this is?   Is this as simple as just taking several pictures, doing some stacking, and adding some saturation?

 

In service to our cause we tinker with our equipment, split the finest hair getting the best focus, fret over our exposure times, and work ourselves further and further into debt.  Now you're telling us the colors we end up with aren't really there, and not to worry about it.  Did my blood pressure just go down? <wink>

 

Thanks for the very informative article Bobzeq25.  We need all the help we can get.


  • bobzeq25 likes this

#10 bobzeq25

bobzeq25

    Hubble

  • *****
  • topic starter
  • Posts: 17,392
  • Joined: 27 Oct 2014

Posted 03 December 2019 - 02:24 PM

In the case of people who are color blind, clearly not.

Here's the thing about that.  Color blind is not a binary, yes/no thing.  There are degrees and different manifestations.


  • epdreher likes this

#11 bobzeq25

bobzeq25

    Hubble

  • *****
  • topic starter
  • Posts: 17,392
  • Joined: 27 Oct 2014

Posted 03 December 2019 - 02:25 PM

I am old fashioned. During the day I view a panorama and take a snapshot with my OSC. The colors in the image appear the same as what my eyes see. I figure my OSC is not too encumbered by my camera’s Bayer matrix. At night, I take an image with my OSC and compare the colors with excellent photos taken years ago with film. The colors look similar and there were no filters involved. I figure my OSC colors must be close to realistic. Arguments like whether or not the colors one sees are different than others, etc., I mostly ignore, but I do like to stay abreast of my ignorance. How is my simple logic wrong? That is, why do images from film, OSC cameras and my eyes appear very similar?

You're not wrong.  The various techniques have different pros and cons.  Understanding them can make equipment choice easier.

 

But all can do good images.



#12 klaussius

klaussius

    Vostok 1

  • -----
  • Posts: 101
  • Joined: 02 Apr 2019
  • Loc: Buenos Aires

Posted 03 December 2019 - 02:32 PM

Oh, the colors are there. It's just that we are unable to perceive them unaided, so worrying over how "realistic" they are is a bit of an oxymoron since the only thing that would match a natural look would be a dim, colorless picture.

We don't want dim, colorless pictures, so we have to accept the colors are not related to what we could possibly perceive.

 

But I don't like it when people say "all colors are false since we can't see them in dim DSOs". That's disingenuous, IMHO. If we had magically efficient eyes, we might see what the camera is seeing. In fact, the camera is that magically efficient eye.

So they're real. They're approximations of the mix of wavelengths being captured from that part of the sky. I think when people aim to be "realistic", it's with that in mind. They don't want a painting, something completely artificial; they want a photograph. They want the image to correlate to actual data representing physical photons as faithfully as possible, emitted by physical objects or processes really occurring there in space, within some constraints. And when we take artistic license, we want it to be conscious and not accidental.

 

The fact that our eyes can't see that is irrelevant. What you see in that picture with the camera's aid is there. It's a real thing you can't see, but that you can detect. And in fact you have just detected it.

 

So, think about extremes. If you get a picture that tells you all stars are white, there's something wrong with that picture. If you see a red star, it should be a red star. If not, there's something wrong with that picture. In that sense, color calibration is important. It's so easy to get things wrong there, but the exact hue of red is less important, since you're not doing photometry.

 

So, yes, it's subjective. But that doesn't mean it's not real.


  • jdupton, sharkmelley, Jon Rista and 4 others like this

#13 Chuck Hards

Chuck Hards

    You don't know Swift from Astrola

  • *****
  • Administrators
  • Posts: 25,817
  • Joined: 03 May 2010

Posted 03 December 2019 - 02:41 PM

I am sure that if we showed a photo that we considered color-faithful to an alien being who didn't evolve under a yellow sun, they would laugh at us and say the color is completely wrong.

 

Besides color-blindness, some people have four color receptor cones, not three.  It has been documented that they see colors that we can't, and the colors we agree on, they don't.

 

Color perception is relative.  The best we can do is balance for the people who are clustered close to the center of the bell curve.



#14 AhBok

AhBok

    Vanguard

  • *****
  • Posts: 2,393
  • Joined: 02 Dec 2010
  • Loc: Lakeland, TN

Posted 03 December 2019 - 03:24 PM

Well, as we all know, aliens have x-ray vision so colors are pretty much irrelevant to them!

#15 WadeH237

WadeH237

    Aurora

  • *****
  • Posts: 4,953
  • Joined: 24 Feb 2007
  • Loc: Snohomish, WA

Posted 03 December 2019 - 03:26 PM

Here's the thing about that.  Color blind is not a binary, yes/no thing.  There are degrees and different manifestations.

Not only that, but there are some people (mostly women) who have four primary color receptors, not three.  This is called tetrachromacy.


  • epdreher and bobzeq25 like this

#16 Madratter

Madratter

    Voyager 1

  • *****
  • Posts: 10,967
  • Joined: 14 Jan 2013

Posted 03 December 2019 - 03:35 PM

Madratter:

 

From my light polluted suburban skies I practice mostly LRGB imaging, and for the most part I am a proponent of binning the color frames in order to obtain "suitable" SNR more quickly, thus devoting more time to obtaining the L channel detail.  I'm more likely to bin 2x2 at smaller image scales (longer FL, smaller FOV), less so at larger image scales.  I've arrived at this practice mostly from the standpoint of time management and efficiency.  I have two cameras, a KAF-16200 and a KAF-8300, both usually employed at the same time on separate imaging rigs.

 

Could you please give more details on the artifacts you encounter with 2x2 binning the color frames with the KAF-8300?  It may well be that these artifacts are either not present in my images or are less obvious than the other noise issues that plague me such as light pollution, limited integration time, etc.  But I'd appreciate it if you could elaborate on this point.  Thank you!

 

Best Regards,

Ben

 

I don't want to hijack this thread so I started a new one.

 

https://www.cloudyni...kaf8300-sensor/



#17 bobzeq25

bobzeq25

    Hubble

  • *****
  • topic starter
  • Posts: 17,392
  • Joined: 27 Oct 2014

Posted 03 December 2019 - 03:59 PM

I am sure that if we showed a photo that we considered color-faithful to an alien being who didn't evolve under a yellow sun, they would laugh at us and say the color is completely wrong.

 

Besides color-blindness, some people have four color receptor cones, not three.  It has been documented that they see colors that we can't, and the colors we agree on, they don't.

 

Color perception is relative.  The best we can do is balance for the people who are clustered close to the center of the bell curve.

Aliens?  PixInsight is the answer, as usual.  PhotometricColorCalibration can be set to use a large number of star types as the white reference.  So, aliens?  No prob.



#18 sharkmelley

sharkmelley

    Mercury-Atlas

  • *****
  • Posts: 2,528
  • Joined: 19 Feb 2013

Posted 03 December 2019 - 04:21 PM

Beginning imagers often wonder about whether color in our astrophotos is "realistic". 

 

<<CUT LONG ARTICLE>>

Wow - an impressively long post summarising your thoughts.  I appreciate the work you put into it. 

 

I agree with a lot of what you said but predictably I disagree with your thoughts on DSLRs and OSC cameras.  A DSLR is a great tool for producing coloured images of the night sky.  We trust it for reproducing daytime colours so why not for night-time colours also?  I'm not going to dissect your arguments point by point but for those who are sufficiently interested to understand a bit more about the Bayer matrix filters and how they manage the trick of reproducing colour (and that 3x3 matrix) then read an earlier post of mine here:

https://www.cloudyni...sity/?p=9792791
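For readers wanting a concrete picture of what that 3x3 matrix does: the Bayer filter passbands overlap, so a color correction matrix is used to un-mix the camera's raw channel responses into a standard color space.  A minimal sketch follows; the coefficients are invented for illustration and are not from any real camera or from the linked post:

```python
# Sketch of applying a 3x3 color correction matrix (CCM) to camera-raw RGB.
# Raw "red" contains some green light (and so on) because Bayer passbands
# overlap; the CCM un-mixes them.  Coefficients invented for illustration.

CCM = [
    [ 1.6, -0.4, -0.2],   # output R = mix of raw R, G, B
    [-0.3,  1.5, -0.2],   # output G
    [ 0.0, -0.5,  1.5],   # output B
]
# Each row sums to 1.0, so neutral gray stays neutral.

def apply_ccm(raw_rgb, m=CCM):
    return tuple(sum(m[i][j] * raw_rgb[j] for j in range(3)) for i in range(3))

print(apply_ccm((0.5, 0.5, 0.5)))  # ~(0.5, 0.5, 0.5): gray in, gray out
```

The negative off-diagonal terms are what restore saturation lost to the overlapping filters, and they are also why a CCM amplifies chroma noise.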

 

Mark


  • jdupton, AhBok, Jon Rista and 3 others like this

#19 t_image

t_image

    Gemini

  • -----
  • Posts: 3,338
  • Joined: 22 Jul 2015

Posted 03 December 2019 - 07:57 PM

.........Color processing is always imprecise................

Not like this discussion has been had before.

Nor have you been shy about sharing your ideology.

Sure, your post has some basic factual points.

It is also contaminated with lots of opinion, imprecision and faulty reasoning.

 

Reading up on some basics on the internet doesn't make you a subject matter expert on color science,

and quoting only sources that say what you want them to say, as you often do, doesn't make it fact.

 

The OP again commits the same logical fallacies post-modern deconstructionists love to offer (if it's not 100% then there's no trustworthiness at all and it's "all subjective").

 

Your failure to again address (you've never responded to) the challenge

of repeatability also demonstrates your affinity for the card-stacking, selective-attention fallacy in reasoning.

 

Again, people are free to do what they want, but it's ironic that you are so passionate in convincing people there is a definitive certainty of unreliability.

Are you absolutely certain of this?

 

If color processing were as imprecise as you presume,

there would be no point in skilled photographers doing product photography and art archiving for museums.

 

There's a lot of work, effort, and tradecraft you are clearly ignorant about where color imaging does have precision and can demonstrably give

repeatable

and predictable results,

and photons coming from space don't operate with different laws of physics.

 

If you stopped at the philosophical question of "what is the realistic nature of a non-terrestrial object"

then your deconstruction could be valid,

but why not then go further and argue

that we also can't present an object's true:

  • shape
  • 3 dimensions
  • size

in an image

 

I don't know why we fixate on color when we are "crippled" in representing much of anything properly in so many other aspects.

How about smell or taste?

 

But those who pursue AP still make effort, whatever one's particular requirements.

 

Maybe those who want to pursue matters in a more disciplined way are willing to make the effort.

It's really easy to deconstruct things.


Edited by t_image, 03 December 2019 - 07:58 PM.

  • sharkmelley, Jon Rista and Glass Eye like this

#20 WebFoot

WebFoot

    Surveyor 1

  • *****
  • Posts: 1,595
  • Joined: 02 Jun 2005
  • Loc: Redmond, WA, USA

Posted 03 December 2019 - 08:11 PM

Nicely said, OP.  I know that this is a religious matter to some, but, by and large, we're all making "pretty pictures," and nobody's "pretty picture" is more "realistic" than anyone else's.  Yes, even DSLRs show us an image that bears very little resemblance to what it would look like if we were a few light years away.


Edited by WebFoot, 03 December 2019 - 08:22 PM.


#21 sharkmelley

sharkmelley

    Mercury-Atlas

  • *****
  • Posts: 2,528
  • Joined: 19 Feb 2013

Posted 03 December 2019 - 11:30 PM

Nicely said, OP.  I know that this is a religious matter to some, but, by and large, we're all making "pretty pictures," and nobody's "pretty picture" is more "realistic" than anyone else's.  Yes, even DSLRs show us an image that bears very little resemblance to what it would look like if we were a few light years away.

Does your objection extend to all DSLR long exposure photography at night e.g. night-time landscapes? 

Do you argue that such photos are unrealistic because the image bears little resemblance to what the photographer was able to see at the time?

If so, then at least I respect your consistency even though I consider it a dogmatic position to take.

 

Mark


Edited by sharkmelley, 03 December 2019 - 11:33 PM.


#22 AhBok

AhBok

    Vanguard

  • *****
  • Posts: 2,393
  • Joined: 02 Dec 2010
  • Loc: Lakeland, TN

Posted 03 December 2019 - 11:46 PM

Actually some images are more realistic than others. I’ve seen a lot of images with mostly green stars. I know some tend towards surrealism in their images, which is fine, but does stretch the bounds of reality.

#23 bobzeq25

bobzeq25

    Hubble

  • *****
  • topic starter
  • Posts: 17,392
  • Joined: 27 Oct 2014

Posted 03 December 2019 - 11:48 PM

Not like this discussion has been had before.

<<CUT LONG QUOTE - see post #19 above>>

The topic was color, not other attributes.  That alone required quite a long post.

 

Comparisons to what photographers do with color in terrestrial photography are not relevant.  Terrestrial color processing, while it may be done very thoughtfully, is fundamentally easier than astro.  One major difference is that the lighting is far more consistent on Earth.  Another is that the dynamic range is substantially larger in astro.  The individual photons are the same, but the collective properties of the group are quite different.

 

We all manipulate color in a much more serious way in astro than anyone does in terrestrial work.  The simple white balance becomes a much more complicated "color calibration".  We boost saturation to a degree larger than we would in terrestrial work, because it is necessary to stretch the histogram to render these very dim subjects visible to our eyes, and stretching inevitably reduces saturation.  Math.
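That saturation loss is easy to demonstrate numerically.  A small sketch, with illustrative pixel values: applying the same strong nonlinear stretch to every channel pulls the channel ratios toward each other, i.e. toward gray.

```python
# Why stretching reduces saturation: a strong nonlinear stretch compresses
# the ratios between channels, moving colors toward gray.  Values illustrative.

def stretch(v, gamma=0.25):
    return v ** gamma   # an aggressive stretch, as used on dim astro data

def saturation(rgb):
    """HSV-style saturation: (max - min) / max."""
    return (max(rgb) - min(rgb)) / max(rgb)

dim_red = (0.04, 0.01, 0.01)                 # a faint, strongly red pixel
stretched = tuple(stretch(v) for v in dim_red)

print(round(saturation(dim_red), 2))    # 0.75 before the stretch
print(round(saturation(stretched), 2))  # 0.29 after: much less saturated
```

This is why a saturation boost after stretching is routine: it is putting back (approximately) what the stretch mathematically removed.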


Edited by bobzeq25, 03 December 2019 - 11:53 PM.


#24 bobzeq25

bobzeq25

    Hubble

  • *****
  • topic starter
  • Posts: 17,392
  • Joined: 27 Oct 2014

Posted 03 December 2019 - 11:56 PM

Nicely said, OP.  I know that this is a religious matter to some, but, by and large, we're all making "pretty pictures," and nobody's "pretty picture" is more "realistic" than anyone else's.  Yes, even DSLRs show us an image that bears very little resemblance to what it would look like if we were a few light years away.

I slightly disagree.  Some images are just cartoons, and definitely unrealistic.  I do think there is a ballpark within which comparisons of "more realistic" are pointless.



#25 WebFoot

WebFoot

    Surveyor 1

  • *****
  • Posts: 1,595
  • Joined: 02 Jun 2005
  • Loc: Redmond, WA, USA

Posted 04 December 2019 - 12:00 AM

I slightly disagree.  Some images are just cartoons, and definitely unrealistic.  I do think there is a ballpark within which comparisons of "more realistic" are pointless.

I'll buy that one can chase an artistic idea down a rathole, and come up with an absurd "interpretation" of an object.  So I'll amend to say that all skilled, thoughtful presentations of heavenly objects pretty much are due the same "respect" as far as color/saturation/contrast are concerned.


  • bobzeq25 likes this

