One Shot Color, Interpolation, & Pixel Scale

9 replies to this topic

#1 SXBB


    Explorer 1

  • -----
  • topic starter
  • Posts: 97
  • Joined: 05 Jul 2015

Posted 18 May 2019 - 12:28 AM

I understand how a one-shot color camera ends up with effectively lower resolution than an equivalent monochrome camera due to the Bayer matrix. The total number of pixels is the same after interpolation, but does the interpolation have a similar effect to binning? I'm wondering whether it should somehow factor into calculating the pixel scale for a given OSC/scope vs. mono/scope combination?

 

Thanks,

 

Bruce

 

 



#2 kathyastro


    Mariner 2

  • *****
  • Posts: 251
  • Joined: 23 Dec 2016
  • Loc: Nova Scotia

Posted 18 May 2019 - 09:26 AM

It is true that the Bayer matrix reduces your effective resolution. I couldn't tell you exactly how large the reduction is without diving deeper into the math than I really want to. But I don't think the reduction is as much as with 2x2 binning.

 

Each pixel contains 33% real data and 67% synthesized data, created by interpolating from surrounding pixels. The synthesized data is not wrong; in fact, it is probably quite close to what the actual data would have been. It just isn't exact, because a measurement wasn't made for two of the three colour channels.
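To make the "synthesized data" idea concrete, here is a minimal Python/numpy sketch (not any camera's or any particular program's actual algorithm) of the simplest bilinear fill for the green channel of an assumed RGGB mosaic. Production debayering methods such as VNG or AHD are more sophisticated, but the principle is the same: at red and blue sites the green value is estimated from neighbours rather than measured.

import numpy as np

def bilinear_green(mosaic):
    """Fill in the green channel of an assumed RGGB mosaic by averaging neighbours."""
    h, w = mosaic.shape
    green = np.zeros((h, w))
    # Copy the measured green samples (the two G sites of each 2x2 cell).
    green[0::2, 1::2] = mosaic[0::2, 1::2]
    green[1::2, 0::2] = mosaic[1::2, 0::2]
    # At R and B sites, green was never measured: average the four neighbours,
    # all of which are G sites in an RGGB pattern.
    padded = np.pad(green, 1, mode="edge")
    est = (padded[:-2, 1:-1] + padded[2:, 1:-1] +
           padded[1:-1, :-2] + padded[1:-1, 2:]) / 4.0
    measured = np.zeros((h, w), dtype=bool)
    measured[0::2, 1::2] = True
    measured[1::2, 0::2] = True
    # Keep measured values where they exist, use the estimate elsewhere.
    return np.where(measured, green, est)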

 

It is not wrong to report your image scale, in arcseconds per pixel, based on the actual pixel size on your sensor. It just won't be quite as clear as an image from the same sensor without the Bayer matrix.

 

With 2x2 binning, your pixels are effectively twice the size they would have been at 1x1, so you would have to report your image scale according to the larger pixels.
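For reference, the usual image-scale rule of thumb written as a small Python function; the 3.8 µm / 800 mm figures below are just made-up illustrative values.

def pixel_scale_arcsec(pixel_size_um, focal_length_mm, binning=1):
    # Image scale in arcsec/pixel: 206.265 * pixel size (um) / focal length (mm).
    # Debayering does not change this figure (the photosites keep their physical
    # size), whereas 2x2 binning doubles the effective pixel size.
    return 206.265 * pixel_size_um * binning / focal_length_mm

# Hypothetical example: a 3.8 um OSC sensor on an 800 mm focal length scope
print(pixel_scale_arcsec(3.8, 800))             # ~0.98"/px, debayered or not
print(pixel_scale_arcsec(3.8, 800, binning=2))  # ~1.96"/px with 2x2 binning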


  • psandelle, jdupton and bobzeq25 like this

#3 SXBB


    Explorer 1

  • -----
  • topic starter
  • Posts: 97
  • Joined: 05 Jul 2015

Posted 18 May 2019 - 09:35 AM

Hi Kathy,

That’s a very good explanation!

Thanks So Much,

Bruce

#4 freestar8n


    Vendor - MetaGuide

  • *****
  • Vendors
  • Posts: 8376
  • Joined: 12 Oct 2007

Posted 18 May 2019 - 04:58 PM



SXBB, on 18 May 2019 - 12:28 AM, said:

I understand how a one-shot color camera ends up with effectively lower resolution than an equivalent monochrome camera due to the Bayer matrix. The total number of pixels is the same after interpolation, but does the interpolation have a similar effect to binning? I'm wondering whether it should somehow factor into calculating the pixel scale for a given OSC/scope vs. mono/scope combination?

 

Thanks,

 

Bruce

The standard way to process OSC images is to Bayer-interpolate each exposure to make it RGB, then align and stack the frames. In Bayer interpolation, a number of neighboring pixels of different colors combine information to determine the color at a given pixel, which inherently involves some blurring and loss of detail.

 

But some software nowadays allows a form of "Bayer drizzle" - where the individual pixels are mapped directly into separate color channels - and no Bayer interpolation is involved.  The exposures are aligned - and all the red pixels get added into a red frame, green into green, and blue into blue.
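As a rough Python sketch of that idea, assuming an RGGB pattern and, for simplicity, whole-pixel dither offsets; real CFA-drizzle implementations (e.g. in PixInsight) handle sub-pixel offsets, drop size, and weighting.

import numpy as np

def cfa_drizzle(frames, offsets, shape):
    """Deposit each measured sample directly into its own colour channel; no interpolation."""
    sums = np.zeros((3, *shape))     # accumulated R, G, B signal
    weights = np.zeros((3, *shape))  # how many samples landed on each output pixel

    # Which channel each photosite belongs to in an RGGB mosaic (0=R, 1=G, 2=B).
    channel = np.zeros(frames[0].shape, dtype=int)
    channel[0::2, 1::2] = 1
    channel[1::2, 0::2] = 1
    channel[1::2, 1::2] = 2

    for frame, (dy, dx) in zip(frames, offsets):
        ys, xs = np.indices(frame.shape)
        ys, xs = ys + dy, xs + dx                      # align this exposure
        ok = (ys >= 0) & (ys < shape[0]) & (xs >= 0) & (xs < shape[1])
        np.add.at(sums, (channel[ok], ys[ok], xs[ok]), frame[ok])
        np.add.at(weights, (channel[ok], ys[ok], xs[ok]), 1)

    return sums / np.maximum(weights, 1)  # pixels no sample landed on stay 0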

 

If the original exposures were dithered somewhat, you will end up with full detail in each of the channels - just like mono. There is only a slight loss of efficiency compared to mono, due to the doubling of green pixels in a Bayer matrix.

 

Here are some examples using Hyperstar:

 

[attached image]

 

And here is an example using OSC with an Ha filter, drizzling only the red pixels. Again you end up with full resolution and no loss of detail even though it is OSC. But for Ha imaging it is less efficient, because only 1 of 4 pixels is gathering light for the image.

 

[attached image]
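A back-of-envelope check on that efficiency point, ignoring QE and read-noise differences, which this simple ratio does not capture:

# With an RGGB mosaic only the red sites see the Ha line, so an OSC camera
# collects roughly 1/4 of the Ha signal a mono camera would in the same time.
red_sites_per_cell = 1
sites_per_cell = 4
collection_fraction = red_sites_per_cell / sites_per_cell
print(collection_fraction)      # 0.25
print(1 / collection_fraction)  # ~4x the integration time for comparable signal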

 

Frank


  • AhBok likes this

#5 SXBB


    Explorer 1

  • -----
  • topic starter
  • Posts: 97
  • Joined: 05 Jul 2015

Posted 18 May 2019 - 08:45 PM

Hi Frank,

 

Great tip!  I found a PI Bayer Drizzle workflow & I'll try it out.

 

Thanks,

 

Bruce



#6 jhayes_tucson


    Fly Me to the Moon

  • *****
  • Posts: 6782
  • Joined: 26 Aug 2012
  • Loc: Bend, OR

Posted 19 May 2019 - 12:18 AM

kathyastro, on 18 May 2019 - 09:26 AM, said:

It is true that the Bayer matrix reduces your effective resolution. I couldn't tell you exactly how large the reduction is without diving deeper into the math than I really want to. But I don't think the reduction is as much as with 2x2 binning.

Each pixel contains 33% real data and 67% synthesized data, created by interpolating from surrounding pixels. The synthesized data is not wrong; in fact, it is probably quite close to what the actual data would have been. It just isn't exact, because a measurement wasn't made for two of the three colour channels.

It is not wrong to report your image scale, in arcseconds per pixel, based on the actual pixel size on your sensor. It just won't be quite as clear as an image from the same sensor without the Bayer matrix.

With 2x2 binning, your pixels are effectively twice the size they would have been at 1x1, so you would have to report your image scale according to the larger pixels.

 

I don't know what you mean that each pixel contains "33% real data and 67% synthesized data."  The RGB color pixels in the raw data all contain "real data."  The conversion between raw data from a Bayer array and straight RGB data is done by simply convolving a 2x2 kernel with the raw data.  That produces an output with the same number of pixels as the input--and all of that information represents "real RGB data" as well.  That means that an OSC sensor samples the image at the same rate as a monochrome sensor (as you've said); however, the convolution process will always reduce the resolution of the sensor by blurring sharp features.  This reduces the high-frequency MTF response of an OSC sensor with respect to a monochrome sensor.
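A small illustration of that frequency-domain argument, using the generic formula for an n-pixel box average rather than the exact kernel of any particular debayer routine:

import numpy as np

def box_mtf(freq_cycles_per_pixel, n=2):
    # MTF of an n-pixel box average: |sin(pi*f*n) / (n*sin(pi*f))|.
    # It is 1 at DC and, for n = 2, falls to zero at the Nyquist frequency.
    f = np.asarray(freq_cycles_per_pixel, dtype=float)
    num = np.sin(np.pi * f * n)
    den = n * np.sin(np.pi * f)
    return np.abs(np.divide(num, den, out=np.ones_like(f), where=den != 0))

f = np.linspace(0, 0.5, 6)          # spatial frequency up to Nyquist (0.5 cycles/pixel)
print(np.round(box_mtf(f, 2), 3))   # [1.0, 0.951, 0.809, 0.588, 0.309, 0.0]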

 

John


  • roofkid likes this

#7 RedLionNJ


    Gemini

  • *****
  • Posts: 3302
  • Joined: 29 Dec 2009
  • Loc: Red Lion, NJ, USA

Posted 19 May 2019 - 07:02 PM

kathyastro, on 18 May 2019 - 09:26 AM, said:

It is true that the Bayer matrix reduces your effective resolution. I couldn't tell you exactly how large the reduction is without diving deeper into the math than I really want to. But I don't think the reduction is as much as with 2x2 binning.

Each pixel contains 33% real data and 67% synthesized data, created by interpolating from surrounding pixels. The synthesized data is not wrong; in fact, it is probably quite close to what the actual data would have been. It just isn't exact, because a measurement wasn't made for two of the three colour channels.

It is not wrong to report your image scale, in arcseconds per pixel, based on the actual pixel size on your sensor. It just won't be quite as clear as an image from the same sensor without the Bayer matrix.

With 2x2 binning, your pixels are effectively twice the size they would have been at 1x1, so you would have to report your image scale according to the larger pixels.

This is total BS.

 

Good software "shuffles" the pixels when aligning and stacking.



#8 kathyastro


    Mariner 2

  • *****
  • Posts: 251
  • Joined: 23 Dec 2016
  • Loc: Nova Scotia

Posted 19 May 2019 - 07:45 PM

RedLionNJ, on 19 May 2019 - 07:02 PM, said:

This is total BS.

 

Good software "shuffles" the pixels when aligning and stacking.

I am aware of how software processes the colour data.  You are missing the point that only one data point is measured in the camera for each pixel, whereas three data points are reported for each pixel in the debayered image.  Yes, the other two points came from legitimate data from surrounding pixels, processed in legitimate ways.  Nevertheless, they are synthesized, not measured at the reported pixel.  The software is reporting three times as much data as was measured.
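A quick count behind that ratio, using a hypothetical 4000 x 3000 sensor:

width, height = 4000, 3000
measured_samples = width * height         # one value per photosite: 12,000,000
reported_values = 3 * width * height      # R, G and B at every pixel: 36,000,000
print(measured_samples / reported_values) # 0.333... -> 1/3 measured, 2/3 interpolated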


Edited by kathyastro, 19 May 2019 - 07:45 PM.


#9 freestar8n


    Vendor - MetaGuide

  • *****
  • Vendors
  • Posts: 8376
  • Joined: 12 Oct 2007

Posted 19 May 2019 - 09:09 PM

kathyastro, on 19 May 2019 - 07:45 PM, said:

I am aware of how software processes the colour data. You are missing the point that only one data point is measured in the camera for each pixel, whereas three data points are reported for each pixel in the debayered image. Yes, the other two points came from legitimate data from surrounding pixels, processed in legitimate ways. Nevertheless, they are synthesized, not measured at the reported pixel. The software is reporting three times as much data as was measured.


Hi Kathy.

I agree with what you are saying and how important it is to distinguish an actual red measurement at a red pixel from the G and B values that are conjured up by looking at neighboring pixels. And that conjuring may involve many more pixels than just the immediate neighbors, since there are multiple approaches to debayering, each with its own artifacts.

At the same time, you can use an OSC camera and avoid the issue entirely with Bayer drizzle. That ends up no different from mono, and all the values are actual measurements.

Frank
  • kathyastro likes this

#10 bulrichl


    Vostok 1

  • -----
  • Posts: 109
  • Joined: 27 May 2018
  • Loc: La Palma (Canary Islands)

Posted 20 May 2019 - 04:26 AM

That's right. Juan Conejero strongly recommends using CFA drizzle instead of Bayer interpolation for OSC cameras: https://pixinsight.c...g81536#msg81536

 

See also https://pixinsight.c...g81718#msg81718

 

Bernd



