
Monochrome vs One-Shot-Color – By The Numbers Please

astrophotography CMOS ccd
130 replies to this topic

#1 jdupton

jdupton

    Fly Me to the Moon

  • *****
  • topic starter
  • Posts: 5,751
  • Joined: 21 Nov 2010
  • Loc: Central Texas, USA

Posted 01 November 2019 - 11:30 PM

Hi Fellow Imagers,

 

   This post will be another marathon-length explanation of some work I have done to attempt to dispel a lot of what I consider misconceptions in the ongoing arguments over “which is better – Mono or OSC.” I’ll start with a heads-up: there is no clear winner in all scenarios. It all depends on your purposes and, surprisingly, on how you shoot with each camera type. In many cases, Monochrome plus filters will be the winner far and away. However, it may come as a surprise to some that OSC has some traits that lead it to outperform the Monochrome approach in some scenarios.

 

   I am going to try my best to avoid the typical “hand waving” posts that characterize this ever-ongoing battle of opinions. Instead, I will present here the method, logic, and tools I used to look at the question from many different angles. I am presenting this information in a step-by-step manner, much like a tutorial, so that hopefully all can follow my reasoning.

 

   This analysis was started several months back in the Cloudy Nights thread “Dedicated LRGB v. OSC Filter Transmission”. It recently resurfaced in the ongoing thread “Matching a Camera to my Scope.”

 

   Feel free to point out any logic errors I have made or to question the assumptions that underlie my methodology. It is quite possible I have overlooked some key factors that come into play. There may also be bugs or errors in the spreadsheet I have built to do all the calculations. If you spot one, please describe it concretely enough that I can change the underlying mathematical operations of the spreadsheet tool used for the computations.

 

Assumptions:

 

The following are key assumptions in my calculations:

  • The sky background is spectrally neutral. The sky is some shade of pure gray / black / white.
    For the portion of the spectrum from 400 nm to 700 nm, an equal number of photons (rate) fall on the telescope aperture during each unit of time regardless of wavelength. This may not be completely valid under significant (colored) light pollution but represents what I think is a reasonable starting point. An arbitrary set of units is chosen so that the measurements are given in photons per second per square arc-second of sky per nanometer of wavelength. These can be later scaled if desired. I ignore any scaling since the choice of photon rate was arbitrary.
     
  • The wavelength response curves published by sensor and filter vendors are reasonably accurate.
    I have digitized the sensor response (QE) curves for a number of different cameras for inclusion in the spreadsheet. I have also done the same for a variety of filters commonly used in astrophotography. In both cases, the spreadsheet is structured in a way that it is easy to add more sensor and filter data. My digitization of the plots is subject to the underlying accuracy of those published graphical representations.
     
  • All of my evaluations use a “Quad” or SuperPixel group of pixels so that direct comparisons between the two camera types can be made more easily. For an OSC camera, I look at one red, two green, and one blue pixel typical of RGGB Bayer Matrix cameras. This Quad is the same as would be used for a “true color” binned SuperPixel DeBayer operation without color interpolation. For Monochrome sensors, I look at a two by two group of pixels (without Bayer filters of course.) This allows for direct comparison between the two sensor types by looking at photons captured by the respective Quads on each sensor.

Methodology:

 

   Here is the general methodology used by the spreadsheet (a short code sketch of the core calculation follows this list):

  • Grab the response curve for the selected sensor. This is the same as an Absolute QE curve published for the sensor. The data is digitized in 5 nm increments between 400 nm and 700 nm.
     
  • Grab the response (transmission) curve for the filter selected for evaluation. For a filter, this is the transmission response curve published by the vendor.
     
  • At each 5 nm wavelength increment, multiply the sensor’s response at that wavelength by the filter’s response at that same wavelength. Repeat this operation for the full wavelength band-pass under consideration. (This is normally the full range of 400 nm to 700 nm although the spreadsheet can be forced to consider a subset.)
     
  • Plot the resulting product of sensor response times filter transmission for each 5 nm wavelength increment.
     
  • Calculate the accumulated response for each pixel in the Quad. Since I have assumed that each pixel gets 1 photon for each wavelength emitted by the sky, the pixel accumulates 5 photons from the sky for each 5 nm wavelength from 400 nm to 700 nm. This is, of course, modulated by the combined response calculated in the previous step.

    For a Monochrome camera, each pixel in the Quad gets the same number of photons since the filter is external to the camera. For an OSC camera, each pixel gets a different number of photons determined by the response of that respective pixel’s Bayer Matrix transmission.

    In both cases, once the pixel’s photons are counted on a per-wavelength increment basis, they are then summed (numerically integrated) for all wavelengths under consideration.
     
  • Finally the pass-band total photon counts are displayed in the spreadsheet. This area allows me to start the evaluation of how each sensor type responded to the light which entered the telescope aperture.
     
  • To do the comparisons, I use a separate external “scratch” spreadsheet. I select the scenario in the main calculation spreadsheet and copy the results to the separate external sheet. I then run additional calculations and again copy the results to the external sheet. Finally, I am able to make the actual comparison of sensors in the external sheet. I will show an example of this later here.
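   For those who prefer code to prose, the core of the steps above boils down to something like the following Python/NumPy sketch. This is not the spreadsheet itself; the two response curves are placeholders standing in for the digitized vendor data:

import numpy as np

wl = np.arange(400, 705, 5)                 # 5 nm wavelength grid, 400-700 nm

# Placeholder curves (fractions 0..1); real digitized vendor data goes here
sensor_qe = 0.8 * np.exp(-((wl - 530) / 120.0) ** 2)        # hypothetical QE
filter_tx = np.where((wl >= 595) & (wl <= 695), 0.95, 0.0)  # idealized R filter

photons_per_nm = 1.0      # assumption: spectrally neutral sky, arbitrary units

# Combined response at each increment, then integrate (5 nm per step)
combined = sensor_qe * filter_tx
photons_per_pixel = np.sum(combined * photons_per_nm * 5.0)

# A Monochrome Quad is four identical pixels behind the same external filter
print(4 * photons_per_pixel)

   For an OSC sensor, the same calculation runs once per Bayer color, with that pixel’s Bayer transmission multiplied in as a second curve.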

 

Imaging Methodologies:

 

   I think part of the confusion regarding the often raging Monochrome vs One-Shot-Color camera debate comes from differing ways of using each sensor type. There are distinct differences between using an OSC and a Monochrome camera when one considers how the imaging is done. With Monochrome cameras, some of the basic methods used are:

  • LRGB:
    Here the main differences are in how the photographer decides to apportion camera / filter time.

    He may choose to take 1 hour using each filter ending up with a 4 hour session of L, R, G, and B filtered images.

    He may choose to shoot predominantly L and a lesser amount of RGB color. He might end up with 3 hours of L and 20 minutes of each R, G, and B data.

    There are infinite varieties of shooting various ratios of L to RGB data. All will differ in the amount of photons gathered during the session.
     
  • RGB:
    In this case, the photographer chooses to shoot only the color components. He may or may not choose to create an artificial L layer from the RGB data during later post-processing. Shooting only through color filters also changes the total number of photons that are gathered by the sensor / filter combination.
     
  • LRGB or RGB Plus Narrow-Band
    Either of the previous methods can be supplemented with additional data gathered through a narrow-band filter. The extra data is often combined in post-processing only after an RGB processed image is created.
     
  • One-Shot-Color:
    There is only one way to do this in a broad-band imaging session. You end up with the same amount of imaging time for each of the four pixels in a Quad. The photons gathered will differ between pixels depending on the Bayer Matrix transmission for each color.

    OSC Data is sometimes supplemented with narrow-band or multi-band data in much the same way as LRGB+NB.

   Perhaps surprisingly to some, the exact imaging methodology used influences the result in terms of photons gathered in a Quad during an imaging session. As might be expected, the greater the percentage of time spent in L for Monochrome cameras, the more photons can be captured. This happens, of course, because L from a Monochrome camera has no filtering (modulation) of the response curve at all.

 

   That's it for this post. Another will follow with a worked example of some comparisons between Monochrome and One-Shot-Color cameras of the same family.

 

 

John


Edited by jdupton, 01 November 2019 - 11:30 PM.

  • dswtan, SteveInNZ, psandelle and 13 others like this

#2 jdupton

jdupton

    Fly Me to the Moon

  • *****
  • topic starter
  • Posts: 5,751
  • Joined: 21 Nov 2010
  • Loc: Central Texas, USA

Posted 01 November 2019 - 11:46 PM

Hi again,

 

   This is part two -- a couple of examples of working the numbers to compare two cameras.

 

A Worked Example By The Numbers

 

   Let's now look at how we can use the spreadsheet to examine the photonic response during an imaging session. For this example, let's use an ASI183MM-Pro camera and a set of AstroDon Series I LRGB filters. We will choose to shoot for one time unit of total exposure through each of the four filters. (For simplicity, let's assume that the sky we are shooting is emitting one photon per time unit.) We will then compare this four-time-unit session to a similar session taken with an ASI183MC (OSC) camera and no filters.

 

   Each camera is allowed four time units under the sky. This is one area where many camera comparisons and arguments break down. Some users insist that the OSC can only be used for one hour, for example, while the Monochrome with LRGB filters gets to use four hours imaging the sky. Let’s keep things on an equal basis here.

 

   First we select the camera, filter(s) (one at a time), telescope and auxiliary optics in the spreadsheet. The screenshot below shows the ASI183MM-Pro (Monochrome) sensor and an AstroDon i-Series L filter selected. Although we are not using the telescope section of the spreadsheet, we select a generic refractor telescope of 102 mm aperture and F/7 focal ratio. No Focal reducer or flattener is selected.

 

Figure_01.png

Figure 1 -- An IMX183 Monochrome Sensor QE Plot (With AstroDon i-Series L Filter)

 

   The results of this scenario are shown in the box section labeled “Sensor/Filter Photonic Capture” near the bottom left. There is not much to see since the L filter simply modulates the QE response curve of the Monochrome sensor evenly across the whole selected spectrum. The key takeaway here is that the Quad has captured 896.8 photons with this filter.

 

   Now let’s look at the response with the AstroDon I-Series R filter. This is shown in the next screenshot below.

 

Figure_02.png

Figure 2 -- An IMX183 Monochrome Sensor QE Plot (With AstroDon i-Series R Filter)

 

   Now we see something different. The i-Series R filter only passes light between about 595 nm and 695 nm. This is shown as the gray line in the plot. The gray line represents the filter’s ideal pass-band. The red line on the plot shows the original sensor QE plot multiplied by the filter transmission. The integration of this plot (i.e., the area under the curve) is representative of the number of photons captured by the sensor when this filter is present. Again, that is reported in the “Sensor/Filter Photonic Capture” box at the lower left. In this case, the sensor can capture a total of 261.2 photons in the Quad under consideration.

 

   A note about the plots. The “ringing” artifacts you see are not part of the data. They are simply plotting artifacts that appear when you plot sharply changing data using the smoothing filters in the graphics charting routine of the spreadsheet. You have to learn to visually ignore this ringing. Since it is not part of the actual underlying integrated data, it does not affect the values in the photon capture summary box.

 

   Next up is the same data after selecting the AstroDon i-Series G filter.

 

Figure_03.png

Figure 3 -- An IMX183 Monochrome Sensor QE Plot (With AstroDon i-Series G Filter)

 

   This time we see that the sensor combined with the G filter will have captured 258.9 photons within the Quad.

 

Finally, let’s look at what the sensor combined with the AstroDon i-Series B filter gives us.

 

Figure_04.png

Figure 4 -- An IMX183 Monochrome Sensor QE Plot (With AstroDon i-Series B Filter)

 

   One last time, we integrate the area under the curve to find the number of photons captured by a Quad. Doing so, we see that it will have captured 306.7 photons in the unit of time it was exposed.

 

   So, now let’s look at how many photons were gathered by a single Quad (representative of the rest of the chip). Let’s go back to the assumption that we were going to expose each filter for the same amount of time. All we have to do is add up the total number of photons captured to make up our LRGB image. We can stack our filtered Quads directly to the appropriate color layer. Looking down on the Quad (“SuperPixel”) we find that we have 896.8 photons in the L channel, 261.2 photons in the R channel layer, 258.9 photons in the G channel, and 306.7 photons in the B channel. We gathered a total of 1,723.6 photons in our 4 time units of exposure.

 

   It is now time to see what happens if we select the ASI183MC (OSC) camera. In the case of the OSC camera, we do not need to select a filter. The Bayer Matrix filters are built onto the chip. Their respective responses are shown in Figure 5 below.

 

Figure_05.png

Figure 5 -- An IMX183 One-Shot-Color Sensor QE Plot (With no filters being used)

 

   We now see different values for photon capture (area under the curve) for each color in the results area. For each unit of time exposed to our evenly gray sky, the pixels under the Red Bayer filters will capture 80.7 photons, the (two) pixels under Green filters will capture 221.8 photons (combined), and the pixels under the Blue filters will capture 64.3 photons. The total number of photons captured in a single unit of time is then 366.9 photons.

 

   To make things equal, though, we need to give the OSC equal time under the sky. We must multiply the photon capture numbers by 4 since the Monochrome camera was allowed 4 total time units to capture data. Doing that multiplication, we find that the OSC has then collected 1,467.4 total photons compared to the 1,723.6 photons for the Monochrome version of the same camera using high transmission AstroDon LRGB interference filters.

 

   The Monochrome camera has won this head-to-head comparison by a factor of 1.174 times. It is a win, but 17.4% is, in my mind, far from the runaway that some might have predicted. As mentioned before, this lead can be extended by having the Monochrome camera spend more time gathering L filter data and much less on the RGB color portion. Gathering L data loses practically nothing. For instance, what if we exposed L for 9 time units and R, G, and B for only 1/3 of a time unit each? We then have 10 time units of data, and the Monochrome camera will score 9 x 896.8 + (261.2+258.9+306.7)/3 = 8,347.0 total photons captured. In the same 10 time units, the OSC will have only scored 10 x 366.9 = 3,669 total photons. Now the Monochrome camera has beaten the OSC by a factor of 2.28 times. This is indeed a convincing win. The cost is some loss of color fidelity from the Monochrome camera.
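   If you want to replay that session arithmetic, here is a small sketch using the per-time-unit Quad captures reported above (the same numbers as in the figures, nothing new):

# Mono Quad photons per time unit through each AstroDon filter, and the
# OSC Quad total per time unit, as reported by the spreadsheet above
L, R, G, B = 896.8, 261.2, 258.9, 306.7
OSC = 366.9

def mono_session(hL, hR, hG, hB):
    return hL * L + hR * R + hG * G + hB * B

# Equal-time LRGB (1 unit per filter) vs 4 units of OSC
print(mono_session(1, 1, 1, 1) / (4 * OSC))         # ~1.17

# L-heavy: 9 units of L plus 1/3 unit each of R, G, B vs 10 units of OSC
print(mono_session(9, 1/3, 1/3, 1/3) / (10 * OSC))  # ~2.28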

 

   Speaking of color fidelity, some astrophotographers advocate gathering only RGB data. This gives maximum color fidelity in the final image. Often a synthetic L channel will be created from the RGB data and sharpened before being added back to the RGB image. What would this strategy look like for our camera comparison?

 

   Let’s assume we will gather only RGB data with the Monochrome camera. Let’s further assume we will spend a single time unit on each color filter (using the same AstroDon i-Series filters.) Going back to the data we have already calculated, we can see that the RGB-only session will gather 261.2+258.9+306.7 = 826.8 photons in the three time units under the sky. Our OSC camera can gather 3 x 366.9 ≈ 1,100.5 photons in that same three time units under the sky. This comparison hands the win to the One-Shot-Color camera. It will capture 1.33 times as many photons in this scenario.

 

   One valid complaint against OSC cameras is the need for interpolation during the DeBayer process. Astrophotographers who choose to bin their color filters 2x2 when shooting LRGB images sometimes even tout this as leading to poor color resolution. The loss of color resolution is a concern but in my opinion is often overrated.

 

   A simple test can show you how much color error is introduced by the interpolation done using VNG in the DeBayer step. Simply take an OSC image and clone it. With one copy, DeBayer using the SuperPixel method. This results in an image of one half the dimensions of the original. Now take the second cloned copy and DeBayer it using VNG interpolation. Re-sample the result to one half its original dimensions. Now subtract one from the other while including a small pedestal to prevent clipping. Examine the result. You can separate the color channels of the difference frame to examine each in detail if you wish. Having done this, I find that the color shift amounts to less than 0.1% differences in the color channels compared to the pure SuperPixel DeBayer operation.
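   Here is a rough sketch of that test. Note two substitutions: plain bilinear interpolation stands in for VNG (VNG itself is more elaborate), and a random array stands in for a real OSC sub, so treat the code as illustrating the procedure rather than reproducing my 0.1% figure:

import numpy as np
from scipy.ndimage import convolve

def superpixel_debayer_rggb(raw):
    # One RGB pixel per RGGB Quad; no interpolation, half-size result
    r = raw[0::2, 0::2]
    g = 0.5 * (raw[0::2, 1::2] + raw[1::2, 0::2])
    b = raw[1::2, 1::2]
    return np.stack([r, g, b], axis=-1)

def bilinear_debayer_rggb(raw):
    # Full-size demosaic by convolution (a simple stand-in for VNG)
    rm = np.zeros_like(raw); gm = np.zeros_like(raw); bm = np.zeros_like(raw)
    rm[0::2, 0::2] = 1; gm[0::2, 1::2] = 1; gm[1::2, 0::2] = 1; bm[1::2, 1::2] = 1
    k_rb = np.array([[1, 2, 1], [2, 4, 2], [1, 2, 1]]) / 4.0
    k_g  = np.array([[0, 1, 0], [1, 4, 1], [0, 1, 0]]) / 4.0
    return np.stack([convolve(raw * rm, k_rb),
                     convolve(raw * gm, k_g),
                     convolve(raw * bm, k_rb)], axis=-1)

def bin2x2(img):
    # Re-sample to half size by averaging 2x2 blocks
    return 0.25 * (img[0::2, 0::2] + img[0::2, 1::2]
                   + img[1::2, 0::2] + img[1::2, 1::2])

raw = np.random.rand(512, 512)               # stand-in for a real OSC frame
pedestal = 0.1                               # prevents clipping with real data
diff = bin2x2(bilinear_debayer_rggb(raw)) - superpixel_debayer_rggb(raw) + pedestal
print(np.abs(diff - pedestal).mean(axis=(0, 1)))  # per-channel mean difference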

 

   I should mention the one area where Monochrome cameras really shine. That is in narrow-band imaging, especially when done under light polluted skies. There is no comparison since using an OSC with a narrow-band filter involves two filter transmission curves being applied to the sensor’s QE response curve. You can do narrow-band work with OSC sensors but it will not quite live up to what the Monochrome can do.

 

   If we run the numbers using the spreadsheet as before for the two versions of the IMX183 sensor and an AstroDon Oiii filter, we find that the Monochrome camera will capture 15.9 photons per unit time while the OSC version of the same sensor will only capture 9.0 photons per unit time. That is a 1.76 times win for the Monochrome camera.
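   The double-filtering effect is easy to sketch in the same framework as before. All three curves below are placeholders rather than the real Sony or AstroDon data, and the small Oiii pickup of the blue pixel is ignored for brevity:

import numpy as np

wl = np.arange(400, 705, 5)
qe = 0.8 * np.exp(-((wl - 530) / 120.0) ** 2)        # hypothetical QE curve
oiii = np.where(np.abs(wl - 500) <= 5, 0.95, 0.0)    # idealized Oiii band-pass
bayer_g = 0.85 * np.exp(-((wl - 540) / 60.0) ** 2)   # hypothetical green dye

mono_quad = 4 * np.sum(qe * oiii * 5.0)              # one filter curve applied
osc_quad = 2 * np.sum(qe * bayer_g * oiii * 5.0)     # two filter curves, and only
                                                     # the 2 G pixels respond much
print(mono_quad / osc_quad)                          # the mono advantage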

 

   The OSC cameras can take advantage of the newer multi-band narrow-band filters that are now available. It is even possible to extract the proper color data for each wavelength component so that the multiple bands can be easily separated and later recombined with RGB OSC data in post-processing when using PixInsight. An example of this is outlined in a post I made to another thread here on Cloudy Nights a week or so ago. If interested in this use of the spreadsheet see Post #15 in the thread “bicolor (HOO) in PixInsight with a dual narrow-band and OSC - help”.

 

   I am tired of typing and trying to proofread all this. I will close it out for now. I will try to post a few more examples as well as a sanitized version of the spreadsheet tomorrow. I also want to post some analysis of some of the more common misconceptions that invariably come up in these discussions. Stay tuned; there will be more to come after I give my brain a rest for a while.

 

   In the meantime, feel free to comment or point out errors in the methodology. I only ask that you back it up with numbers of your own rather than the hand-waving that always derails these threads. I have tried to keep the analysis data-centric rather than the typical “but OSC can only use one out of four pixels, so it cannot compare” type of discussion. We have all the data we need. It consists of sensor QE curves, filter transmission curves, and simple physics. I may not have put it all together correctly, but if so, it is for others to point out where the methodology falls apart.

 

 

John


Edited by jdupton, 02 November 2019 - 12:23 AM.

  • dswtan, SteveInNZ, psandelle and 9 others like this

#3 bobzeq25

bobzeq25

    ISS

  • *****
  • Posts: 36,565
  • Joined: 27 Oct 2014

Posted 02 November 2019 - 12:45 AM

"Since I have assumed that each pixel gets 1 photon for each wavelength emitted by the sky, the pixel accumulates 5 photons from the sky for each 5 nm wavelength from 400 nm to 700 nm."

 

This is one place where I think your theoretical results don't align so well with actual imaging experience.  Astro targets usually have an uneven distribution of wavelengths emitted.  There's little that's pure green.  Red and blue tend to dominate.  It will vary with the target, of course.

 

So the fact that there are two green pixels of every 4 (which works well for terrestrial subjects, which do tend to have a lot of green) is inefficient for astro.

 

While OSC is not a bad approach, I don't think it's as good as your theoretical numbers.  Particularly in light polluted skies.

 

The best imagers prefer mono camera and filters.  They're not stupid.  They have come to the conclusion that the advantage to them justifies the considerable money spent, which is the main disadvantage of the approach.

 

The bottom line is that this analysis, like so many purely theoretical analyses on Cloudy Nights, is driven by what factors are included in the numbers, and what are left out.   Contrary to the assertions of many theoreticians here, empirical methods (ie actual experience) are frequently superior due to this issue.  They automatically incorporate all the factors important to actual imaging, and precise theoretical approaches are often simply impractical, due to the real world complexity.  I believe that's true here.  More about that below.  What I'm saying is that characterizing empirical approaches as "hand-waving" is simply a debating technique that is not illuminating.

 

I've imaged both ways.  In my Bortle 7 skies, with dim targets, there is no doubt in my mind that mono plus filters has more than a 17% advantage in speed of data acquisition.  I'm hardly alone in making that judgment.

 

None of which is to say that OSC is not capable of making fine images.  But it does gather astro data less efficiently, and that becomes more important as signal goes down, and light pollution increases.  Ruben Kier, the very experienced author of "The 100 Best Astrophotography Targets", carefully distinguishes which targets are suitable for one shot color (bright ones, which are sending down a lot of data), and which are less suitable (dim ones, which can be sending down less than one photon per second).  He's not talking about small differences.

 

And that's the key bottom line.  Your analysis would lump them all together.  I don't think that's correct, I (like Ruben Kier, and many others) think the target (and the site conditions) does make a difference in the effectiveness of the two approaches.  I think any theoretical approach that does not incorporate the characteristics of the target and the site, and does not reflect the considerable differences in the effectiveness of the two approaches vis a vis those factors, is incomplete, and not superior to empirical approaches.

 

Color fidelity is another real world issue that is important to good images; the sharp distinctions of interference filters, compared to tinted glass, are useful there.


Edited by bobzeq25, 02 November 2019 - 01:30 AM.

  • psandelle, arbit, PirateMike and 2 others like this

#4 Coconuts

Coconuts

    Apollo

  • *****
  • Posts: 1,006
  • Joined: 23 Sep 2012

Posted 02 November 2019 - 01:21 AM

John:  Thank you for your detailed and objective analysis of a complex (and, at times, controversial) technical aspect of astrophotography!  It is also quite timely.  I have been struggling with how best to approach the upcoming availability of astrocameras based upon Sony's excellent full frame IMX455 BSI CMOS sensor. Both QHY and ZWO will shortly be offering both OSC and monochrome versions, and I have been on the fence regarding both which company and which variant to purchase.  I prefer simplicity, and so have been leaning toward an OSC camera, at least to start.  Your analysis has certainly encouraged me in this regard.  I have also recently purchased a 2" Optolong L-eNhance filter, which seems from others' work to at least approach the results of narrowband work with mono cameras.  I had assumed that I would eventually add a monochrome version and a filter wheel to the mix, but dreaded the added complexity, focusing between filters, and additional mass.  One astro-imager I am building (a 254 mm diameter f/3.6 Wynne-Newtonian) will have a direct optical path (no secondary), with the camera centered in front of the primary.  That required some tricks to work in a mono camera without obstructing the light path; either a one-at-a-time filter drawer or a complex orthogonal folding of four filters.  Your analysis is very helpful, and strongly suggests that this additional cost and complexity might be forever avoided.  I should add that I routinely assess complex optical systems at work with cutting-edge monochrome imaging sensors and ion-sputtered filters in fluorescence microscopy, so I am technically qualified to comment upon your work.  My only question would be whether you are willing to post your spreadsheets in a publicly available Google Sheets version.

 

Thanks again!

 

All the best,

 

Kevin



#5 sharkmelley

sharkmelley

    Cosmos

  • *****
  • Posts: 8,225
  • Joined: 19 Feb 2013
  • Loc: UK

Posted 02 November 2019 - 10:10 AM

That's a very interesting and thorough analysis!  A most enjoyable read.

 

I notice you made the assumption that white light has equal photon rates at each wavelength.  I would have thought a better assumption would be that white light has a flattish spectral power distribution (SPD).  Now, since photon energy is inversely proportional to wavelength this would mean that for white light, the red photon rate is much higher than the blue photon rate.
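In symbols: each photon at wavelength $\lambda$ carries energy $hc/\lambda$, so a source with spectral power distribution $P(\lambda)$ delivers photons at the rate

\[ \dot{N}(\lambda) \;=\; \frac{P(\lambda)}{hc/\lambda} \;=\; \frac{P(\lambda)\,\lambda}{hc} \]

For a perfectly flat SPD, the 700 nm end of the visible band therefore supplies 700/400 = 1.75 times the photon rate of the 400 nm end.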

 

Then again, what we call white light does not have a completely flat SPD in any case.  Roughly speaking what we call white is solar radiation (which is more or less a black body SPD, having a slight hump in the visible spectrum) modulated by atmospheric absorption.

 

How much difference that would make to your analysis is anyone's guess but I'm pretty sure it won't change the main conclusion(s).

 

Mark


Edited by sharkmelley, 02 November 2019 - 10:12 AM.

  • Jon Rista likes this

#6 jdupton

jdupton

    Fly Me to the Moon

  • *****
  • topic starter
  • Posts: 5,751
  • Joined: 21 Nov 2010
  • Loc: Central Texas, USA

Posted 02 November 2019 - 10:43 AM

Hey folks,

 

   I have uploaded the spreadsheet to DropBox. Anyone who wishes to play with it can grab a copy there. It is saved in OpenOffice Calc ODS spreadsheet format. Most common modern spreadsheet programs should be able to open and run it without problems. It contains no macros. It is just straight up calculations. It is protected against user changes other than input fields. If you have questions about the calculation methods, just ask. I make heavy use of array calculations, where intermediate results are stored temporarily in memory and then discarded after the cell is evaluated. Doing so makes the spreadsheet much more compact since intermediate results don't have to be saved.

 

   The current version uploaded is Rev-D1. I will continue to update the version there as errors are found or additional conditions are added to allow for a better simulation.

 

   Go to the following link and look for the file "System_QE_Calculator_Rev-D1.ods". (It is currently the only file there. I may later upload the larger spreadsheet I use for imaging analysis. This one is an extracted page/sheet out of the full master calculator.)

 

https://www.dropbox.com/sh/o2nsex9yvqj53qk/AADFdbT2omFEPTsgng-YJOo5a?dl=0

 

 

 

John

 

PS: I will start work on responses to the other posts here but have a pretty busy day today. It may be later this evening before I can answer in full.


  • PirateMike likes this

#7 PirateMike

PirateMike

    Skylab

  • *****
  • Posts: 4,356
  • Joined: 27 Sep 2013
  • Loc: A Green Dot On A Blue Sea

Posted 02 November 2019 - 11:39 AM

I said it once and I'll say it again...

 

Some threads here on CN make me feel dumb, but I'll be trying to improve. 

 

Great work John, I'll try my best to learn about, and understand, what is going on here. Google is my friend.

 

 

Miguel   8-)

 

 

.


  • dswtan, jdupton, leviathan and 3 others like this

#8 AKHalea

AKHalea

    Apollo

  • *****
  • Posts: 1,286
  • Joined: 17 Jul 2016
  • Loc: Houston, Texas, USA

Posted 02 November 2019 - 04:47 PM

John : That is a very interesting theoretical analysis. It sort of answers my question (which was not yet asked by me, but I always had it). The question is about "integration time" reported for mono vs for OSC cameras. I feel they are not quite comparable and perhaps misleading. I understand that the integration time for mono is simply the total of all actual time spent using different filters. However, to image the same subject with an OSC, your analysis suggests that there may be a multiplier effect on the OSC time.

 

Let me illustrate my question with an example - Suppose a total of 12 hours were spent on a DSO with a mono camera - say 6 hrs L, 2 hours each of RGB filters. Now with an OSC to get the same image quality (presumably the same number of photons to the sensor?), how much time would be needed with the OSC? I have not done the calculations, but it seems that your analysis suggests perhaps 14 (12x1.17) hours will be sufficient? That is substantially less than what one would think based on the simplistic notion that "mono is 4 times more efficient at gathering photons" so you would need 48 hours of time with the OSC to gather the same amount of data.  

 

I must disclose that I am an OSC camera user because I do not like too much complexity. Mono camera with filters is the epitome of complexity for me ...... Anil


  • Hobby Astronomer likes this

#9 bobzeq25

bobzeq25

    ISS

  • *****
  • Posts: 36,565
  • Joined: 27 Oct 2014

Posted 02 November 2019 - 05:54 PM

John : That is a very interesting theoretical analysis. It sort of answers my question (which was not yet asked by me, but I always had it). The question is about "integration time" reported for mono vs for OSC cameras. I feel they are not quite comparable and perhaps misleading. I understand that the integration time for mono is simply the total of all actual time spent using different filters. However, to image the same subject with an OSC, your analysis suggests that there may be a multiplier effect on the OSC time.

 

Let me illustrate my question with an example - Suppose a total of 12 hours were spent on a DSO with a mono camera - say 6 hrs L, 2 hours each of RGB filters. Now with an OSC to get the same image quality (presumably the same number of photons to the sensor?), how much time would be needed with the OSC? I have not done the calculations, but it seems that your analysis suggests perhaps 14 (12x1.17) hours will be sufficient? That is substantially less than what one would think based on the simplistic notion that "mono is 4 times more efficient at gathering photons" so you would need 48 hours of time with the OSC to gather the same amount of data.  

 

I must disclose that I am an OSC camera user because I do not like too much complexity. Mono camera with filters is the epitome of complexity for me ...... Anil

Yes, 4 times as much time is absurd.  I do think it's somewhat more than 1.17 times for your LRGB example (using all the pixels gathering all the light for the 6 hours in L is very efficient, in terms of providing detail for your eyes, which see detail in L, not so much in RGB), but that it depends on target and site conditions.

 

One thing has not been explicitly said, although it's embedded in John's analysis.  The arrival of cooled CMOS cameras with low read noise and high quantum efficiency has closed the gap between OSC and mono plus filters somewhat.

 

This is just a choice, not right/wrong.  The only common mistake here is to assume gathering the 3 channels simultaneously with a Bayer filter is _more_ efficient.


Edited by bobzeq25, 02 November 2019 - 05:58 PM.

  • psandelle, jdupton, Hobby Astronomer and 1 other like this

#10 bortle2

bortle2

    Ranger 4

  • -----
  • Posts: 326
  • Joined: 18 Sep 2019

Posted 02 November 2019 - 08:00 PM

The arrival of cooled CMOS cameras with low read noise and high quantum efficiency has closed the gap between OSC and mono plus filters somewhat.

Yes -- in terms of the absolute time needed to collect a certain number of photons. Say, an old color (Bayer) sensor had peak QE of 60% and you needed a 3-hour exposure with it for some target. If... if the 1.17 coefficient is correct, one should be able to accomplish the same task in about 2.56 hours (3 / 1.17) using the monochrome counterpart. Now, a new color sensor with 90% peak QE should be able to do the same in just 2 hours (similar monochrome in about 1.71 hours, but that's not the point: exposure time difference does shrink).

 

Then there is the issue of image acutance, which this article illustrates pretty nicely. The difference in acutance between OSC/monochrome is very similar to that between Bayer and Foveon sensors in conventional cameras. But that's changing as well: the 60Mp IMX455 has so many, so densely packed pixels that the vast majority of astrophotographers will be shooting them using 2x2 binning (or downsampling in post), thus making sure each pixel has color information, effectively negating one of the benefits of monochrome+filters combos.

 

All that said, I do think that the 1.17 coefficient that the OP came up with is

 

1) a bit optimistic; a stricter approach has to take into account big differences between targets; perhaps a statistical approach including brightness analysis (to get a better real-world photon distribution) of a lot of actually shot R, G, and B frames for different targets would be more convincing; and it's IMHO likely to show that two G sub-pixels in each "super-pixel" is a deficiency,

 

2) needs to be verified using a sensor (or rather sensors) for which the manufacturer provides official QE curves; Sony is notoriously bad at that and, to the best of my knowledge, official QE curves for the IMX183 (which is analyzed in the OP) are not available from Sony; whatever guesswork camera manufacturers then publish themselves is, well, of suspect accuracy, and sometimes is just a mess.

 

Bottom line (IMHO!): the OP is a very good first step, but a lot more has to be done for the results to be truly convincing.


  • psandelle and Hobby Astronomer like this

#11 freestar8n

freestar8n

    MetaGuide

  • *****
  • Freeware Developers
  • Posts: 13,909
  • Joined: 12 Oct 2007
  • Loc: Melbourne, Australia

Posted 02 November 2019 - 08:49 PM

I think there are a few things to keep in mind in these comparisons - and they don't require too much detailed analysis:

  1. RGB, LRGB, and OSC all produce different final results - so it's hard to say which one is best in a quantitative way
  2. OSC is not as lossy as people think - and depending on the spectrum of the object being imaged, the win, in terms of photons gathered, can probably go either way; the two are quite comparable
  3. The duplicate G pixel may not be such a bad thing since the eye is most sensitive there

Another thing is that most mono filters have a gap in them to help block out some light pollution lines - but I think most people have broadband light pollution and those gaps may only serve to lose good signal that could otherwise be gathered.

 

Also - since OSC filter bandwidths intentionally overlap to mimic the eye's response, I wonder if it wouldn't be helpful to have a similar partial overlap of mono filters.

 

Finally - if you want mono and you want to capture many photons - you can do what I do, which is use Sloan filters to cover a much wider region of the spectrum and with no gaps.  It ends up being a form of false color - but if it's photon counts you want, then one way is to expand into the UV and IR.

 

Frank


  • AhBok likes this

#12 bortle2

bortle2

    Ranger 4

  • -----
  • Posts: 326
  • Joined: 18 Sep 2019

Posted 02 November 2019 - 09:45 PM

since OSC filter bandwidths intentionally overlap to mimic the eye's response, I wonder if it wouldn't be helpful to have a similar partial overlap of mono filters.

That's a very interesting topic in itself. And it's all far more complicated than just having an overlap in adjacent bands; there's also the issue of capturing (and reproducing) indigo and violet colors.

 

There's an interesting video on the subject, Why you can't take a good picture of a rainbow, particularly this part. The gist is that we have a second, smaller peak of sensitivity of red cones in the far left side of the visible spectrum (which makes it possible for us to perceive indigo and violet colors the way we do), and at least some of the OSC sensors are capable of mimicking that... A few response curves to illustrate:

 

Nikon D80 sensor:

 

Nikon_D80.png

 

Canon 450D sensor:

 

Canon_450D.png

 

A small [potential] win for OSC.


  • psandelle and jdupton like this

#13 sharkmelley

sharkmelley

    Cosmos

  • *****
  • Posts: 8,225
  • Joined: 19 Feb 2013
  • Loc: UK

Posted 03 November 2019 - 03:11 AM

There's an interesting video on the subject, Why you can't take a good picture of a rainbow, particularly this part. The gist is that we have a second, smaller peak of sensitivity of red cones in the far left side of the visible spectrum (which makes it possible for us to perceive indigo and violet colors the way we do), and at least some of the OSC sensors are capable of mimicking that...

Take a look at the response curves of the IDAS RGB filters:

https://www.sciencec.../idas/type4.htm

 

They have overlapping response curves and they include the red bump in the blue part of the spectrum.  It means they should have better colour fidelity than traditional RGB filters i.e. improved ability to reproduce colours the way humans see them.

 

Mark


Edited by sharkmelley, 03 November 2019 - 03:13 AM.

  • bortle2 likes this

#14 bobzeq25

bobzeq25

    ISS

  • *****
  • Posts: 36,565
  • Joined: 27 Oct 2014

Posted 03 November 2019 - 10:22 AM

The thing is, through processing, one can blend the cleanly separated rgb data from interference filters _any way_ you want to.  So you can do anything.  If the response curve of the eye is an issue, you can address it as you see fit.

 

But, once the color data is smeared by a Bayer matrix filter, it's irretrievably damaged.   You can no longer do anything, some data is just lost.

 

The thing about 2 green pixels is that, yes, your eye is more sensitive to green.  And terrestrial subjects tend to have a lot of green.   So having 50% of the pixels devoted to green makes sense, no doubt why it's done that way.

 

Astro objects have less green.  So devoting 33% to green (RGB) or less (LRGB) makes more sense for astro.  The buzzword I've used for that is that it's more "efficient".  More effective might be better communication.

 

Mandatory disclaimer.  None of that means OSC is bad, or that you can't do nice images with it.  But mono plus interference filters is better. 

 

How much better can be endlessly debated.  It depends on what factors are put into the analysis, what factors are left out.  One has received too little attention.  The fact that our eyes see detail in L, not RGB.  Doing LRGB and binning color data (which is also more effective) goes in and out of fashion, but it has a solid biological basis.  Nothing gets you better images faster, _especially_ in light polluted skies.

 

There are reasons why really expensive mono plus interference setups are so popular among the very best imagers.  It's not because it's a status symbol.  That fact integrates a great many factors into the analysis, because that's what people's preferences do.


Edited by bobzeq25, 03 November 2019 - 10:28 AM.

  • psandelle likes this

#15 sharkmelley

sharkmelley

    Cosmos

  • *****
  • Posts: 8,225
  • Joined: 19 Feb 2013
  • Loc: UK

Posted 03 November 2019 - 11:18 AM

The thing is, through processing, one can blend the cleanly separated rgb data from interference filters _any way_ you want to.  So you can do anything.  If the response curve of the eye is an issue, you can address it as you see fit.

 

But, once the color data is smeared by a Bayer matrix filter, it's irretrievably damaged.   You can no longer do anything, some data is just lost.

 

On the contrary.

 

Let's go back to the earlier example of a rainbow.  If you image a rainbow using your sharp cut-off filters you'll see adjacent bands of solid red, solid green and solid blue with no continuous transition between them.  Yes, you can process it _any way_ you want but you'll never retrieve the smooth transitions of colour because the colour data is irretrievably damaged.

 

Mark


Edited by sharkmelley, 03 November 2019 - 11:21 AM.

  • SteveInNZ, Jon Rista and Spaceman 56 like this

#16 jdupton

jdupton

    Fly Me to the Moon

  • *****
  • topic starter
  • Posts: 5,751
  • Joined: 21 Nov 2010
  • Loc: Central Texas, USA

Posted 03 November 2019 - 12:42 PM

Bob,

 

This is one place where I think your theoretical results don't align so well with actual imaging experience.  Astro targets usually have an uneven distribution of wavelengths emitted.  There's little that's pure green.  Red and blue tend to dominate.  It will vary with the target, of course.

[snip]
 

The bottom line is that this analysis, like so many purely theoretical analyses on Cloudy Nights, is driven by what factors are included in the numbers, and what are left out.   Contrary to the assertions of many theoreticians here, empirical methods (ie actual experience) are frequently superior due to this issue.  They automatically incorporate all the factors important to actual imaging, and precise theoretical approaches are often simply impractical, due to the real world complexity.  I believe that's true here.  More about that below.  What I'm saying is that characterizing empirical approaches as "hand-waving" is simply a debating technique that is not illuminating.

 

 

   You have hit on two key points here. You are absolutely correct in saying that a theoretical model is only as good as the information it embodies. One of the things I want to get out of this thread is to learn what my model is missing and how it can be made better.

 

   The exact same thing must also be said about empirical evidence and “common knowledge.” If that evidence is to be of any use, it must state what information is embodied and what is being left out or left unsaid. An example might be listening to two automotive enthusiasts having a conversation like this:

 

Car Guy: “Any car is faster than any pickup truck any day. They cannot be beat on the track.”
Truck Guy: “No way. Any of my trucks can beat any car you have on 'the track' without a problem.”
Car Guy: “OK. Let’s try it. Name the time and place.”
Truck Guy: “Sure, make it Mike’s Peak Track, the first of next month.”

 

   Like the auto guys, we hear a lot of anecdotal and empirical evidence from both camps in the Mono vs OSC arguments. Without all of the conditions being described and fully disclosed, I see the many vague statements as a form of “hand waving” that skirts the details of any real meaningful answer. Saying “The best imagers prefer mono camera and filters” without any specifics seems to base a conclusion on an “appeal to experts” style argument with no pertinent details at all. We really need to give the conditions and details for such statements to be useful.

 

   My key point in this thread is to say that both Monochrome and One-Shot-Color cameras have their place. However, I will continue to maintain that that place is not limited to very bright targets under dark skies. Like the car guys talking past each other, the differences of opinion about camera types often don’t even begin to include pertinent details of usage patterns. If there is one thing I would want others to get out of this discussion, it is that the way we shoot our targets (and which targets) with each camera type will sometimes be more important than which camera type we chose. I think it would be nice to have a way to model our usage patterns and then figure out whether one camera type or another is most appropriate.

 

   You yourself came back to this point in your post. Sky conditions and target are important. My model will need a way to account for that. The spreadsheet can account for target characteristics in a circuitous way already but needs a better way to do it. (As it exists now, you would need to run the calculation of a background sky and a separate calculation for an emission filter/target and then combine the results.) I will need to figure out a better way to model target aspects.

 

   I suspect from your postings in the past on this general subject that you are a “nebula guy” hence the statement “Astro targets usually have an uneven distribution of wavelengths emitted.” I, on the other hand, am more of a galaxy and cluster guy. Those represent two very different types of targets. You appear to give more importance to narrow-band emission sources of light in your photography where I look mostly at broad-band emission targets. That alone can dictate a usage pattern that translates directly into our feelings on camera types. I recognize that each has a place. That is why I own both Monochrome and One-Shot-Color cameras. I seem to recall you also have both types. I even have some desire to use both on the same target from time to time. That is why I picked an OSC camera with similar pixel size and FOV to my trusty Mono IMX694.

 

   Anyway, while I agree that the model I currently have doesn’t adequately cover some aspects it should, my own experience, and studying the results from a lot of calculations, tells me that there are usage cases where an OSC can gather data faster than a Monochrome camera. I realize that you do not believe that. We will have to agree to disagree on whether that can happen. Who knows; once the model incorporates more conditional terms, it may disprove my assertion.

 

   One final point I’ll add is that as a scientist, I do believe that anything can be modeled with mathematics. You appear to be quite suspicious of any sort of theoretical modeling. I like to confirm things when my gut feeling about a topic tells me it could be wrong; modeling can show whether it is correct or prove it incorrect. I am not averse to common-knowledge examples and empirical evidence when they are presented with complete details so that an assessment of applicability can be made. However, I also usually feel better when something can be modeled so that more subtle aspects can be more fully understood.

 

 

John


  • psandelle, AhBok, RazvanUnderStars and 1 other like this

#17 jdupton

jdupton

    Fly Me to the Moon

  • *****
  • topic starter
  • Posts: 5,751
  • Joined: 21 Nov 2010
  • Loc: Central Texas, USA

Posted 03 November 2019 - 12:48 PM

Kevin,

 

I have been struggling with how best to approach the upcoming availability of astrocameras based upon Sony's excellent full frame IMX455 BSI CMOS sensor. Both QHY and ZWO will shortly be offering both OSC and monochrome versions, and I have been on the fence regarding both which company and which variant to purchase.  I prefer simplicity, and so have been leaning toward an OSC camera, at least to start.  Your analysis has certainly encouraged me in this regard.  I have also recently purchased a 2" Optolong L-eNhance filter, which seems from others' work to at least approach the results of narrowband work with mono cameras.

 

   Don’t read too much into specific results from a single set of calculations. As Bob pointed out in the post prior to yours, the spreadsheet does not directly calculate light capture for all target types. The simulations I posted in my first two posts cover a broad-band target which is mostly white-balanced. The results will be very different if you consider a narrow-band use case.

 

   You can simulate that by selecting an OSC sensor and then selecting the Triad Ultra filter. Record those results. Now select a Mono sensor and run the calculations for both a Ha and Oiii filter. You will find that the Mono plus narrow-band filters will rule the day.

 

   As you make your decision on camera type, think about what you want to shoot. If you plan to seriously pursue mostly emission targets (nebulae), then you might want to lean towards Monochrome and set of filters. If you tend to enjoy broad-band targets (galaxies and star clusters), then you can consider an OSC with the multi-band filters.

 

   And by the way, OSC is not really much simpler to process than Mono. Yes, with mono, you have to take additional Flat Frames and deal with focusing for each filter, but the processing can often be where the larger differences lie. If you shoot OSC plus a multi-band filter (like L-eNhance or Triad Ultra), the processing of your images can have more steps than mono processing. You may find OSC processing more complex than you thought.

 

   Good luck on your final choices.

 

 

John


  • psandelle and bobzeq25 like this

#18 jdupton

jdupton

    Fly Me to the Moon

  • *****
  • topic starter
  • Posts: 5,751
  • Joined: 21 Nov 2010
  • Loc: Central Texas, USA

Posted 03 November 2019 - 12:53 PM

Mark,

 

I notice you made the assumption that white light has equal photon rates at each wavelength.  I would have thought a better assumption would be that white light has a flattish spectral power distribution (SPD).  Now, since photon energy is inversely proportional to wavelength this would mean that for white light, the red photon rate is much higher than the blue photon rate.

 

Then again, what we call white light does not have a completely flat SPD in any case.  Roughly speaking what we call white is solar radiation (which is more or less a black body SPD, having a slight hump in the visible spectrum) modulated by atmospheric absorption.

 

   Yes, Bob also touched on this point. My assumption of fixed photon flux at all wavelengths was meant to simplify calculating the differences in cameras. However, it is not hard to do and would be a simple change to the spreadsheet. I even considered adding it initially.

 

   The problem I ran into was that I could not find consistent data describing the night sky’s flux across the visible wavelength spectrum. I did find a number of research articles published by major observatories which gave similar data in terms of light pollution. The problem was that their light pollution plots by wavelength were all significantly different due to different locations and conditions. Perhaps if I could find a similar report for a very dark known site like Atacama, it could serve as a proxy for the actual night sky.

 

   I sort of dismissed the data I found on SPD as it relates to visual perception because the response of the eye is different from that of a sensor we use (even though they are somewhat similar in many cases.)

 

   If you have come across any plots that could be used as “background target” data for inclusion in the spreadsheet, I will include that in the next revision of the modeling spreadsheet. The next changes will likely include both target and background selectors to allow easier modeling of something other than a (sort of) white balanced target and sky.

 

 

John


  • psandelle and Jon Rista like this

#19 jdupton

jdupton

    Fly Me to the Moon

  • *****
  • topic starter
  • Posts: 5,751
  • Joined: 21 Nov 2010
  • Loc: Central Texas, USA

Posted 03 November 2019 - 01:01 PM

Anil,

 

John : That is a very interesting theoretical analysis. It sort of answers my question (which was not yet asked by me, but I always had it). The question is about "integration time" reported for mono vs for OSC cameras. I feel they are not quite comparable and perhaps misleading. I understand that the integration time for mono is simply the total of all actual time spent using different filters. However, to image the same subject with an OSC, your analysis suggests that there may be a multiplier effect on the OSC time.

 

Let me illustrate my question with an example - Suppose a total of 12 hours were spent on a DSO with a mono camera - say 6 hrs L, 2 hours each of RGB filters. Now with an OSC to get the same image quality (presumably the same number of photons to the sensor?), how much time would be needed with the OSC? I have not done the calculations, but it seems that your analysis suggests perhaps 14 (12x1.17) hours will be sufficient? That is substantially less than what one would think based on the simplistic notion that "mono is 4 times more efficient at gathering photons" so you would need 48 hours of time with the OSC to gather the same amount of data.  

 

   My own take on “integration time” is simplistic. I have assumed in the spreadsheet that equal sky time is always allotted to whichever camera is being used. If you image for 10 hours total with a mono camera, you have ten hours of integration time. I do not include any time penalty for filter changes and such. When I do my comparisons in an external spreadsheet, I take the unit time values from the calculation and scale them as appropriate to equal “sky time.” The intent is to simulate having two scopes set up side by side in exactly the same configuration and letting each start and end a session simultaneously.

 

   For your example, we can use the same values as my example in Post #2.
Mono: L → 6 * 896.8 = 5,380.9; 
      R / G / B → 2 * (261.2 + 258.9 + 306.7) = 1,653.6
Mono Total → 5380.9 + 1653.6 = 7,034.6

 

OSC Total → 12 * 366.8 = 4,402.2

 

    Now, since the OSC is gathering photons only 4402.2 / 7034.6 = 62.6% as fast as the monochrome camera, you would need to expose the OSC for 12 / 0.626 = 19.2 hours to have gathered the same amount of light.

 

    Note that the 1.17x factor only applies to the one specific scenario I outlined in that example. Your example needs to be calculated differently even though we use the same base number for photons gathered per unit time.

 

   This is a key point that I must not have made clearly enough. The ratios showing how well Mono or OSC do are very dependent on exactly how the session was conducted. You must run the calculation to get the base photon capture rate per unit time but then each usage pattern must have its own calculation before making the comparison.
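   To illustrate that two-step flow, here is a small sketch in the spirit of my external scratch sheet, using the unit rates quoted in this reply:

# Per-time-unit Quad captures from the main spreadsheet (Post #2 values)
RATES = {"L": 896.8, "R": 261.2, "G": 258.9, "B": 306.7}
OSC_RATE = 366.8

def mono_total(hours):           # hours: e.g. {"L": 6, "R": 2, "G": 2, "B": 2}
    return sum(RATES[f] * h for f, h in hours.items())

mono = mono_total({"L": 6, "R": 2, "G": 2, "B": 2})   # Anil's 12-hour session
print(mono)                      # ~7,034 photons
print(mono / OSC_RATE)           # ~19.2 hours of OSC time to match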

 

   I hope that makes sense. I'm sorry if my previous post confused you. Once I get caught up, I will post some more example uses and comparisons to make it easier to understand.

 

 

John


Edited by jdupton, 03 November 2019 - 01:22 PM.

  • psandelle, Jon Rista and AKHalea like this

#20 jdupton

jdupton

    Fly Me to the Moon

  • *****
  • topic starter
  • Posts: 5,751
  • Joined: 21 Nov 2010
  • Loc: Central Texas, USA

Posted 03 November 2019 - 01:18 PM

Bob,

 

Yes, 4 times as much time is absurd.  I do think it's somewhat more than 1.17 times for your LRGB example (using all the pixels gathering all the light for the 6 hours in L is very efficient, in terms of providing detail for your eyes, which see detail in L, not so much in RGB), but that it depends on target and site conditions.

 

One thing has not been explicitly said, although it's embedded in John's analysis.  The arrival of cooled CMOS cameras with low read noise and high quantum efficiency has closed the gap between OSC and mono plus filters somewhat.

 

This is just a choice, not right/wrong.  The only common mistake here is to assume gathering the 3 channels simultaneously with a Bayer filter is _more_ efficient.

 

   Thanks, you are correct. In Anil’s case, the ratio between Mono and OSC is 1.6x as I showed in my reply to him. As you say 4x is not even in the ballpark and 1.17x doesn’t apply in his example. The exact usage method for a Mono camera plus filters changes the ratio significantly.

 

   You are also right about CMOS cameras changing the game somewhat. However, the spreadsheet handles both CCD and CMOS sensors equally well.

 

  The comment “mistake here is to assume gathering the 3 channels simultaneously with a Bayer filter is _more_ efficient” should have a qualification added. I agree with you if you are using RGB interference filters with your mono camera. If you're using other filters, that may not be the case. Bayer filters are of the type I classify as “stained glass” filters. There are many non-interference filters of similar types in use today for RGB imaging. Some will be comparable to the Bayer filters that are built onto the OSC sensors. I will try to find the exact data and add a few of these to the spreadsheet filter selection. Use of those filters with a Mono camera will give results more similar to an OSC camera.

 

 

John


Edited by jdupton, 03 November 2019 - 01:34 PM.

  • bobzeq25 likes this

#21 bobzeq25

bobzeq25

    ISS

  • *****
  • Posts: 36,565
  • Joined: 27 Oct 2014

Posted 03 November 2019 - 01:27 PM

On the contrary.

 

Let's go back to the earlier example of a rainbow.  If you image a rainbow using your sharp cut-off filters you'll see adjacent bands of solid red, solid green and solid blue with no continuous transition between them.  Yes, you can process it _any way_ you want but you'll never retrieve the smooth transitions of colour because the colour data is irretrievably damaged.

 

Mark

I disagree.  Information has been lost when a red photon registers on a "green" pixel.  Once it's been converted to an electron, it is no longer possible to distinguish that red photon from a green one.

 

You can warp the data to get a passable result, you cannot retrieve the information.

 

In your example, it is possible to blend the RGB data from interference filters in a way which mimics a continuous transition.  So the use of RGB interference filters can do many things, because information has been preserved.  The sloppy Bayer filter is more limited.

 

It's somewhat similar to the color distortions inherent in a broadband light pollution filter.  Again, you can warp the remaining data to get a passable result, but some color information has been destroyed.  You can see it clearly in star colors on images taken through something like an IDAS LP filter.


Edited by bobzeq25, 03 November 2019 - 01:33 PM.


#22 bobzeq25

bobzeq25

    ISS

  • *****
  • Posts: 36,565
  • Joined: 27 Oct 2014

Posted 03 November 2019 - 01:43 PM


   I suspect from your postings in the past on this general subject that you are a “nebula guy” hence the statement “Astro targets usually have an uneven distribution of wavelengths emitted.”

 

   One final point I’ll add is that as a scientist, I do believe that anything can be modeled with mathematics.

 

 

If you look at my astrobin, you'll see I'm pretty target-agnostic; it's one of my major characteristics.  Galaxies may be more broad-spectrum, but not really as much as terrestrial subjects.

 

As a scientist (comparing credentials would disintegrate into "mine is bigger than yours" <grin>), I know some things cannot be modeled.  The weather next month is an example; there are many more.  A decent book about the topic:

 

https://www.amazon.c...72806762&sr=8-3

 

This guy (I haven't read any of his books) seems to have built an academic career around the issue.

 

https://www.amazon.c...=ntt_dp_epwbk_0

 

Empirical methods may be suspect coming from car enthusiasts (as a car enthusiast who's been on the discussion forums, I know this well), but they have a place in science and, even more so, in imaging.  Things are _often_ ridiculously complicated, with a lot of tradeoffs.

 

Again, your analogy of the car/truck guys talking is more a debating tactic than illuminating.  OSC/mono plus interference filters is not a Ford/Chevy deal.


Edited by bobzeq25, 03 November 2019 - 01:53 PM.


#23 jdupton

jdupton

    Fly Me to the Moon

  • *****
  • topic starter
  • Posts: 5,751
  • Joined: 21 Nov 2010
  • Loc: Central Texas, USA

Posted 03 November 2019 - 02:42 PM

Bortle2,

 

Yes -- in terms of the absolute time needed to collect a certain number of photons. Say an old color (Bayer) sensor had a peak QE of 60% and you needed a 3-hour exposure with it for some target. If... if the 1.17 coefficient is correct, one should be able to accomplish the same task in about 2.56 hours using the monochrome counterpart. Now, a new color sensor with 90% peak QE should be able to do the same in just 2 hours (a similar monochrome in about 1.71 hours, but that's not the point: the exposure-time difference does shrink).

 

Then there is the issue of image acutance, which this article illustrates pretty nicely. The difference in acutance between OSC and monochrome is very similar to that between Bayer and Foveon sensors in conventional cameras. But that's changing as well: the 60 MP IMX455 has so many densely packed pixels that the vast majority of astrophotographers will be shooting it using 2x2 binning (or downsampling in post), thus making sure each pixel has color information and effectively negating one of the benefits of monochrome+filters combos.

 

All that said, I do think that the 1.17 coefficient that the OP came up with is

 

1) a bit optimistic; a stricter approach has to take into account big differences between targets; perhaps a statistical approach, including brightness analysis of a lot of actually-shot R, G, and B frames for different targets (to get a better real-world photon distribution), would be more convincing; and it's IMHO likely to show that the two G sub-pixels in each "super-pixel" are a deficiency,

 

2) needs to be verified using a sensor (or rather sensors) for which the manufacturer provides official QE curves; Sony is notoriously bad at that and, to the best of my knowledge, official QE curves for the IMX183 (which is analyzed in the OP) are not available from Sony; whatever guesswork camera manufacturers then publish themselves is, well, of suspect accuracy, and sometimes is just a mess.

 

Bottom line (IMHO!): the OP is a very good first step, but a lot more has to be done for the results to be truly convincing.

 

   To be very clear, the 1.17x coefficient is not a generic number. It is extremely specific to the case where someone images LRGB with equal exposure time for every filter – for example, 1 hour L, 1 hour R, 1 hour G, and 1 hour B. If you change those ratios at all, then the “coefficient” changes. I have confused a number of readers on that point. (Sorry about that...)

 

   As I showed in a previous reply to Anil a few posts back, if you were to image for 12 hours with 6 hours devoted to L and the other 6 hours evenly split between R, G, and B, then the efficiency ratio changes to 1.60x for a Mono-to-OSC comparison.
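   To make that dependence concrete, here is a small Python sketch (again assuming the example quad rates from Post #2, which are specific to that example) that computes the Mono-to-OSC efficiency ratio for any split of L versus R/G/B time:

    # Example per-quad photon rates per unit time (arbitrary units) from Post #2.
    L_RATE = 896.8
    RGB_RATE = 261.2 + 258.9 + 306.7   # combined R + G + B mono quad rates
    OSC_RATE = 366.8                   # OSC Bayer quad

    def mono_to_osc_ratio(hours_L, hours_each_RGB):
        """Mono/OSC photon ratio for sessions of equal total sky time."""
        total_hours = hours_L + 3 * hours_each_RGB
        mono = hours_L * L_RATE + hours_each_RGB * RGB_RATE
        osc = total_hours * OSC_RATE
        return mono / osc

    print(mono_to_osc_ratio(3, 3))   # equal LRGB split      -> ~1.17x
    print(mono_to_osc_ratio(6, 2))   # 6 h L, 2 h each R/G/B -> ~1.60x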

 

   I will need to run some simulations to see if binning changes the results. I don’t think it will change anything at the sensor level, particularly if you are down-sampling in post-processing. It may change things a little once image scale is taken into account; I’m not sure about that aspect.

 

   You are correct that the spreadsheet does not now take the imaging target attributes into account. I will be adding that to the calculations in a future iteration.

 

   You are spot-on regarding sensor QE plot data. I wish there were more definitive sources. I am at the mercy of what can be found on the 'net. The only (semi-)official Sony data I have is for the smaller IMX249 sensor. Other data was digitized from vendor plots and is only as good as whatever its source was. Some of that data does not even agree for the same sensor.

 

   I had wanted to add the popular Panasonic MN34230 CMOS used in the ASI1600MM (and MC) cameras. However, what I found when searching showed significantly different QE plots for that sensor. I suspected that some were simply plots from other sensors and were being used for illustration only.

 

   If I could find a MN34230 QE plot that had some authenticity to it, I would include that in my sensor selection list in the spreadsheet.

 

   The same goes for other sensors and filters. If you would like to see more sensors or filters added to the spreadsheet, just post a link to the wavelength vs QE / Transmission plot, and I can add it for use in calculations.

 

 

John


Edited by jdupton, 03 November 2019 - 02:48 PM.


#24 sharkmelley

sharkmelley

    Cosmos

  • *****
  • Posts: 8,225
  • Joined: 19 Feb 2013
  • Loc: UK

Posted 03 November 2019 - 02:46 PM

I disagree.  Information has been lost when a red photon registers on a "green" pixel.  Once it's been converted to an electron, it is no longer possible to distinguish that red photon from a green one.

 

You can warp the data to get a passable result, you cannot retrieve the information.

 

In your example, it is possible to blend the RGB data from interference filters in a way which mimics a continuous transition.  So the use of RGB interference filters can do many things, because information has been preserved.  The sloppy Bayer filter is more limited.

 

It's somewhat similar to the color distortions inherent in a broadband light pollution filter.  Again, you can warp the remaining data to get a passable result, but some color information has been destroyed.  You can see it clearly in star colors on images taken through something like an IDAS LP filter.

Your ideas seem to be missing some crucial element of the eye's response to colour and how it is imitated by an OSC camera.  But I'm not quite sure where your misunderstanding is, and it's also a diversion from the main thread, so I'll finish here.

 

Mark


Edited by sharkmelley, 03 November 2019 - 02:49 PM.


#25 jdupton

jdupton

    Fly Me to the Moon

  • *****
  • topic starter
  • Posts: 5,751
  • Joined: 21 Nov 2010
  • Loc: Central Texas, USA

Posted 03 November 2019 - 03:17 PM

Frank,

 

I think there are a few things to keep in mind in these comparisons - and they don't require too much detailed analysis:

  1. RGB, LRGB, and OSC all produce different final results - so it's hard to say which one is best in a quantitative way
  2. OSC is not as lossy as people think - and depending on the spectrum of the object being imaged, the win, in terms of photons gathered, can probably go either way; the two approaches are quite comparable
  3. The duplicate G pixel may not be such a bad thing since the eye is most sensitive there

Another thing is that most mono filters have a gap in them to help block out some light pollution lines - but I think most people have broadband light pollution and those gaps may only serve to lose good signal that could otherwise be gathered.

 

Also - since OSC filter bandwidths intentionally overlap to mimic the eye's response, I wonder if it wouldn't be helpful to have a similar partial overlap of mono filters.

 

Finally - if you want mono and you want to capture many photons - you can do what I do, which is use Sloan filters to cover a much wider region of the spectrum and with no gaps.  It ends up being a form of false color - but if it's photon counts you want, then one way is to expand into the UV and IR.

 

   Thanks for joining in the discussion here. All of this started for me several months back due to some of your comments in a couple of other threads, where you brought up the efficiencies of Bayer filters and the effects of the gaps in the AstroDon e-Series vs i-Series filter sets.

 

   I agree with your comments regarding OSC not always being as lossy as most may think. One motivation for this thread and the modeling spreadsheet was to get more folks to look closely at objective mathematical comparisons rather than taking statements for granted. The “OSC is only ¼ as sensitive since it only uses ¼ of the pixels” misconception and the “external filters pass nearly 100% of their light whereas Bayer filters only pass 30% of their light” misunderstanding of existing data have always irked me to various extents.
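   As a quick numeric illustration of why the ¼ claim fails, here is a tiny Python check using the example quad rates from Post #2 once more (with a different sensor or filter set, the exact figures will of course differ):

    # Example per-quad photon rates per unit time (arbitrary units) from Post #2.
    OSC_RATE = 366.8                          # OSC Bayer quad (R+G+G+B)
    L_RATE = 896.8                            # mono quad through L
    RGB_CYCLE = (261.2 + 258.9 + 306.7) / 3   # mono cycling R, G, B equally

    print(OSC_RATE / L_RATE)     # ~0.41 -- not 0.25 as the myth suggests
    print(OSC_RATE / RGB_CYCLE)  # ~1.33 -- OSC beats RGB-only mono here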

 

   On the subject of gaps in the cumulative filter pass-bands, this can be easily seen with the spreadsheet I posted. Just select the sensor of your choice and then select the AstroDon eRGB or AstroDon iRGB cumulative filter entries, and you can immediately see the relative size of those gaps and what you might be losing out on by picking one series over another.

 

   It is also informative to see that there is overlap in the pass-bands between the blue and green for even the AstroDon interference filter sets. Many who don’t like that the blue and green Bayer filters on an OSC have overlap may have never really noticed the details of some of the other external filter sets they use. “Color Contamination” isn’t limited to Bayer filters.

 

   Plus one to having a set of Sloan Filters. I have not used my AstroDon Sloan set in a while, but when I first started with my monochrome camera, that was the first set I purchased. It was also the reason I had to build my own multi-spectral DIY flat light source. I quickly tired of taking multi-minute-long flats for the u' and i' pass-bands using other methods.

 

 

John


  • psandelle likes this

