M33 - LRGBHa Data for Processing - No Copyright
Posted 14 November 2012 - 02:46 PM
So here it is, free for you to download, process and publish however you see fit!
What is available
4 uncropped master files, calibrated with darks/bias/flats
L - Mean M33 Lum(Registered).FIT size 32,544KB
R - Mean M33 Red(Registered).FIT size 32,544KB
G - Mean M33 Green(Registered).FIT size 32,541KB
B - Mean M33 Blue(Registered).FIT size 32,544KB
All subs were unguided with nice round stars.
Luminance 55 subs 300s binned 1x1 - 4.58 hours
Red 13 subs 300s binned 2x2 - 1.08 hours
Green 15 subs 300s binned 2x2 - 1.25 hours
Blue 15 subs 300s binned 2x2 - 1.25 hours
Ha 15 subs 300s binned 1x1 - 1.25 hours
Dark Master binned 1x1 (50 1800s frames)
Dark Master binned 2x2 (40 300s frames)
Bias Masters 1x1 and 2x2
Flat Masters (30 subs for each filter)
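The per-filter totals above follow directly from sub count × exposure length; a quick Python sanity check (sub counts and exposure lengths taken from the list above):

```python
# Quick sanity check of the integration totals listed above
# (sub counts and exposure lengths are taken from the post).
subs = {
    "Luminance": (55, 300),  # (number of subs, seconds per sub)
    "Red":       (13, 300),
    "Green":     (15, 300),
    "Blue":      (15, 300),
    "Ha":        (15, 300),
}

for name, (count, secs) in subs.items():
    print(f"{name}: {count * secs / 3600:.2f} hours")
# Luminance: 4.58 hours, Red: 1.08 hours, Green/Blue/Ha: 1.25 hours
```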
How to get the data
FTP - FTP://ai.kopacz.com
When you log in, change your working directory to M33.
Posted 14 November 2012 - 08:47 PM
I do not own a mono CCD with filters and this gives me the opportunity to learn how to process long before I acquire the equipment to do so.
Posted 14 November 2012 - 08:59 PM
I believe the trick is to process Luminance and RGB separately and as equally as possible. For example, I use the exact same Dynamic Background Extraction (DBE) data points for both Luminance and RGB so the background would be equally processed.
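DBE is a PixInsight tool, but the underlying idea — fit a smooth surface through hand-placed background samples, then subtract it — can be sketched with plain NumPy. This is only an illustration of the concept (the function name, polynomial order, and sample placement are my own, not DBE's actual algorithm):

```python
import numpy as np

def fit_background(img, sample_xy, order=2):
    """Fit a 2-D polynomial surface through background sample points
    and return the modeled background (illustrative stand-in for DBE)."""
    xs = np.array([p[0] for p in sample_xy], dtype=float)
    ys = np.array([p[1] for p in sample_xy], dtype=float)
    vals = np.array([img[int(y), int(x)] for x, y in sample_xy])

    # Design matrix of polynomial terms x^i * y^j with i + j <= order.
    terms = [(i, j) for i in range(order + 1)
                    for j in range(order + 1) if i + j <= order]
    A = np.column_stack([xs**i * ys**j for i, j in terms])
    coeffs, *_ = np.linalg.lstsq(A, vals, rcond=None)

    # Evaluate the fitted surface over the full frame.
    yy, xx = np.mgrid[0:img.shape[0], 0:img.shape[1]].astype(float)
    return sum(c * xx**i * yy**j for c, (i, j) in zip(coeffs, terms))
```

Using the SAME sample points on the Luminance and on the RGB, as described above, keeps both backgrounds flattened consistently.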
I uploaded my final process of your M33 in FIT format for you to analyze what I did. It's huge at 86MB!!!!
Do not click on any "DOWNLOAD" or "PLAY NOW" buttons. Click only on "Click here to start download from sendspace" link at bottom.
Maybe you should start your own contest. You capture the data, we process your data, you judge the winner and award some prize like give away your mount.
EDIT: I forgot to mention that I didn't include Ha because it was too dim. I think the sub-exposure needs to be significantly increased to as much as 30 minutes.
Posted 14 November 2012 - 10:38 PM
Posted 14 November 2012 - 11:15 PM
I have imaged this galaxy before, back when I was first learning the mount, and the galaxy was "whole" then. It almost looks like a dust lane running through the galaxy from 11 o'clock to 5 o'clock on the left-hand side, but it really looks more like missing data.
I noticed when I put my camera into the focuser that the filter wheel was positioned in a manner that 3/4 of it was covering the chip, with only a crescent-moon-sized opening left.
I just assumed that the FLI filter wheel would sync and place the filters directly over the chip, but after examining this image I am left to wonder: is the edge of the galaxy missing because the filter wheel is not properly synchronized with the opening to the camera?
Is this even possible? I have never seen anything in software to "home" the filter wheel.
Anyone know why it appears the left hand side of the galaxy is missing? And why am I the only one noticing it?
Eagerly awaiting comments.
PS: Nice job guys. I really don't mind sharing my data, so if you enjoy it I will make more available in the future.
Posted 15 November 2012 - 01:05 AM
Did the "chopped off" portion show up in any of the flats?
I wonder if the purplish portion at the bottom of my processed image was caused by misplaced filters as you described.
Posted 15 November 2012 - 01:57 AM
Thanks again for the data, Dave.
Posted 15 November 2012 - 09:31 AM
This was processed in PS CS4. I used Sal Grasso's plugin to intensify the color in the arms and the core, and it clipped a bit when I was working on the gradient, particularly in the reduced JPG image here. Really nice data though; the Luminance was super smooth.
Posted 15 November 2012 - 08:08 PM
Mike, I really like your version of the image. I had never heard of ImagesPlus.
If you guys like this data, you should see what I captured on NGC 1579. The raw data is really awesome looking. I am going to try to process it for the November competition, but if I do a poor job, I'll release it for you guys to process.
I received great news today. I interviewed for an IT consultant volunteer position for the Lowell Observatory and Discovery Channel Telescope and was accepted as a staff member.
I believe I can make a significant contribution to the scientific research team as an IT volunteer.
Today was a great day!
Posted 15 November 2012 - 08:59 PM
I reprocessed your image again. I selected better DBE data points to remove the purplish color at the bottom of the image and other gradients. I also stretched a bit less and applied less of a color saturation boost. I think this is better than my first process, but that's a personal preference. Your Luminance image made a HUGE difference.
Congrats on the volunteer position at Lowell Observatory.
Posted 16 November 2012 - 11:40 AM
Thanks again for sharing your data, and I hope you like this image. Maybe my results will help answer your "chopped off" question. I don't shoot monochrome, but it doesn't sound right for a filter to be hanging over the chip. However, I don't think that is what was causing your concern about missing data. I, like everyone else, noticed a gradient in the left corner that overlapped M33. I almost stopped processing, thinking your filter had interfered with data acquisition. I am glad I continued, because it looks to me like there is a dark dust lane that might be diffusing light on that side. If you try to imagine that lane not being there and look past it, Triangulum looks perfectly round.
Dave’s larger M33
You can click the image then in View select "Full Resolution"
Keep in touch….
Posted 16 November 2012 - 11:53 AM
Your full-resolution version looks quite nice. Thank you for sharing it with me.
I went back and looked at my M33 image from my trip to Colorado last year and I can see that dark dust lane and how it sort of appears to cut off the edge of the Galaxy.
Last Year's M33 Image
I captured much less data then (LRGB 35/20/20/15) and it wasn't as pronounced. I guess since I captured so much more light this time it is far more noticeable, and that's why I thought something must be wrong.
Since I noticed the filter wheel wasn't centered over the chip when I inserted the camera and lens into the focuser for this setup, I naturally and mistakenly assumed that was causing the data to look "chopped off".
I am curious about the gradient, though. I have no light pollution here to speak of, so I can't imagine how a gradient got into the data.
Perhaps I need to post my master flats and have someone take a look at them. Is it possible that I introduced the gradient in calibration and stacking with CCDStack?
I can't imagine how.
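For reference, the standard calibration arithmetic is simple enough to sketch, and a master flat that doesn't match the optical train (dust motes, changed vignetting) can itself imprint a gradient during the division. This is a generic sketch of the math, not CCDStack's exact pipeline, and all names are my own:

```python
import numpy as np

def calibrate(light, master_dark, master_bias, master_flat):
    """Generic CCD calibration: subtract the matching dark (which
    includes bias) from the light, subtract bias from the flat,
    normalize the flat to unity mean, then divide."""
    flat = master_flat - master_bias
    flat = flat / np.mean(flat)          # normalize to unity mean
    return (light - master_dark) / flat
```

If the flat models the real vignetting, the division removes it; if it doesn't, the mismatch shows up as a residual gradient in the calibrated frame.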
Posted 16 November 2012 - 11:58 AM
Great job capturing this galaxy.
Posted 16 November 2012 - 12:50 PM
One day, I will learn to process these images well, but until such time I really don't mind sharing the data with others having less cooperative weather and sky conditions.
I do realize I am most fortunate to have been able to pick up and move to Arizona. Not everyone can do that.
Anyway, your second image looks really good. The first one looked over-processed and saturated, but this recent one looks much better.
Thanks again for sharing your work. Sharing the results, while desirable, is certainly not a requirement for me to provide the data to the public.
Posted 16 November 2012 - 01:05 PM
I over processed the first image because I was trying to remove the bright washed out layer on top of the galaxy. It was not easy to do. It got easier the second time.
Thanks again for sharing your data. It helped me greatly to understand the art of processing LRGB, and now I think I know what to do when capturing my own images once the weather cooperates. We've been having awful weather here in Reno, NV for a long time. For example, when I first got my mono camera with LRGB, Ha, Oiii and Sii filters, it took me almost THREE months to gather data through all 7 filters for one DSO!!!!
Posted 16 November 2012 - 07:45 PM
In reference to Blueman's Horsehead Nebula thread about synthetic (pseudo) Luminance extracted from RGB, I did the same thing with your RGB: I extracted a pseudo Luminance from it. The results were amazingly similar. I did not use your Luminance image to process this image. I found that Synthetic Luminance is not quite as sharp as real Luminance.
This does not mean you should not capture with a Luminance filter. I just found this processing method an interesting alternative. It may be good for me because I have very limited clear sky; if I capture just enough RGB and have no time to capture with the Luminance filter, then Synthetic Luminance may work well.
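For anyone curious, extracting a pseudo luminance amounts to a weighted average of the registered, linear RGB masters. A minimal sketch (the function name and the weights are illustrative; equal or SNR-based weights also work):

```python
import numpy as np

def synthetic_luminance(r, g, b, weights=(0.3, 0.5, 0.2)):
    """Build a pseudo-luminance frame as a weighted average of the
    registered, linear R, G and B masters (illustrative weights)."""
    wr, wg, wb = weights
    return (wr * r + wg * g + wb * b) / (wr + wg + wb)
```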
Posted 16 November 2012 - 10:54 PM
I remember Adam Block showing that one could take RGB data, apply a heavy Gaussian blur to it, and, when it is layered with the crisp 1x1-binned luminance data, produce a beautiful color image.
I don't think the same is true for the reverse using "synthetic" luminance from 2x2 binned RGB data.
First, it doesn't have the same resolution. Secondly, it usually doesn't have the same depth as I typically capture luminance data with longer exposures.
I suspect it's better than no luminance but I cannot imagine it is as good.
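One common way to layer crisp luminance over softer color data is to rescale the RGB by the ratio of real L to a pseudo-L, which preserves the color ratios. A rough sketch (not necessarily Adam Block's exact method; the Gaussian pre-blur of the RGB is omitted for brevity, and the function name is my own):

```python
import numpy as np

def lrgb_combine(lum, r, g, b, eps=1e-6):
    """Replace the color image's brightness with the 1x1-binned
    luminance while keeping the RGB ratios (one common LRGB scheme)."""
    pseudo = (r + g + b) / 3.0
    scale = lum / np.maximum(pseudo, eps)  # avoid divide-by-zero
    return r * scale, g * scale, b * scale
```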
Why didn't you use my luminance data? Just for testing purposes of the "synthetic" theory?
Posted 16 November 2012 - 11:13 PM
Yes, it was for testing purposes, to compare "real" Luminance with "Synthetic" Luminance. To an average person it would be difficult to tell the difference, but when I compared the two by blinking between them, "real" Luminance was the winner.
It's possible that if the original RGB subs were captured binned 1x1, the result would have been different, but you would have to capture significantly more RGB subs to nearly match "real" Luminance. That probably explains why the Synthetic Luminance was not as sharp: the RGB was captured binned 2x2.
Posted 16 November 2012 - 11:45 PM
It's me again. This process includes Ha. The image appears more bluish and the red spots appear redder. Processing is not an exact science. It's mostly eyeballing and estimation. In other words, it's always difficult to re-process the same data and get the same results. For example, it's easy to over- or under-stretch and get different results.
I think the Ha is under-exposed, especially for a galaxy containing very little Ha signal. In the future, I suggest exposing close to 30 minutes for any narrowband filter, especially on a galaxy. The background is so low that it can fall below the read noise, and that makes it difficult to process. I checked the ADU value of the Ha background and the typical value is a whopping low 1 to 30, which is most likely lower than the read noise of your camera. Normally you want the background to dominate the read noise so that processing is easier. The solution is to expose long enough for the background to dominate the read noise of your camera.
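The "background should dominate read noise" rule can be turned into a rough sub-length estimate: sky shot noise grows as sqrt(rate × t), so requiring it to exceed k times the read noise gives t ≥ (k·RN)²/rate. A sketch with purely illustrative numbers (not measured from this camera):

```python
def exposure_to_swamp_read_noise(sky_e_per_sec, read_noise_e, k=2.0):
    """Sub length needed so sky shot noise is at least k times the
    read noise: sqrt(rate * t) >= k * RN  =>  t >= (k * RN)**2 / rate.
    All numbers below are illustrative, not measured values."""
    return (k * read_noise_e) ** 2 / sky_e_per_sec

# e.g. a faint narrowband sky of 0.15 e-/s with 9 e- read noise:
t = exposure_to_swamp_read_noise(0.15, 9.0)
print(f"{t / 60:.0f} minutes per sub")  # prints "36 minutes per sub"
```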
The minimum ADU for ALL of your LRGB and Ha images is -33333, just like your Blue image of the Bubble Nebula, and I think I know why. Subs taken on different nights, or dithered between exposures, don't overlap perfectly, so stacking leaves small black borders around the image. Those borders read -33333, which means the images need to be cropped to exclude them. Bottom line: there was nothing wrong with the Bubble Nebula blue image in the first place. It's just how the software stacked the subs; the resulting black borders made the minimum value read -33333. So that problem is finally solved.
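Cropping those sentinel-valued borders can be automated. A simple NumPy sketch that trims edge rows and columns containing no valid pixels (the -33333 sentinel is taken from the images discussed above; a dithered stack with ragged edges may need a more aggressive crop):

```python
import numpy as np

def crop_borders(img, sentinel=-33333):
    """Trim edge rows/columns that contain no valid pixels, using the
    sentinel value the stacker wrote into the black borders."""
    good = img != sentinel
    rows = np.where(good.any(axis=1))[0]   # rows with any valid pixel
    cols = np.where(good.any(axis=0))[0]   # cols with any valid pixel
    return img[rows[0]:rows[-1] + 1, cols[0]:cols[-1] + 1]
```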
Posted 16 November 2012 - 11:54 PM
Should I crop the images just before I stack them, or after?
Are you going to finish processing the bubble now that the blue data is ok?
I am going to attempt to process my NGC 1579 data this weekend. After tonight, I'll have about 5 hours of luminance data and about 1.5 hours each of RGB.
The data looks great.
Posted 17 November 2012 - 12:10 AM
I will look at your Bubble maybe tomorrow.
Thanks again for providing excellent data.
Just follow Harry's great PixInsight tutorials when processing NGC 1579. That's how I learned it. PixInsight also has a very good tutorial on LRGB processing at their web site. Harry's tutorial has two or three different LRGB processing tutorials and all of them are done a little differently. So it depends on the DSO you capture.
Have you processed M33 yet?
Posted 17 November 2012 - 01:00 AM
I have tried to follow both Harry's tutorials and the PixInsight tutorial, and I always seem to get lost doing Histogram Transformation when I am supposed to drop the STF onto the HT tool. My histogram in the bottom window never looks like theirs. Instead of a sharp peak on the left-hand edge, mine is always flat, like a soft bump in the road.
I can't figure it out and quite frankly, I'm never going to be able to process images well until someone sits down with me and shows me how to do it.
I am too left-brained for anything artistic. I see everything in grayscale, binary, 1's and 0's, black and white.
I have no artistic talent whatsoever.
I am extremely technical and I am an excellent troubleshooter. I even have experience writing software and designing hardware, but I can't draw, and I can't envision art or colors well in my mind. My wife thinks it's kind of funny.
She keeps telling me... just do what you do honey... get the data and have fun!
Posted 17 November 2012 - 01:13 AM
I never drop the STF onto the HT tool. I don't recall seeing Harry's videos do that, but I have seen the PixInsight video do it. When I am ready to use HT, I reset the STF so that the image is no longer "stretched", then use the HT tool to stretch the image. I can't explain it well in words, but I use HT iteratively: stretch a little, apply it to the image, and repeat until the image looks reasonably good.
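For what it's worth, the stretch HT applies is the midtones transfer function, and the "stretch a little, apply, repeat" loop is easy to sketch. The function names are mine, and m = 0.25 with three passes are just illustrative starting values:

```python
import numpy as np

def mtf(x, m):
    """PixInsight-style midtones transfer function:
    maps 0 -> 0, 1 -> 1, and the midtones balance m -> 0.5."""
    return ((m - 1.0) * x) / ((2.0 * m - 1.0) * x - m)

def iterative_stretch(img, m=0.25, passes=3):
    """Stretch a little, apply, repeat -- mirroring the iterative HT
    workflow described above (m and passes are illustrative)."""
    out = np.clip(img, 0.0, 1.0)
    for _ in range(passes):
        out = mtf(out, m)
    return out
```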
I am also a software/hardware engineer, but never an artist. It took me a while to figure out how image processing works. Once you get the hang of it, it gets pretty easy.