Could Jpegs be better than raw for astroimaging?

48 replies to this topic

#1 ianmorison43

ianmorison43

    Lift Off

  • -----
  • topic starter
  • Posts: 14
  • Joined: 30 May 2017

Posted 03 December 2019 - 09:34 AM

This appears a stupid question, as we all know that a raw file captures more information than a JPEG. Yes, but...

 

If the full exposure is made up of many (say more than 20) short exposures, then the problems with JPEGs are greatly reduced. If there is noise in the output of the sensor (and there will be), the effect of averaging many frames increases the effective bit depth.

 

Consider the possible 8-bit values of 220 and 221, and let's say the real value of a pixel level is 220.3 (requiring more than 8 bits to digitize). If there were no noise in the pixel output one would always get the value 220, just 8 bits. If, however, there is noise in the output of the sensor, the values saved in the JPEG file will spread, certainly over 220 and 221 and most likely 219 and 222 as well. The noise in the signal is producing this scatter. BUT, as the real value is nearer to 220, the average of many values will be closer to 220 than 221 and probably very close to 220.3. The effect is that the averaged result (in 16-bit or wider accumulators within Deep Sky Stacker or Sequator) has more effective bits, and the result in terms of bit depth may not be significantly less than if raw data were used.
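As a rough illustration of this averaging argument, here is a minimal Python sketch; the 50 frames match the session described below, while the 1.5 ADU noise level is an assumption chosen purely for illustration.

import numpy as np

# Simulate 8-bit quantization of a "true" pixel level of 220.3 in the
# presence of read noise, then average the frames.
rng = np.random.default_rng(0)
true_value = 220.3          # real pixel level, between two 8-bit codes
read_noise = 1.5            # assumed noise (in 8-bit ADU) that dithers the quantizer
n_frames = 50

frames = np.clip(np.round(true_value + read_noise * rng.normal(size=n_frames)), 0, 255)
print(frames[:10])          # each frame is an integer value in a narrow range around 220
print(frames.mean())        # the mean lands close to 220.3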

 

Averaging many JPEG frames also appears to reduce the artifacts that JPEG compression produces; otherwise I cannot see how I could get such good results using JPEG frames alone.

 

I have taken 50 × 25-second frames of the Hyades and Pleiades clusters in both JPEG and raw with my Sony A5000 mirrorless camera and a superb Zeiss 45mm prime lens, and processed them three ways: first using just the JPEG frames, then the raw frames directly, and finally using TIFFs derived from the raw frames in Adobe Lightroom. The stacked results had the light pollution removed and were identically stretched in Adobe Photoshop.

 

I can only say that the Jpeg result looked far more appealing and, in particular, showed up the nebulosity surrounding the cluster far better.

 

I believe that in producing the JPEG files from the raw data produced by the sensor, many cameras carry out some stretching of the data, and that this might have helped produce the better result. It could be that applying some stretching to each TIFF file derived from the raw files might achieve a better result from the raw data.

 

One should always take both Jpeg and raw data when astroimaging if only to be able to quickly scan through all the frames and perhaps eliminate those that have suffered the passage of a plane or satellite.  

 

My suggestion is that one should also process the Jpeg frames  - you might be surprised.

 

Regards,

 

Ian   (I know it appears that I am a newbie, but I rarely post on forums and have written two books on astroimaging.)

 

Hyades+Pleaides-small.jpg  

 


  • Mark Bailey and happylimpet like this

#2 happylimpet

happylimpet

    Skylab

  • *****
  • Posts: 4,095
  • Joined: 29 Sep 2013
  • Loc: Southampton, UK

Posted 03 December 2019 - 10:06 AM

Interesting, and I have to agree with many of your points after a quick read... you also went on to mention most of the caveats I was thinking of, so I don't have much to add! But an interesting thought.



#3 sg6

sg6

    Fly Me to the Moon

  • *****
  • Posts: 6,363
  • Joined: 14 Feb 2010
  • Loc: Norfolk, UK.

Posted 03 December 2019 - 10:35 AM

I would say they may not be better than RAW, but after a point there is likely little in it. Throw in that many of us sort of take "snaps" of the sky, and JPEGs are likely a lot easier. The serious will, I expect, remain with raw data.

 

I noticed that DSS will stack JPEGs, and I have told people to start with JPEGs as much to get familiar with DSS as anything, then try RAW once they have some knowledge.

 

From the description, one approach I often suggest (40-second exposures with a 20-second wait, for 60 exposures in total) fits in well with a load of JPEGs. This could be fun now that the smaller mount is up and running, well, tracking.

 

Might be a project for a university student (MSc or PhD, I guess) to investigate and analyse. It would make a good project combining imaging, data processing and probably a few other subjects. I would like to see the results/findings.


  • Mark Bailey likes this

#4 Alen K

Alen K

    Apollo

  • -----
  • Posts: 1,130
  • Joined: 25 Nov 2009

Posted 03 December 2019 - 10:49 AM

I believe that in producing the JPEG files from the raw data produced by the sensor, many cameras carry out some stretching of the data, and that this might have helped produce the better result. It could be that applying some stretching to each TIFF file derived from the raw files might achieve a better result from the raw data.

In the text I quoted above, I think you have hit on it. The in-camera JPEG had a screen-transfer (tone) curve already applied, which you further stretched in Photoshop. The TIFFs from Lightroom probably also had a tone curve applied by Lightroom, but it may not have been identical (likely wasn't, given your results). The exact tone curves used, combined with the curve you used for further stretching, can make an enormous difference in the visibility of low-level details such as, in this case, the reflection nebulae around the stars in M45. Removing light-pollution gradients before or after the application of a tone curve can also make a difference. In short, there are too many variables here to make this a definitive experiment.
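As a toy illustration of how much the choice of tone curve matters (the two gamma values below are assumptions, not the actual camera or Lightroom curves): apply two different curves to the same faint linear level and then the identical further stretch, and the faint detail ends up at visibly different output levels.

import numpy as np

faint = 0.01                                    # faint nebulosity, linear scale 0..1
camera_curve = lambda x: x ** (1 / 2.2)         # assumed simple gamma for the in-camera JPEG
lightroom_curve = lambda x: x ** (1 / 1.8)      # assumed different curve for the Lightroom TIFF
further_stretch = lambda x: np.clip(2.0 * x, 0, 1)  # identical follow-up stretch for both

print(further_stretch(camera_curve(faint)))     # ~0.25
print(further_stretch(lightroom_curve(faint)))  # ~0.15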

 

Regarding saving JPEGs as well as raw, I used to do that if only for preview purposes. But I have found that the image viewer I use for that (Irfanview 64 bit) is seemingly just as fast at rendering a screen image from raw as it is from JPEG. I'm sure in reality it is not quite as fast but I can't really tell the difference as I cycle through a set of images. 

 

Regarding the astroimaging books, is this one of them?

https://www.amazon.c...s/dp/1107619602

If so, what is the other one?

 

I found a book on astronomy and cosmology by Ian Morison. Is that yours too or is that by another Ian Morison? 

https://www.betterwo...a-9780470033333


Edited by Alen K, 03 December 2019 - 01:24 PM.

  • Mark Bailey likes this

#5 bobzeq25

bobzeq25

    Hubble

  • *****
  • Posts: 17,419
  • Joined: 27 Oct 2014

Posted 03 December 2019 - 10:50 AM

I wouldn't call that "stupid", but it's wrong.  jpgs are not better than RAW.  It's pretty simple.

 

As you note, jpgs are "stretched", i.e. not linear. Some processing is better done on linear data, before stretching has distorted it; gradient reduction is a notable example. And it's better to have stretching under your total control. Yes, you can mess it up. You can also do it significantly better than your camera, which was designed for very different terrestrial data.

 

Again, as you note, jpgs are also compressed in a way that destroys some information.  In astrophotography, with its horrible signal to noise ratio, you never have any of that to spare.  Much effort (such as long total imaging time) is devoted to getting more.

 

jpgs are "8 bit" data. RAW is generally 12 or 14. PixInsight even converts those, and 16-bit CCD FITS files, to 32 bits for more accuracy in processing. The people who developed that program are very far from stupid or unknowledgeable. The more bits you start with, the better; 8 is way too few. The argument you make about averaging noisy data is wrong. The average will _always_ be at least as accurate if you start with more bits; there's _no_ advantage in starting with less. Simple math. Adding extra zeros on the end (which is what PixInsight does to the data) has no disadvantage, and the average will occupy them (somewhat). Truncating data to fewer bits is disadvantageous.
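A quick numerical check of this point, in Python (the 0.2 ADU noise level, the signal value and the 50 frames are illustrative assumptions): when the per-frame noise is smaller than one 8-bit step, averaging cannot recover the precision lost to 8-bit quantization, while a 14-bit quantization of the same frames can; with larger noise the gap narrows, which is the dithering effect discussed earlier in the thread.

import numpy as np

rng = np.random.default_rng(1)
true_adu = 220.3            # true pixel level on the 8-bit (0..255) scale
noise_adu = 0.2             # assumed noise, smaller than one 8-bit step
n_frames = 50

samples = true_adu + noise_adu * rng.normal(size=n_frames)

def quantize(x_adu, bits):
    step = 255.0 / (2 ** bits - 1)              # quantization step on the 0..255 scale
    return np.round(x_adu / step) * step

err_8  = abs(quantize(samples, 8).mean()  - true_adu)
err_14 = abs(quantize(samples, 14).mean() - true_adu)
print(err_8, err_14)        # the 8-bit mean stays biased toward 220; the 14-bit mean does not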

 

Bottom line.  You can use jpgs, get images.  AP is powerful and resilient.  People have even tried (and failed) to promote methods that use them.  But there's no way they're "better" which is why RAW is still completely dominant.

 

See this book for a good discussion.

 

https://www.amazon.c...d/dp/0999470906

 

This website is decent.

 

https://x-equals.com...th-jpeg-vs-raw/

 

There is much more information available on this.


Edited by bobzeq25, 03 December 2019 - 11:12 AM.

  • Jon Isaacs, guyroch and harbinjer like this

#6 sg6

sg6

    Fly Me to the Moon

  • *****
  • Posts: 6,363
  • Joined: 14 Feb 2010
  • Loc: Norfolk, UK.

Posted 03 December 2019 - 11:48 AM

Can I suggest that the title is read as:

Could Jpegs be easier than raw for astroimaging?

rather then the title as it stands of:

Could Jpegs be better than raw for astroimaging?

 

Or did Professor Morison intend to create ripples?

Astronomers can be conservative.

 

Any papers on this idea?

Preferably nice simple ones, no more than words of a few syllables, please.


  • Mark Bailey likes this

#7 ianmorison43

ianmorison43

    Lift Off

  • -----
  • topic starter
  • Posts: 14
  • Joined: 30 May 2017

Posted 03 December 2019 - 12:48 PM

Hi again,

Here is the comparison of the Pleiades cluster at 100%, with the JPEG version above.

 

Yes, they are my books and there are two other recent ones.  The relevant one includes the words Art and Astrophotography and the other Journey and Universe.

 

Re the title, I do think that in this case stacking the JPEGs gave a better result, so, as I said, it might well be worth stacking both raw and JPEG frames to see what results they give. In this case the JPEG result was better than raw, so I stand by the title of the post. Searching quite hard, I have not found anyone else who has made this point, so one could say that it is not sensible. But is it? It takes no time to additionally process the JPEG frames, so why not?

 

Incidentally, the Sony A5000, which can be bought second hand for less than $200 or £150, makes an excellent astrocamera. It has a flip-up screen that really helps when imaging near the vertical. It was mounted on a 'Nanotracker', about the smallest sidereal tracker that it is possible to buy. Along with a lightweight tripod, this is all I take to do imaging in the dark-sky reserve on South Island, New Zealand (my rucksack is always taken apart at security as I leave the UK). With adapters, one can use legacy primes, such as the Contax G Zeiss 45mm lens that was used for this image.

 

Thanks for reading and replying.

 

Cheers,

Ian

Attached Thumbnails

  • Comparison.jpg


#8 Alen K

Alen K

    Apollo

  • -----
  • Posts: 1,130
  • Joined: 25 Nov 2009

Posted 03 December 2019 - 01:47 PM

This be you then: https://en.m.wikiped...iki/Ian_Morison

 

I don't have time to try it (besides which, the images above are low-res JPEGs), but I think that further stretching of the original of the second image in the post immediately above, possibly with a black-point adjustment, could easily yield a similar amount of nebulosity and a similar limiting magnitude as the first image, quite possibly with less noise or posterization (I detect the latter in the first image). There is no theoretical reason I can think of that JPEG data would yield a better image than could be had with the original uncompressed data. The fact that you got such a result, I think, says more about the need for further (or different) processing steps on the original data. Lots of things are done to make an in-camera JPEG that you may not have done to the raw data.



#9 ianmorison43

ianmorison43

    Lift Off

  • -----
  • topic starter
  • Posts: 14
  • Joined: 30 May 2017

Posted 03 December 2019 - 02:06 PM

Hi again.

 

Yes, 'tis I.

 

Appended is the raw image of the Pleiades at 100% stretched to the ultimate. What more could I do?  If I raise the black point I get pretty much what was seen in the earlier post.

 

Cheers,

 

Ian

 

Tiff stretch.jpg



#10 Dwight J

Dwight J

    Gemini

  • *****
  • Posts: 3,373
  • Joined: 14 May 2009
  • Loc: Lethbridge, Alberta, Canada

Posted 03 December 2019 - 03:40 PM

In my experience, I can't process an image as far if it is a JPEG before artifacts start showing up. The only times I have taken JPEGs have been for time lapses, or when I forgot to select raw before imaging. Doing EAA with my DSLR I sometimes use JPEGs, since I usually don't save the output, and the smaller files help live stacking.



#11 Kendahl

Kendahl

    Apollo

  • *****
  • Posts: 1,468
  • Joined: 02 Feb 2013
  • Loc: Pinedale, Arizona

Posted 03 December 2019 - 06:16 PM

The problem with jpegs is that they are non-linear. They have already been stretched. That means darks, flats and gradient removal won't work properly. Then, there's lossy compression to degrade the image.


  • bobzeq25 likes this

#12 Alen K

Alen K

    Apollo

  • -----
  • Posts: 1,130
  • Joined: 25 Nov 2009

Posted 03 December 2019 - 07:26 PM

Appended is the raw image of the Pleiades at 100% stretched to the ultimate. What more could I do?  If I raise the black point I get pretty much what was seen in the earlier post.

I mentioned black level because it is not unusual to need to adjust it after a severe contrast stretch. When you say you stretched "to the ultimate," what do you mean? What did you do in what software? Ultimate means different things to different people. Many people (myself included) do stretching iteratively for best results. Some people (myself included) use color-preserving stretches. Personally, I think an "ultimate" stretch would use both.
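For readers unfamiliar with the term, here is a minimal sketch of a colour-preserving stretch (my own illustration, not Alen K's workflow; the arcsinh curve and the strength value are assumptions): the luminance is stretched and all three channels are scaled by the same per-pixel factor, so the colour ratios survive.

import numpy as np

def color_preserving_asinh(rgb, strength=50.0):
    """rgb: float array of shape (..., 3) with linear values in 0..1."""
    lum = rgb.mean(axis=-1, keepdims=True)              # simple luminance proxy
    stretched = np.arcsinh(strength * lum) / np.arcsinh(strength)
    # One common scale factor per pixel keeps the R:G:B ratios intact.
    scale = np.divide(stretched, lum, out=np.zeros_like(lum), where=lum > 0)
    return np.clip(rgb * scale, 0, 1)

pixel = np.array([[0.002, 0.004, 0.008]])               # faint pixel, linear RGB
print(color_preserving_asinh(pixel))                    # brighter, same R:G:B ratio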



#13 17.5Dob

17.5Dob

    Fly Me to the Moon

  • ****-
  • Posts: 5,424
  • Joined: 21 Mar 2013
  • Loc: Colorado,USA

Posted 03 December 2019 - 07:27 PM

Can a stack of JPEGs be better than the same stack of RAWs....NO....

You are correct that in stacking "enough" JPEGs you can recover some bit depth... BUT it works both ways. You might only start with 14-bit RAWs, but the bit depth likewise increases as you stack, so it is virtually impossible for an equal number of JPEGs to even come near a stack of 14-bit RAWs.

Regarding the need to always shoot RAW + JPEG: in the past 16+ years of shooting digital photos of any kind, I have only shot JPEGs once, for a timelapse video I was putting together. There is no difference in viewing speed in any photo editing program I've ever used between a RAW and a JPEG, as a RAW file normally carries an embedded JPEG just for viewing; there is no need to convert a RAW file to view it.

(And by the way, you don't need to prescan your photos for satellites or airplanes to cull them anyway. That's what K-S averaging is for...)


Edited by 17.5Dob, 03 December 2019 - 08:15 PM.

  • Alen K, bobzeq25 and DubbelDerp like this

#14 Alen K

Alen K

    Apollo

  • -----
  • Posts: 1,130
  • Joined: 25 Nov 2009

Posted 03 December 2019 - 08:54 PM

There is no difference in viewing speed in any photo editing program I've ever used between a RAW and a JPEG, as a RAW file normally carries an embedded JPEG just for viewing; there is no need to convert a RAW file to view it.

I had forgotten that. It explains why the photo viewer I use seemed to be just as fast with raw files as JPEG files. It should be said that the embedded JPEG is typically not as high a quality (more compressed) as a separately saved in-camera JPEG. So if one were to process an image starting from a JPEG, the one in the raw file (which can be extracted if desired) would be a poor choice.



#15 17.5Dob

17.5Dob

    Fly Me to the Moon

  • ****-
  • Posts: 5,424
  • Joined: 21 Mar 2013
  • Loc: Colorado,USA

Posted 03 December 2019 - 11:27 PM

I had forgotten that. It explains why the photo viewer I use seemed to be just as fast with raw files as JPEG files. It should be said that the embedded JPEG is typically not as high a quality (more compressed) as a separately saved in-camera JPEG. So if one were to process an image starting from a JPEG, the one in the raw file (which can be extracted if desired) would be a poor choice.

Nikons use a full-size "Quality 7" compressed JPEG as their embedded thumbnail.

But who's kidding who...compressed is compressed and 8 bits is not 14 bits....


  • bobzeq25 likes this

#16 sharkmelley

sharkmelley

    Mercury-Atlas

  • *****
  • Posts: 2,535
  • Joined: 19 Feb 2013

Posted 04 December 2019 - 02:09 AM

Hi Ian,

 

It's certainly an interesting idea to stack high-quality 8-bit JPGs, but it should never be better than stacking the TIFF files from Lightroom. I don't understand why you should see any real difference between the two approaches because, all things being equal, the Lightroom TIFF should be very similar to the JPG.

 

You are correct in stating there is no information loss in the 8-bit JPG, assuming there is sufficient noise to dither the quantization steps in each colour channel.  This is a condition that generally holds true for long exposure DSO imaging.

 

My only criticism of using JPGs and Lightroom TIFFs is that the data representation is non-linear. When you subtract light pollution from a stacked (or unstacked) non-linear image, it causes intense colour saturation and/or colour biases in the regions of low signal.
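A toy numeric example of this effect (the signal and light-pollution levels below, and the simple gamma curve standing in for the JPEG/TIFF tone curve, are assumptions for illustration): subtracting the sky level in linear space leaves the faint colour ratios intact, while the same subtraction after the tone curve skews them.

import numpy as np

gamma = lambda x: x ** (1 / 2.2)             # stand-in for the non-linear tone curve

signal = np.array([0.002, 0.004, 0.008])     # faint object, linear RGB
lp     = np.array([0.030, 0.025, 0.020])     # assumed light-pollution level per channel

linear_sub = (signal + lp) - lp              # subtract in linear space: recovers the signal exactly
nonlin_sub = gamma(signal + lp) - gamma(lp)  # subtract after the tone curve

print(linear_sub / linear_sub[0])            # [1. 2. 4.]  original colour ratios
print(nonlin_sub / nonlin_sub[0])            # roughly [1, 2.2, 4.6]: ratios skewed, colour exaggerated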

 

That being said, the advantage of using JPGs and Lightroom TIFFs is that they contain natural looking colour because the correct colour space conversions for the camera have already been performed.  It is far more difficult to apply the relevant colour space conversions to achieve the same natural colour when using a processing sequence based on stacking linear raw data.

 

Mark


Edited by sharkmelley, 04 December 2019 - 02:17 AM.

  • ChristopherBeere likes this

#17 ianmorison43

ianmorison43

    Lift Off

  • -----
  • topic starter
  • Posts: 14
  • Joined: 30 May 2017

Posted 04 December 2019 - 03:59 AM

Hi, my final thoughts on this - and thank you for all the interesting replies!

 

Of course the raw data must be better, as seen in the comparison of the top two crops of the image. The light pollution has not been removed (my technique for doing this may be of interest and follows at the end of this post).

 

Following my thought that the better Jpeg result might be due to the tone curve applied in the Jpeg conversion in camera, I converted the raw files in Lightroom with a tone curve added  to lift up the fainter parts of the image.  When stacked and stretched, more stars were apparent in the field  but the nebulosity around the Pleiades (as seen in the lowest 100% crop) was still not shown as well as the result obtained when simply stacking the Jpegs.

 

It must be possible to produce a better image using the raw data  - but I do not know how to do it.

 

I would point out that the title of this post had the word 'Could' and was followed by a '?'. I only posted it because I was really surprised at the results, which I hope you found interesting.

 

I do have some thoughts (not controversial) on the best ISO to use when imaging so might try to join in a post about this.

 

My regards to you all, and I trust that you have more cloudless nights than we appear to be having in the northern UK.

 

Cheers,

 

Ian

 

Light pollution removal in Photoshop.

 

Duplicate the image to make two layers.

Apply the 'Dust and Scratches' filter with a radius of, say, 35 pixels. The filter thinks the stars are dust and removes virtually all of them. In this case, Aldebaran and the Pleiades are still visible, though blurred.

Clone out these areas from as close as one can and then apply a Gaussian Blur of, say, 35 pixels to give a very smooth representation of the light pollution.

Flatten the two layers using the 'Difference' blending mode. 
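A rough Python analogue of these steps (a sketch under simple assumptions, not Ian's exact Photoshop workflow: scipy's median filter stands in for 'Dust and Scratches', and the manual cloning of residual bright areas is not automated):

import numpy as np
from scipy import ndimage

def remove_light_pollution(channel, star_radius=35, blur_sigma=35):
    """channel: 2-D float array (one colour channel); apply per channel for RGB."""
    # 'Dust and Scratches' stand-in: a large median filter erases the stars.
    background = ndimage.median_filter(channel, size=star_radius)
    # Gaussian blur leaves only the smooth light-pollution gradient.
    background = ndimage.gaussian_filter(background, sigma=blur_sigma)
    # 'Difference' blend: subtract the background model from the original.
    return np.clip(channel - background, 0, None)

# Usage on a stacked RGB image held as an (H, W, 3) float array:
# cleaned = np.dstack([remove_light_pollution(stack[..., c]) for c in range(3)])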

 

Final.jpg

 



#18 2ghouls

2ghouls

    Viking 1

  • *****
  • Posts: 858
  • Joined: 19 Sep 2016

Posted 04 December 2019 - 07:58 AM

I think the reason your final raw stack looks worse than your jpeg stack is the way in which you stretched it before applying your light pollution removal step. Note that they don’t look the same at all. For a better comparison, you shouldn’t use a custom stretch on the raw stack. Would you be interested in sharing your data (the original sub exposures and calibration frames if any)? I’d be interested in running a few experiments myself. If you’d rather not, I will try to remember to shoot both raw and jpeg next time I’m using a DSLR.

#19 Mike Spooner

Mike Spooner

    Vendor (mirrors)

  • *****
  • Vendors
  • Posts: 625
  • Joined: 06 Aug 2010

Posted 04 December 2019 - 10:04 AM

First let me say I know next to nothing about Astro processing but have been following several threads as I want to take and process some images for the fun of it. So my crazy question is: can jpegs and raw be stacked/combined at some point in the processing? 

 

Mike Spooner



#20 bobzeq25

bobzeq25

    Hubble

  • *****
  • Posts: 17,419
  • Joined: 27 Oct 2014

Posted 04 December 2019 - 10:01 PM

First let me say I know next to nothing about Astro processing but have been following several threads as I want to take and process some images for the fun of it. So my crazy question is: can jpegs and raw be stacked/combined at some point in the processing? 

 

Mike Spooner

Sure you can do anything.  But, there's absolutely _no_ advantage to using jpgs, period.

 

Using RAW exclusively is both easier and better in all regards, no matter how modest your goals are.

 

My strong recommendation for software for you is Astro Pixel Processor. It both stacks and processes, and is relatively easy to use. Beats the heck out of using two programs and transitioning between them.


  • Mike Spooner likes this

#21 bobzeq25

bobzeq25

    Hubble

  • *****
  • Posts: 17,419
  • Joined: 27 Oct 2014

Posted 04 December 2019 - 10:06 PM

Light pollution removal in Photoshop.

Duplicate the image to make two layers. Apply the 'Dust and Scratches' filter with a radius of, say, 35 pixels; the filter thinks the stars are dust and removes virtually all of them. Clone out the remaining bright areas, then apply a Gaussian Blur of, say, 35 pixels to give a very smooth representation of the light pollution. Flatten the two layers using the 'Difference' blending mode.

Or, someone could get the GradientXTerminator plugin and reduce light pollution gradients more easily and more effectively. It's a good bit more sophisticated behind the scenes. Well worth the small amount of money (I have no commercial connection; I use PixInsight myself).

 

http://www.rc-astro....entXTerminator/

 

Best for new folks is Astro Pixel Processor, as recommended above, which reduces gradients in the linear phase, before stretching.  Stretching distorts the data in a way that makes gradient reduction less effective (because it makes it harder to characterize the gradients).

 

More advanced folks use ABE and/or DBE in PixInsight, in the linear phase.  The RGB images below (taken with a one shot color CMOS camera, similar to a DSLR except it's cooled) show the effectiveness of ABE, which is very sophisticated in removing gradients while preserving dim data.  No other processing has been done, except a simple stretch to each, necessary to visualize the data.

 

APP is slightly better on some images, slightly worse on others.

 

Final version of the image here.  H alpha was added later in the processing.

 

https://www.astrobin...4117/G/?nc=user

 

A lot of very smart people have been doing this for a long time. If ever there was something where following the "wisdom of crowds" makes sense, it's AP of DSOs.

 

ABE exampl before.jpg

 

ABE example after.jpg


Edited by bobzeq25, 04 December 2019 - 10:17 PM.


#22 17.5Dob

17.5Dob

    Fly Me to the Moon

  • ****-
  • Posts: 5,424
  • Joined: 21 Mar 2013
  • Loc: Colorado,USA

Posted 04 December 2019 - 11:07 PM

Just to tag along to Bob's post, there has been no processing done to these RAW image stacks. This is a screen grab from my monitor using PI's basic "screen transfer function", which helps you visualize what you might have lurking in the shadows. There has been no color correction, noise reduction, or tone curve applied. This is straight RAW data.


25 RAW subs, ISO 200, f/6.5

49120503277_8af22858a2_b.jpg

20 RAW subs, ISO 200, f/6.5

49120497122_683f30a94d_b.jpg

25 RAW subs, ISO 200, f/6.5

49119807248_b4c1259b93_b.jpg
 


Edited by 17.5Dob, 04 December 2019 - 11:09 PM.


#23 DubbelDerp

DubbelDerp

    Ranger 4

  • *****
  • Posts: 307
  • Joined: 14 Sep 2018
  • Loc: Upper Peninsula of Michigan

Posted 05 December 2019 - 09:59 AM

I think you need to be really careful using a despeckle or dust/scratches filter in PS or GIMP as a way to remove light pollution. It might work ok on a starfield, but on an area with a lot of nebulosity, it's going to blast out the nebulosity thinking it's light pollution. I'm neither advanced nor sophisticated, and find Astro Pixel Processor to be invaluable for calibration, stacking, gradient removal, stretching, and initial pre-processing. With good enough data, that's all that's necessary for the most part.



#24 asanmax

asanmax

    Vostok 1

  • -----
  • Posts: 155
  • Joined: 17 Sep 2018

Posted 05 December 2019 - 12:33 PM


I think the topic starter's initial image is showing overstretched JPEG artifacts. I am personally not opposed to the theory of JPEG stacking being better than RAW stacking, but I would rather test this with a good lens or telescope with superb image quality. If you made the individual JPEG and RAW frames available, I'm sure some of us would be able to process them and see if the theory holds.

I think there is a better way of removing the city light pollution: neutralize the background of the stacked image in Photoshop to the following values: Red: 25, Green: 25, Blue: 25.
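A minimal sketch of that background-neutralization idea (my interpretation, assuming an 8-bit RGB stack and using the per-channel median as the sky estimate, which is only reasonable when most of the frame is background):

import numpy as np

def neutralize_background(image, target=25):
    """image: uint8 array of shape (H, W, 3), the stacked result."""
    img = image.astype(np.float32)
    # Per-channel median as a crude estimate of the sky background level.
    background = np.median(img.reshape(-1, 3), axis=0)
    # Shift each channel so the background lands at the target value.
    shifted = img - background + target
    return np.clip(shifted, 0, 255).astype(np.uint8)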



#25 bobzeq25

bobzeq25

    Hubble

  • *****
  • Posts: 17,419
  • Joined: 27 Oct 2014

Posted 05 December 2019 - 01:01 PM

I think the topic starter's initial image is showing overstretched JPEG artifacts. I am personally not opposed to the theory of JPEG stacking being better than RAW stacking, but I would rather test this with a good lens or telescope with superb image quality. If you made the individual JPEG and RAW frames available, I'm sure some of us would be able to process them and see if the theory holds.

I think there is a better way of removing the city light pollution: neutralize the background of the stacked image in Photoshop to the following values: Red: 25, Green: 25, Blue: 25.

Hope this isn't tiresome.  The intent is simply to be informative. 

 

The best way to reduce the visibility of background light pollution is a sophisticated gradient-reduction tool. These tools systematically analyze the data from the specific image and work out how best to remove the gradient. They're nowhere near as simple as either of the methods described above.

 

In DBE in PixInsight, sampling points are placed at precise locations around the image, based on the image itself. You can do that automatically and adjust the parameters of the analysis, do it manually, or use a combination.

 

With StarTools, WIPE does it mostly automatically; there are a few simple parameters you can tweak.


  • sharkmelley likes this

