
CNers have asked about a donation box for Cloudy Nights over the years, so here you go. Donation is not required by any means, so please enjoy your stay.


Sharpcap 4.0 released

49 replies to this topic

#26 Clouzot

Clouzot

    Viking 1

  • -----
  • Posts: 518
  • Joined: 09 Jul 2018
  • Loc: French Riviera

Posted 29 July 2021 - 04:39 AM

@GSBass incidentally, I tried it yesterday on Jupiter. Quite handy to have that FireCapture-like feature, but there's a side effect: it throws AstroSurface's planet disk detection off, so I couldn't detect, align, and stack.

#27 alphatripleplus

alphatripleplus

    World Controller

  • *****
  • Moderators
  • Posts: 125,366
  • Joined: 09 Mar 2012
  • Loc: Georgia

Posted 29 July 2021 - 07:35 AM

I was using SharpCap Pro 4.0 last night, and it was the first time I really played with it. I had ignored the beta version and was previously using v3.2. I found a few welcome improvements: 1) the multi-star focusing tool worked well (I could never get it to work in v3.2); 2) in the Live Stack Alignment tab there is now a checkbox for "Suppress hot pixels", which did not exist in the 3.2 version I was using. Still checking out other things, but so far so good.

 

In addition, live stacking in both v3.2 and v4.0 is more robust with SharpCap than with ASILive. I've documented examples (usually very rich star fields) where ASILive will fail to live stack with the message "No matching triangles" while SharpCap has no such problems, but that is a topic for another thread.


  • roelb and GSBass like this

#28 GSBass

GSBass

    Mercury-Atlas

  • *****
  • topic starter
  • Posts: 2,554
  • Joined: 21 May 2020
  • Loc: South Carolina

Posted 29 July 2021 - 07:40 AM

I noticed that too, so I used the surface detection instead and drew a box around Jupiter; it worked. But in the future I'll most likely just turn it on to focus and then turn it back off to capture.




#29 MarMax

MarMax

    Surveyor 1

  • *****
  • Posts: 1,547
  • Joined: 27 May 2020
  • Loc: Los Angeles

Posted 29 July 2021 - 02:56 PM

This may not be the place for a dumb new-user question, but I was using V4 Pro last night and just could not figure out Sensor Analysis or Live Stacking. It appears to support the Neptune-CII camera. The rig was the C11 with the f/3.3 reducer (probably at an actual of about f/4).

 

I went through all the steps for Sensor Analysis during the day, followed the instructions, and it just seemed like an infinite loop of bit depth, e/ADU, linearity, gain, darks, gain and binning, etc. What is the endpoint of this process, and how does the app save this information for the camera? It seemed kind of pointless unless it's automatically saving and remembering it.

 

As for Live Stacking, it never really seemed to provide improved images over a single frame. I tried lots of objects and really wanted to see something with M51. Even M13 was a bust. Based on what I got last night, I can take better smartphone images. I know the f/4 setup works because I got some decent Moon images with it last time out, but the first deep-sky adventure was a bust.



#30 GSBass

GSBass

    Mercury-Atlas

  • *****
  • topic starter
  • Posts: 2,554
  • Joined: 21 May 2020
  • Loc: South Carolina

Posted 29 July 2021 - 04:40 PM

Sensor analysis is definitely worthwhile, and it does save the info internally. When you're ready to image, click on the Histogram tab; if you've run sensor analysis, you'll see a tiny brain icon light up. Click on it and then click Calculate. It will analyze the object you're pointing at, compare the data to your sensor, and recommend settings. You just click Apply and you're ready. This will help with your live stacking issues too.



  • brentknight and MarMax like this

#31 GSBass

GSBass

    Mercury-Atlas

  • *****
  • topic starter
  • Posts: 2,554
  • Joined: 21 May 2020
  • Loc: South Carolina

Posted 29 July 2021 - 04:44 PM

By the way, you need to run sensor analysis for both 8-bit and 16-bit modes. It's done when it shows the table on the right; you can then change to 16-bit and run it again. The data is automatically stored in the program until it's run again; nothing else to do.



  • MarMax likes this

#32 GSBass

GSBass

    Mercury-Atlas

  • *****
  • topic starter
  • Posts: 2,554
  • Joined: 21 May 2020
  • Loc: South Carolina

Posted 29 July 2021 - 04:51 PM

One more thing: most of the change in what you see on screen when live stacking actually occurs very quickly. After that, what's happening is an improvement in signal-to-noise ratio, so fine detail slowly starts to get recorded. It's very obvious in post-processing, and no doubt the image does improve over time; you're just not going to be able to see that improvement from one frame to the next. Come back in 30 minutes, though, and you can tell.



  • MarMax likes this

#33 RazvanUnderStars

RazvanUnderStars

    Apollo

  • *****
  • Posts: 1,358
  • Joined: 15 Jul 2014
  • Loc: Toronto, Canada

Posted 29 July 2021 - 07:11 PM

Something must have gone wrong. Next time you try, check the log tab of the live stacking window to confirm stacking is taking place (I suspect it did not, hence the lack of difference over a single frame). There should be very visible noise reduction in the dark areas, especially initially (the signal grows linearly with the number of frames, the noise with the square root of that number, so the SNR keeps improving). Best to take a screenshot of the entire SharpCap window, so that we can see the exposure settings as well.
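The noise statistics in that parenthetical can be sketched numerically. This is not SharpCap code, just the arithmetic behind "signal grows linearly, noise as the square root" for idealized, shot-noise-limited frames (the per-frame numbers are invented for illustration):

```python
import math

# Stacking N frames: signal adds linearly, while uncorrelated noise
# adds in quadrature and therefore grows only as sqrt(N).
def snr_after_stacking(signal_per_frame, noise_per_frame, n_frames):
    total_signal = signal_per_frame * n_frames
    total_noise = noise_per_frame * math.sqrt(n_frames)
    return total_signal / total_noise

single = snr_after_stacking(10, 5, 1)     # SNR = 2 for one frame
stacked = snr_after_stacking(10, 5, 100)  # SNR = 20 after 100 frames
# SNR improves by sqrt(100) = 10x, which is why the visible gain from
# each additional frame keeps shrinking even as the stack improves.
```

Real frames also carry read noise and sky background, but the square-root scaling is the same, and it explains why the image changes fast at first and then only slowly.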

 

The other thing is that you need to adjust the histogram (try the auto-stretch button before refining manually).

 

 

 



  • GSBass and MarMax like this

#34 MarMax

MarMax

    Surveyor 1

  • *****
  • Posts: 1,547
  • Joined: 27 May 2020
  • Loc: Los Angeles

Posted 29 July 2021 - 07:49 PM

I really appreciate the tips so thank you GS and Raz.

 

I ran the sensor analysis for RAW8 and RAW16, just guessed on this. It did complete with the table of values which I copied to Notepad files but did not really know what to do with them. Good to know the app saves this and I'll check for the brain icon lighting up in the histogram next time.

 

So after clicking the brain icon and applying, if I want to do further histogram adjustments I should first try the auto stretch button.

 

I'm pretty sure I was live stacking because frames stacked was increasing as was the total exposure time. I never did go beyond 5 minutes total exposure time. And my Alt-Az mount was having a bad night so I could see the stacking area border changing where the app was following the object. It's just the overall object was not improving much. M13 was my reference because it was the brightest. M92 was not bad (looked better than M13) but I'll try these tips next time out and see how it goes.



#35 RazvanUnderStars

RazvanUnderStars

    Apollo

  • *****
  • Posts: 1,358
  • Joined: 15 Jul 2014
  • Loc: Toronto, Canada

Posted 29 July 2021 - 08:09 PM

There's no rush with the sensor analysis or the Brain; you can leave them for later. Keep it simple. To see if stacking works, you can begin with a medium gain and exposures of, say, 15-30s, and adjust the histogram.

 

I've just noticed your location is LA. If you observe from there (or any large city; I live in one as well), 5 minutes is not a lot of time, because the light from the target is nearly swamped by the light pollution, so it takes a lot of time for the useful signal to accumulate over the latter. Read https://skyandtelesc...its-dark-skies/ to get a quantitative sense of how much more difficult it is from a large city. Also make sure you have good focus; otherwise the light gets spread onto more pixels than necessary, reducing the signal in each.
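To put a rough number on that light-pollution penalty, here is a back-of-the-envelope sketch. It uses an idealized sky-background-limited shot-noise model, and the rates are invented for illustration, not measurements of any real site:

```python
# In the sky-background-limited regime, SNR ~ S*t / sqrt(B*t), where S is
# the target's flux rate and B the sky's. Solving for t shows the required
# integration time scales linearly with the sky brightness B.
def time_for_target_snr(target_snr, signal_rate, sky_rate):
    return (target_snr ** 2) * sky_rate / (signal_rate ** 2)

dark_site = time_for_target_snr(10, signal_rate=1.0, sky_rate=1.0)
city = time_for_target_snr(10, signal_rate=1.0, sky_rate=25.0)
# A sky 25x brighter demands 25x the total exposure for the same SNR,
# which is why 5 minutes from a big city does not go very far.
```

The exact factor depends on the target, filter, and sky, but the linear scaling with sky brightness is the reason city stacks need much longer total integration.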

 

It's also useful to save the stacked image, in case you have questions. SharpCap has a few options there, well explained in the help. If you want, you can save the individual frames as well for later analysis. 

 

If you have questions, it's probably better to post them in the EAA forum here on CN, so that we don't sidetrack the original topic of this thread too much.

 

Hope this helped,

Razvan


  • MarMax likes this

#36 MarMax

MarMax

    Surveyor 1

  • *****
  • Posts: 1,547
  • Joined: 27 May 2020
  • Loc: Los Angeles

Posted 29 July 2021 - 08:29 PM

Very helpful, and I'll stop contaminating this thread.



#37 GSBass

GSBass

    Mercury-Atlas

  • *****
  • topic starter
  • Posts: 2,554
  • Joined: 21 May 2020
  • Loc: South Carolina

Posted 29 July 2021 - 08:31 PM

My Alt-Az mount acted up the other night too; you can clearly see field rotation immediately instead of after the usual 30 minutes. It doesn't happen very often, but my mount does seem possessed sometimes. And by the way, don't think you're missing something major if live stacking doesn't inspire confidence yet. It's a great program, but it is hard. Having two histograms is what messes me up the most. EAA folks who don't care about saving for post-processing actually stretch and manipulate both to get a pleasing image on screen. But just because you achieve a nice image on screen, it doesn't follow that you're capturing data correctly for post-processing; the two are practically unrelated.




#38 GSBass

GSBass

    Mercury-Atlas

  • *****
  • topic starter
  • Posts: 2,554
  • Joined: 21 May 2020
  • Loc: South Carolina

Posted 29 July 2021 - 09:08 PM

Just wanted to elaborate on what I said. I think Robin knows his program is used for so many different things, and that's the reason there are so many different save options. You can save what you create on the screen, which is generated from the underlying data being stacked; but if you save the stack normally, you basically have to redo any adjustments you made, which may or may not succeed depending on your skills and the post-processing program you use. Bottom line: what you're actually capturing equates to a black image until you do all the stretching and ADU adjustments, and those tasks have to be done regardless of whether you do them live or in post-processing. It just gets confusing seeing one photo on the screen that is completely unrelated to the photo you get from a normal save. The moral of this story: if post-processing is your only aim, you could start live stacking and do absolutely nothing to make your image visible on screen; the data for post-processing will be exactly the same whether you sit there and manipulate it or go in the house and have a beer. If you want to manipulate the image live, there is a whole other save function to record those adjustments.
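The point about the saved stack equating to a black image until you stretch it can be illustrated with a small sketch. This is not SharpCap's actual code, just a generic percentile-based display stretch (the percentile values are arbitrary) applied to simulated faint linear data:

```python
import numpy as np

# Simulate a faint linear 16-bit stack: the signal sits in a tiny slice
# of the 0..65535 range, so rendered directly it looks essentially black.
rng = np.random.default_rng(0)
stack = rng.normal(loc=800.0, scale=50.0, size=(100, 100)).clip(0, 65535)

def display_stretch(img, low_pct=5.0, high_pct=99.9):
    """Map a percentile range onto 0..1 for viewing; the data is untouched."""
    lo, hi = np.percentile(img, [low_pct, high_pct])
    return np.clip((img - lo) / (hi - lo), 0.0, 1.0)

displayed = display_stretch(stack)
# 'stack' (what a normal save would record) is unchanged and near-black;
# 'displayed' (what the screen shows) uses the full brightness range.
```

This is why the on-screen image and a normally saved stack can look completely unrelated: the stretch only changes the viewing mapping, never the saved linear data.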


Edited by GSBass, 29 July 2021 - 09:16 PM.


#39 GSBass

GSBass

    Mercury-Atlas

  • *****
  • topic starter
  • Posts: 2,554
  • Joined: 21 May 2020
  • Loc: South Carolina

Posted 29 July 2021 - 09:30 PM

One more thing: clicking the stretch on both histograms does confirm that your exposure, gain, and color profile are correct. If you do that and see a problem, it's best to close live stacking completely, revisit your exposure and gain controls and perhaps the RGB sliders on the right side, and look at the histogram on the right. Then you can start live stacking again. You don't have to do anything else to capture the data for post-processing; you can just come back in an hour or two and save it. But it is fun to sit out there and manipulate it, with the understanding that if you're planning on post-processing, you shouldn't manipulate the controls on the right any further. Just use the left histogram in live stacking; it won't affect the data saved, only what you see on the screen. I hope I'm explaining this well.



#40 MarMax

MarMax

    Surveyor 1

  • *****
  • Posts: 1,547
  • Joined: 27 May 2020
  • Loc: Los Angeles

Posted 29 July 2021 - 10:28 PM

Now that I've watched this video a couple of times what you are saying makes more sense.

 

Nice beginner tutorial for SharpCap.

 

It was linked in another post but it's really a good start. I know you suggested the "brain" and I did fiddle with the sensor analysis but this video just follows the basics to set up your camera gain and exposure to get the histogram in the right place (the one on the right side).

 

Once you are set and ready to attempt a stack you don't fiddle with that histogram any more and use the one on the bottom. And I get what you are saying relative to the file save options (as 16 bit, as 32 bit, with adjustments, exactly as seen).

 

I'm ready to give it another try in the field but I'm looking at two nights of clouds before it clears up. I was going to post some questions and a couple of the images but decided after the tutorial to delete everything and try again before asking more questions.


Edited by MarMax, 29 July 2021 - 10:45 PM.

  • GSBass likes this

#41 nic35

nic35

    Apollo

  • *****
  • Posts: 1,251
  • Joined: 08 Sep 2007

Posted 30 July 2021 - 08:02 AM

MM

 

See post #91 here https://www.cloudyni...al#entry9581288 for a link to the unofficial SharpCap quick start guide. A little dated, but still good.

 

john


  • MarMax likes this

#42 GSBass

GSBass

    Mercury-Atlas

  • *****
  • topic starter
  • Posts: 2,554
  • Joined: 21 May 2020
  • Loc: South Carolina

Posted 30 July 2021 - 08:23 AM

I've read it. Not sure why the logic of the program has taken so long to sink in, but I think I'm starting to get it. I can't understand Photoshop tutorials either, because my brain just doesn't work that way, lol. I love doing digital art, but the programmers who write digital art programs are from a distant planet… or perhaps I am.

———

I think if I had written SharpCap there would have been two live stacking modes: one for EAA, so that you know everything you do is WYSIWYG, and a live stacking imaging mode, still WYSIWYG but emphasizing that whatever changes you make on screen affect the captured frames. It's the two logics running concurrently that messes me up.

——-

I could probably narrow this down further: if you had an imaging histogram and an EAA histogram, and your view on screen reflected the data from one or the other, that would simplify things. It's the part where one affects the other that is harder to grasp. You get to a point where you have no idea what you're capturing in the imaging histogram, because you can't see it without shutting down live stacking altogether; and then of course you have no stacking and just see a very dark single frame.



Edited by GSBass, 30 July 2021 - 08:48 AM.

  • MarMax likes this

#43 StarCurious

StarCurious

    Viking 1

  • *****
  • Posts: 570
  • Joined: 24 Dec 2012
  • Loc: York Region, Ontario

Posted 30 July 2021 - 12:37 PM


I also found the Multi-Star FWHM focus assistant very useful; I found it more precise than using a Bahtinov mask. I have Bortle 8 skies. With a v1.1 ASI 224MC, I checked "Suppress hot pixels". The Brain recommended just under a 2-second exposure with gain 61 while using an Optolong UHC filter.


  • roelb and GSBass like this

#44 GSBass

GSBass

    Mercury-Atlas

  • *****
  • topic starter
  • Posts: 2,554
  • Joined: 21 May 2020
  • Loc: South Carolina

Posted 30 July 2021 - 01:05 PM

I love the Brain; sometimes you just want to confidently get to work without experimenting, and the Brain has been very reliable at getting the settings correct. I haven't tried the new focus assistant yet. I think I just assumed it would be bad, since just about all the other focus assistants didn't work well for me. Still using the Bahtinov mask.




#45 StarCurious

StarCurious

    Viking 1

  • *****
  • Posts: 570
  • Joined: 24 Dec 2012
  • Loc: York Region, Ontario

Posted 30 July 2021 - 01:52 PM


I have indoor remote control over the motor focuser and electronic filter wheel (and mount). I would slew to an object, change a filter, use the MS FWHM focus assistant, run and re-center with plate solving, run the Brain and apply the recommended settings, change to the dark filter, take a dark, switch back to whichever filter, set preprocessing to subtract the dark, then start Live Stack and adjust the histogram.



#46 Astrojedi

Astrojedi

    Fly Me to the Moon

  • *****
  • Posts: 5,714
  • Joined: 27 May 2015
  • Loc: SoCal

Posted 30 July 2021 - 01:53 PM

The one feature I have really been waiting for in SharpCap is a good autofocus routine. Right now when doing EAA or scientific imaging I have to go to NINA to run autofocus. Everything else I can do within SC.


  • StarCurious and roelb like this

#47 Astrojedi

Astrojedi

    Fly Me to the Moon

  • *****
  • Posts: 5,714
  • Joined: 27 May 2015
  • Loc: SoCal

Posted 30 July 2021 - 01:56 PM


Thanks John. I need to update (or just rewrite) this. So much to do so little time.


  • MarMax likes this

#48 MarMax

MarMax

    Surveyor 1

  • *****
  • Posts: 1,547
  • Joined: 27 May 2020
  • Loc: Los Angeles

Posted 30 July 2021 - 02:21 PM

Thanks John. I need to update (or just rewrite) this. So much to do so little time.

It's a great user guide and I was going to ask if you planned to do an update. Totally understand regarding too many things and too little time.


  • Astrojedi likes this

#49 roelb

roelb

    Surveyor 1

  • *****
  • Posts: 1,831
  • Joined: 21 Dec 2013
  • Loc: Belgium, Antwerp

Posted 30 July 2021 - 06:43 PM

Every now and then there is a discussion about the use of the "mini display histogram" and the "live stack histogram".

I always follow this procedure:

- to display and center the object in the FOV I auto stretch the "mini display histogram"

- when object is found and centered I reset the "mini display histogram", set exposure time/gain

- start live stacking

- one can start with the "auto stretch"/"auto color" the "live stack histogram"

- then manual adjust as you like

- "Save with Adjustments" (this is the image that you see at the live stacking screen)

I found that if you don't reset the "mini display histogram" before starting stacking, the image is nearly impossible to adjust.

No "hocus-pocus", just my experience.


  • StarCurious, GSBass and MarMax like this

#50 alphatripleplus

alphatripleplus

    World Controller

  • *****
  • Moderators
  • Posts: 125,366
  • Joined: 09 Mar 2012
  • Loc: Georgia

Posted 30 July 2021 - 09:04 PM

+1. I follow Roel's procedure when using SharpCap. (In my case, if I adjust the mini histogram, it is to help with the image used for a plate solve.)


  • StarCurious likes this









