How much error can PixInsight correct with camera orientation (e.g. between sessions)?


#1 ryanha

    Messenger

  • topic starter
  • Posts: 438
  • Joined: 05 Aug 2020

Posted 21 October 2020 - 08:44 PM

I want to do an imaging project where I'll be capturing over multiple nights and need to tear down my setup each night.

How close does the camera orientation need to be from one session to the next for the integration step to align the images correctly?

I was planning on making a small tape mark and eyeballing the alignment from that. Do I need ultra-high precision here, or is lining up a tape mark fine?

 

Thanks,

--Ryan



#2 ismosi

    Apollo

  • Posts: 1,017
  • Joined: 19 Apr 2014
  • Loc: New London, Pennsylvania, USA

Posted 21 October 2020 - 08:50 PM

I should think you'd be fine. When doing StarAlignment, PI also accounts for rotation. Since I image with a German equatorial mount, some of my images are 'upside down' with respect to the others, which PI handles easily.
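To illustrate the point that registration can recover an arbitrary rotation plus shift from matched star positions, here is a toy sketch in plain NumPy with synthetic star coordinates (StarAlignment's real star matching and distortion handling are far more sophisticated than this):

    import numpy as np

    def estimate_rotation_shift(ref_xy, tgt_xy):
        # Least-squares rigid fit (rotation + translation) between two matched
        # star lists -- a toy stand-in for what a registration step recovers.
        ref_c = ref_xy - ref_xy.mean(axis=0)
        tgt_c = tgt_xy - tgt_xy.mean(axis=0)
        u, _, vt = np.linalg.svd(tgt_c.T @ ref_c)       # Kabsch / Procrustes
        rot = u @ np.diag([1.0, np.sign(np.linalg.det(u @ vt))]) @ vt
        angle = np.degrees(np.arctan2(rot[1, 0], rot[0, 0]))
        shift = tgt_xy.mean(axis=0) - ref_xy.mean(axis=0) @ rot.T
        return angle, shift

    # Synthetic star field, then a copy rotated 180 deg (like a meridian flip)
    # and shifted; the fit recovers both without complaint.
    rng = np.random.default_rng(0)
    ref = rng.uniform(0, 4000, size=(50, 2))
    theta = np.radians(180.0)
    R = np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]])
    tgt = ref @ R.T + np.array([120.0, -80.0])
    print(estimate_rotation_shift(ref, tgt))            # ~(+/-180 deg, [120, -80])

A 180-degree flip is recovered just as easily as a small rotation, which is why the meridian-flip case is a non-issue.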


  • Michael Covington likes this

#3 Alex McConahay

    Cosmos

  • Posts: 9,625
  • Joined: 11 Aug 2008
  • Loc: Moreno Valley, CA

Posted 21 October 2020 - 09:11 PM

Have you ever had a session where something shifted? Maybe the frames before the meridian flip were slightly offset from the frames after the flip, or maybe, for some dumb reason, you did not center one set of subs exactly the same as the other.

After you register and stack the images, you will find that the edges of the stacked image are irregular, not as dense as the rest of the image.

That is what happens when the rotation (or centering) is not the same from one image to the next, or when one night's worth of images does not match another's. You will have a big chunk of the image to which all the subs contributed. But since some of the subs are rotated, you will have edge regions (usually long, skinny triangles) that are covered by fewer subframes, and so are noisier, lighter, and so on.

How much of that you can tolerate depends on how much you can crop away while still keeping your main target covered by all of the subframes.
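To put a rough number on the width of those wedges: a point r pixels from the center of the frame moves by about 2*r*sin(theta/2) when the field is rotated by theta. A quick back-of-the-envelope sketch (the sensor dimensions and the 2-degree error are just example values, not anything from this thread):

    import numpy as np

    w, h = 6248, 4176            # assumed sensor dimensions in pixels (IMX571-class)
    theta_deg = 2.0              # assumed rotation error between sessions
    r_corner = np.hypot(w / 2, h / 2)
    corner_shift = 2 * r_corner * np.sin(np.radians(theta_deg) / 2)
    print(f"A {theta_deg} deg rotation moves the corners by ~{corner_shift:.0f} px")  # ~131 px

So an eyeballed tape mark that lands within a degree or two only costs a band on the order of a hundred pixels at the corners of a sensor this size, which is usually an easy crop.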

 

It really is your call. 

Alex



#4 Alex McConahay

    Cosmos

  • Posts: 9,625
  • Joined: 11 Aug 2008
  • Loc: Moreno Valley, CA

Posted 21 October 2020 - 09:14 PM

Oh, by the way: PixInsight is not "correcting" any image rotation issue. It is merely registering what is there. Images can be 180 degrees apart, and PI will have no problem with registration. (After all, that is exactly what happens with a meridian flip!)

The frames will all be registered to the reference frame no matter what their original rotations were. It is just that when you stack them, you will find that the edges are not uniformly supported by the same number of subs.

 

Alex


Edited by Alex McConahay, 21 October 2020 - 09:14 PM.

  • dswtan and limeyx like this

#5 AstroBrett

    Mariner 2

  • Posts: 232
  • Joined: 26 Jan 2009

Posted 21 October 2020 - 09:19 PM

If your OTA is in rings on a dovetail plate and you remove the OTA and dovetail as an assembly, along with the camera, there will not be any significant rotation between imaging sessions. I also break down each night, and that is the method I use.

If, on the other hand, you can't do that, you can reorient your camera during twilight as follows. Plate solve one of your subframes from the first session and note the rotation angle of the frame. When you set up for the following session, take a test subframe during twilight, plate solve it, and compare the rotation angle with your desired value. Loosen the rings, rotate the OTA, and repeat until you get it close to the original value. A few frames will usually do it. Then you can proceed as you normally would to acquire additional data.
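If your solver writes a standard WCS into the FITS header, the comparison can even be scripted. Here is a minimal sketch using Astropy (the file names are placeholders, and the sign convention for the angle may need flipping depending on whether your optical train mirrors the image):

    import numpy as np
    from astropy.io import fits
    from astropy.wcs import WCS

    def frame_rotation_deg(path):
        # Orientation of the frame's +Y pixel axis relative to North, taken
        # from the plate-solved WCS (CD matrix) in the FITS header.
        wcs = WCS(fits.getheader(path))
        cd = wcs.pixel_scale_matrix                    # 2x2, degrees per pixel
        return np.degrees(np.arctan2(cd[0, 1], cd[1, 1]))

    old = frame_rotation_deg("session1_sub.fits")      # placeholder file names
    new = frame_rotation_deg("session2_test.fits")
    diff = (new - old + 180.0) % 360.0 - 180.0         # wrap into [-180, 180)
    print(f"Rotate the camera by about {diff:+.2f} deg to match the first session")

Most plate solvers also report the rotation angle directly in their output, so scripting it is only a convenience.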

 

Good luck,

 

Brett  


  • acommonsoul and limeyx like this

#6 ryanha

    Messenger

  • topic starter
  • Posts: 438
  • Joined: 05 Aug 2020

Posted 21 October 2020 - 09:37 PM

AstroBrett wrote:
Plate solve one of your subframes from the first session and note the rotation angle of the frame. When you set up for the following session, take a test subframe during twilight, plate solve it, and compare the rotation angle with your desired value.

 

There you go!  I would not have thought of this.  I have the Celestron OAG so this is super easy for me to do because it has a rotator assembly for the camera.

 

Thanks!

--Ryan



#7 const

    Sputnik

  • Posts: 45
  • Joined: 11 May 2020
  • Loc: WA

Posted 21 October 2020 - 09:41 PM

Make sure to shoot flat frames for each night and calibrate light frames with corresponding flats.

 

If your target spans roughly 1/3 or less of the short side of the frame, you can rotate the camera as you wish. If the target fills the frame and you do not want to crop, keep the orientation within a few degrees. You will still crop a little because of drift from imperfect tracking and intentional dithering, so there is no need to be paranoid about small rotations.
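A quick way to sanity-check that rule of thumb for your own frame and target: test whether a centered box around the target stays inside both the reference frame and a copy of it rotated about the center, with a little margin left over for dither and drift. A minimal sketch (all the sizes and angles below are example values):

    import numpy as np

    def target_covered(frame_w, frame_h, target_w, target_h, rot_deg, margin_px=30):
        # Is a centered target box inside BOTH the reference frame and a copy
        # rotated by rot_deg about the center, leaving margin_px for drift/dither?
        th = np.radians(rot_deg)
        R = np.array([[np.cos(th), -np.sin(th)],
                      [np.sin(th),  np.cos(th)]])
        half = np.array([frame_w / 2 - margin_px, frame_h / 2 - margin_px])
        corners = np.array([[ target_w / 2,  target_h / 2],
                            [ target_w / 2, -target_h / 2],
                            [-target_w / 2,  target_h / 2],
                            [-target_w / 2, -target_h / 2]])
        in_ref = np.all(np.abs(corners) <= half)
        in_rot = np.all(np.abs(corners @ R) <= half)   # corners expressed in the rotated frame
        return in_ref and in_rot

    # Example 6000x4000 frame: a small target tolerates a big rotation error,
    # a frame-filling target does not.
    print(target_covered(6000, 4000, 1300, 1300, rot_deg=15))   # True
    print(target_covered(6000, 4000, 5800, 3800, rot_deg=3))    # False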



#8 SilverLitz

    Apollo

  • Posts: 1,233
  • Joined: 17 Feb 2018
  • Loc: Louisville, KY

Posted 22 October 2020 - 07:44 AM

PixInsight's StarAlignment will do the x-y shifting and rotation needed to align all the subs with your reference frame.

For NB, I have found that PI's new Adaptive Normalization (an option in Integration) effectively hides the overlap lines, though the corners with missing data in many subs will be noisier. For my recent M16 Eagle SHO combination, which had data from two separate years, I used Adaptive Normalization and did not crop a single pixel! However, I have not had luck with Adaptive Normalization on RGB files.



#9 endlessky

    Viking 1

  • Posts: 678
  • Joined: 24 May 2020
  • Loc: Padova, Italy

Posted 22 October 2020 - 10:22 AM

I want to do an imaging project where I am capturing over multiple nights where I need to tear down my setup each night.

 

How close do I need to get the camera orientation to be from one session to the next for the integration step to correctly align the images?

 

I was planning on making a small tape mark and eyeballing it based on that. Do I need to be ultra-high precision with this or is aligning a tape mark fine?

 

Thanks,

--Ryan

PixInsight will register the frames and, if you check "Distortion correction", even correct for different lenses being used. The main problem is that you'll have to crop away (giving up data and field of view) the areas that are not covered by all of the frames, since they will be noisier than the rest.

I am in the same boat, as I have to set up and tear down my equipment for every session. I often shoot two different subjects on the same night, depending on the time available and/or when the first object goes out of view behind trees or houses. I frame every subject to my liking, so camera rotation is something I do often. I don't have a graduated scale on the mechanical (and manual) camera rotator at the end of my telescope focuser, so I don't have a quick way of judging how far I am from the previous session's rotation.

My workaround is to use plate solving.

With the suite I am using (KStars/EKOS), there's a command in the plate solver called "Load & Slew" (other programs have similar features; I know for sure AstroPhotography Tool has one, just with a different name). You load an image from the previous session, the software plate solves it, calculates the RA/Dec coordinates of the center, and takes you on target within a chosen tolerance (I go for 10 arcseconds).

The beauty of it is that it also tells you the rotation angle of the camera (in my case, expressed as degrees East of North). I note that angle immediately after the image from "Load & Slew" is solved, before it is overwritten by the newly solved image from the camera in the current session (otherwise the angle shown will be the new one). I then check the new angle and go from there: I manually rotate the camera and keep solving the field of view until the new angle matches the old one as closely as I can make it.

With this method, the center of the image is within 10 arcseconds from session to session, and the angle is usually within half a degree from one session to the next.


  • limeyx likes this

#10 SoDaKAstroNut

    Vostok 1

  • Posts: 115
  • Joined: 24 Dec 2018
  • Loc: Black Hills, South Dakota

Posted 29 October 2020 - 01:36 AM

APT has a Framing Masks option under Tools to help align your subs in RA/Dec/rotation. It is important to use the same view size (Fit, 1:1) when setting up your masks in APT.


Edited by SoDaKAstroNut, 29 October 2020 - 01:37 AM.

  • limeyx likes this

#11 Stelios

    Voyager 1

  • Moderators
  • Posts: 10,479
  • Joined: 04 Oct 2003
  • Loc: West Hills, CA

Posted 29 October 2020 - 01:41 AM

If you use something like SGP with its Frame and Mosaic wizard, you can select the orientation for the target (very useful, as not all objects frame well in all directions; you don't want the Witch's Broom oriented vertically on an ASI183 camera). Then, after plate solving, SGP will tell you how much to rotate the camera to match that orientation (within a limit you choose; the default is 3 degrees). You get up to the number of tries you specify to get it right.

 

I use that on every single session. Helps maximize the usefulness of your canvas. 


  • SoDaKAstroNut and limeyx like this

#12 Stricnine

    Messenger

  • Posts: 459
  • Joined: 24 Sep 2018
  • Loc: DFW Area, TX

Posted 29 October 2020 - 05:54 AM

It is worth making an effort to get the frames reasonably aligned in x-y and rotation at the camera. As an example of what not to do, here's a stack of M78 I imaged last year; I had to crop too much, IMO.

[Attached image: M78 integration, scaled]

 

David


Edited by Stricnine, 29 October 2020 - 05:55 AM.


#13 kathyastro

    Vanguard

  • Posts: 2,404
  • Joined: 23 Dec 2016
  • Loc: Nova Scotia

Posted 29 October 2020 - 06:47 AM

I always mount the camera the same way on the scope, so my image orientation doesn't change by more than a degree or two, at least until the target passes the meridian. Then it rotates 180 degrees.

 

Because a meridian flip is a normal thing, PI (and any stacking program worth its salt) has no trouble with any amount of rotation.  I did once (only once) have an image that it refused to register until I had manually rotated it 180 degrees.  I have no idea why.  It was clearly an exception.



#14 NightBear

    Vostok 1

  • Posts: 148
  • Joined: 12 Nov 2019

Posted 29 October 2020 - 08:58 AM

SilverLitz wrote:
PixInsight's StarAlignment will do the x-y shifting and rotation needed to align all the subs with your reference frame. For NB, I have found that PI's new Adaptive Normalization (an option in Integration) effectively hides the overlap lines, though the corners with missing data in many subs will be noisier. For my recent M16 Eagle SHO combination, which had data from two separate years, I used Adaptive Normalization and did not crop a single pixel! However, I have not had luck with Adaptive Normalization on RGB files.

For me, Adaptive Normalization fails when there is more than a small amount of non-overlap between frames for LRGB. Hopefully this will be improved in the future, because AN otherwise works great and is much faster than LocalNormalization.



#15 SilverLitz

    Apollo

  • Posts: 1,233
  • Joined: 17 Feb 2018
  • Loc: Louisville, KY

Posted 29 October 2020 - 11:33 AM

I recently had trouble with Adaptive Normalization on an NB target where I had changed the rotation by 15 degrees, after it had worked great on NB targets with minor rotation/offset changes. Hopefully it will get better in the future.



