
Netbooks: what works, what doesn't

This topic has been archived. This means that you cannot reply to this topic.
92 replies to this topic

#76 psonice (Apollo)

  • Posts: 1,113
  • Joined: 24 Jul 2009

Posted 22 September 2009 - 03:15 PM

Fogboundturtle: a 13" MacBook lasts 7-8 hours, so it'll probably do 4+ running my app in the field. It's not much less portable than a netbook, either - they're pretty small, thin and light.

The difference will be that instead of taking all those images, then coming home to process away and finally getting an image, you'll sit there watching it expose, with the processing done at the same time. You'll be able to check focus, sharpen the image, adjust the colour curves etc., and see very early how it's going to come out. You'll see any mistakes and be able to correct them. And you won't waste hours when you get back home (I know a lot of people enjoy that, but not all of us have the time :( ).
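A minimal sketch of the idea in Python/numpy (illustrative only - the class and its names are invented here, not the app's actual code): each incoming frame updates a running mean, so the displayed image improves as you watch, and mistakes show up after a few frames instead of after the whole session.

    import numpy as np

    class LiveStacker:
        """Running mean of incoming frames; the preview improves live."""
        def __init__(self):
            self.total = None
            self.count = 0

        def add(self, frame):
            f = np.asarray(frame, dtype=np.float64)
            if self.total is None:
                self.total = np.zeros_like(f)
            self.total += f
            self.count += 1
            return self.total / self.count  # current best estimate to display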

#77 Fogboundturtle (Viking 1)

  • Posts: 826
  • Joined: 20 May 2009

Posted 22 September 2009 - 04:01 PM

Quote (psonice):
Fogboundturtle: a 13" MacBook lasts 7-8 hours, so it'll probably do 4+ running my app in the field. It's not much less portable than a netbook, either - they're pretty small, thin and light.

The difference will be that instead of taking all those images, then coming home to process away and finally getting an image, you'll sit there watching it expose, with the processing done at the same time. You'll be able to check focus, sharpen the image, adjust the colour curves etc., and see very early how it's going to come out. You'll see any mistakes and be able to correct them. And you won't waste hours when you get back home (I know a lot of people enjoy that, but not all of us have the time :( ).


Most of the time, you have a very limited window to take exposures. I do not have time to fiddle around too much. I am trying to take advantage of good seeing/transparency conditions to take as many exposures as I can.

#78 psonice (Apollo)

  • Posts: 1,113
  • Joined: 24 Jul 2009

Posted 22 September 2009 - 04:17 PM

Quote (Fogboundturtle):
Most of the time, you have a very limited window to take exposures. I do not have time to fiddle around too much. I am trying to take advantage of good seeing/transparency conditions to take as many exposures as I can.


Yeah, I know - which is why all data received from the scope can be saved in the background. You can play around all you want to make lots of photos, then go home and work with the original data. You're not losing anything.
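One plausible way to do this (a sketch under the assumption that frames arrive as raw byte buffers; the file names and structure are invented): push each frame onto a queue and let a background thread write it to disk untouched, while the display pipeline only ever works on copies.

    import queue
    import threading

    frames = queue.Queue()

    def writer():
        # Persist every raw frame exactly as received; processing only
        # ever touches copies, so nothing is lost.
        n = 0
        while True:
            data = frames.get()
            if data is None:        # sentinel: capture finished
                break
            with open("raw_%05d.dat" % n, "wb") as f:
                f.write(data)
            n += 1

    threading.Thread(target=writer, daemon=True).start()
    # Capture loop: frames.put(raw_bytes) per frame, frames.put(None) at the end.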

#79 ccs_hello (Voyager 1)

  • Posts: 10,739
  • Joined: 03 Jul 2004

Posted 22 September 2009 - 06:13 PM

Just for fun, this picture shows a netbook (Advent 4211 ~= MSI Wind) running the other Operating System :).

Clear Skies!

ccs_hello

#80 mclewis1 (Thread Killer)

  • Posts: 21,276
  • Joined: 25 Feb 2006

Posted 22 September 2009 - 06:29 PM

Quote (psonice):
The difference will be that instead of taking all those images, then coming home to process away and finally getting an image, you'll sit there watching it expose, with the processing done at the same time. You'll be able to check focus, sharpen the image, adjust the colour curves etc., and see very early how it's going to come out. You'll see any mistakes and be able to correct them. And you won't waste hours when you get back home (I know a lot of people enjoy that, but not all of us have the time :( ).


That could be a lot of fun, but it does mean bringing quite a bit of horsepower out to the scope - something that's less likely to happen at a really dark remote site than in your backyard. I'm actually doing this today using a Mallincam and capturing the video stream to either a DVR or a PC with a USB capture device. OK, I'm not doing it for the final image - the live images are for immediate visual appreciation, and the saved images are for recalling the whole evening's observing - but some folks have created some nice results from those stacked images. It sounds like your app will be able to do this type of work without requiring the real-time video feed, opening up the option for folks who today image all night and then process later. On-the-fly image processing does sound like a very interesting idea.

I am, however, very concerned about the commercial viability of an app that runs only on portable Mac hardware; there just aren't that many of them around in the amateur astronomy community.

And I guess that we've thoroughly hijacked this thread too ... sorry StarWars, and I just know we're gonna get a finger waggling from Ron. :shameonyou:

:wron:

#81 psonice (Apollo)

  • Posts: 1,113
  • Joined: 24 Jul 2009

Posted 23 September 2009 - 01:05 AM

Quote (ccs_hello):
Just for fun, this picture shows a netbook (Advent 4211 ~= MSI Wind) running the other Operating System.


That's nuts :D It shows how useful Exposé could be on a netbook, though. Spaces would be great too. I've been tempted to get a netbook and put OSX on it, but it's near impossible to find hardware that's fully supported. Most often the wifi doesn't work, and the GPU only half works so the UI is slow (both dealbreakers for me :( ).

Quote (mclewis1):
That could be a lot of fun, but it does mean bringing quite a bit of horsepower out to the scope - something that's less likely to happen at a really dark remote site than in your backyard. I'm actually doing this today using a Mallincam and capturing the video stream to either a DVR or a PC with a USB capture device. OK, I'm not doing it for the final image - the live images are for immediate visual appreciation, and the saved images are for recalling the whole evening's observing - but some folks have created some nice results from those stacked images. It sounds like your app will be able to do this type of work without requiring the real-time video feed, opening up the option for folks who today image all night and then process later. On-the-fly image processing does sound like a very interesting idea.

I am, however, very concerned about the commercial viability of an app that runs only on portable Mac hardware; there just aren't that many of them around in the amateur astronomy community.

And I guess that we've thoroughly hijacked this thread too ... sorry StarWars, and I just know we're gonna get a finger waggling from Ron. :shameonyou:

:wron:


Very true, I'll answer that and try to turn the topic back round too :)

Horsepower: it mostly runs on the GPU, so it doesn't actually need that much. An ION-based netbook would run it (if you could get OSX to work), depending on the video source. If you're capturing at low framerates to let in more light, it should be no problem, and high resolutions would be possible. On the Mac side, a 13" MacBook Pro is very portable, or an Air would run it if you're not on a budget ;)

For your usage, you'd basically get the same immediate visuals, but with the enhancement stuff you should get a much better picture - closer to that 'final image'.

I'm pretty sure it won't be commercially successful - it'll be free ;) The reason I got started on this was that I bought a cheap scope a while back and wanted to do some imaging. I looked into the options for using a webcam, and couldn't believe the long-winded and complex process just to get an image. Most of the software I looked at seemed powerful enough, but horrible to use. Writing my own tools from scratch looked like it could take less time than learning to use all the other stuff effectively :/ So really, I'm writing this for me to use. It'll be pretty simple, but should produce good images (way beyond raw camera output, and getting close to the final images produced by the traditional method). I'm not trying to match the other software out there on features, and it probably won't cut it for more professional use, at least initially. For a family gathered round their scope in a backyard, it should be a huge step forwards :)

To get back to the topic though, my original point was this: GPGPU is getting very easy to do now, and it's enormously powerful for imaging. It could totally transform what you do at the scope. If more imaging apps start doing this, you'll want to use that software because the benefits will be huge.
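For a flavour of how simple GPGPU can be, here's a sketch using PyOpenCL - an assumption on my part, since the thread doesn't say which API the app uses (OpenCL does ship with Snow Leopard, which fits the OSX angle). It accumulates one frame into a stack buffer with one work-item per pixel:

    import numpy as np
    import pyopencl as cl

    ctx = cl.create_some_context()
    q = cl.CommandQueue(ctx)
    prg = cl.Program(ctx, """
        __kernel void accumulate(__global const float *frame,
                                 __global float *stack) {
            int i = get_global_id(0);   // one pixel per work-item
            stack[i] += frame[i];
        }
    """).build()

    n = 1024 * 768
    stack = np.zeros(n, dtype=np.float32)
    frame = np.random.rand(n).astype(np.float32)  # stand-in for a camera frame

    mf = cl.mem_flags
    d_stack = cl.Buffer(ctx, mf.READ_WRITE | mf.COPY_HOST_PTR, hostbuf=stack)
    d_frame = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=frame)

    prg.accumulate(q, (n,), None, d_frame, d_stack)  # runs on the GPU
    cl.enqueue_copy(q, stack, d_stack)               # stack now holds the sum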

A basic netbook won't run software like this though - that Intel GPU is near useless for anything like this (it's not only slow, it doesn't even have a lot of the features it's advertised as having!). One of the new ION netbooks, on the other hand, will work, but it'll cost more and possibly won't get the same battery life. A small laptop with a non-Atom CPU and more memory will be much better, but even more expensive and with even worse battery life. It's something I think people should be considering now if they're buying new kit they want to last a while :)

There's another issue, btw - if you want to use GPGPU and get all this extra power and you're using Windows, you'll probably need Win7. Most of your existing kit won't be happy, because drivers aren't available. Best check that out first. It's not an issue on OSX; no idea on Linux.

#82 rboe (ISS, topic starter)

  • Posts: 69,657
  • Joined: 16 Mar 2002

Posted 23 September 2009 - 09:28 AM

I hope you start another thread on your software when it gets close enough to do so. Right now the big thing holding me back is an older G4 with USB 1.1. :bawling:

#83 StarWars (Mr. Postmaster Man)

  • Posts: 33,547
  • Joined: 26 Nov 2003

Posted 23 September 2009 - 07:44 PM

Quote (rboe):
I hope you start another thread on your software when it gets close enough to do so. Right now the big thing holding me back is an older G4 with USB 1.1. :bawling:



Ron,

Will you please upgrade... :praying:



#84 rboe (ISS, topic starter)

  • Posts: 69,657
  • Joined: 16 Mar 2002

Posted 24 September 2009 - 10:17 AM

I'll tell my wife you said I should. :)

#85 psonice (Apollo)

  • Posts: 1,113
  • Joined: 24 Jul 2009

Posted 24 September 2009 - 11:14 AM

Ron: find something that she'll like but that won't run well on it. Mine likes watching the TV she missed on the BBC iPlayer website, but it's too slow and jerky. She's looking at a nice new one now ;)

#86 rboe (ISS, topic starter)

  • Posts: 69,657
  • Joined: 16 Mar 2002

Posted 24 September 2009 - 07:15 PM

Her needs are much less than mine. Pretty low maintenance, so that approach won't work.

#87 groz (Vanguard)

  • Posts: 2,148
  • Joined: 14 Mar 2007

Posted 24 September 2009 - 07:28 PM

Quote (psonice):
The difference will be that instead of taking all those images, then coming home to process away and finally getting an image, you'll sit there watching it expose, with the processing done at the same time. You'll be able to check focus, sharpen the image, adjust the colour curves etc., and see very early how it's going to come out. You'll see any mistakes and be able to correct them. And you won't waste hours when you get back home (I know a lot of people enjoy that, but not all of us have the time :( ).


That sounds real cool, but out here in the real world, after one gets over the initial desire for instant gratification, the process of data acquisition for astrophotography is a long, drawn-out, boring sequence of events. I'm using a Starlight Xpress SXV-H9 CCD - small chip, very sensitive. Use my imaging plan for tonight as an example: I'm doing a 4-frame mosaic of M33. Here's how the process works, out in the real world.

T = sunset, times in minutes.

T - 30 : Uncover scope, power up camera, let coolers get started.
T - 15 : Guider can resolve bright stars; slew to Vega, sync mount.
T - 10 : Start shooting flats with each filter.
T + 10 : Time window for flats nearing an end - too much starlight getting through the t-shirt filters. If we are on the road and just set up from scratch, this is the time to look after polar alignment and do the 3-star align.
T + 15 : Getting dark enough to get good target resolution. Slew to a bright star in the vicinity of the target, check focus with the L filter in.
T + 20 : Slew to target center, take a 2-minute exposure, use a plate solve to sync the mount. Fine-tune the slew onto the target after the plate solve, take another exposure to validate framing. Once framing is validated, slew to the first mosaic offset point and start guiding.
T + 45 : Take a 5-minute exposure and check if it's really dark enough to start exposures. Probably not, but leave the camera shooting 5-minute exposures to keep checking.
T + ~65 : It's now dark enough to shoot luminance frames; re-check focus and then start the main imaging run.

The main imaging run consists of 'shoot frames on each of the 4 points that define the mosaic'. Tonight we are doing some camera/telescope tests, so I'll only shoot 5 frames of 5 minutes on each point. The process basically works out to this:

For each filter in LRGB
    For each point in mosaic
        Slew to target point
        Check/adjust focus
        Expose 5 times 5 minutes
    Next mosaic point
Next filter
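In runnable form - a hypothetical sketch only, with slew_to, check_focus and expose standing in for whatever the control software actually calls - the run boils down to:

    # Hypothetical sketch of the mosaic run above; the three hardware
    # functions are stand-ins, not a real API.
    MOSAIC_POINTS = [1, 2, 3, 4]            # the 4 mosaic offset points

    def imaging_run(slew_to, check_focus, expose):
        for filt in ["L", "G", "R", "B"]:   # L first, B last (meridian flip between R and B)
            for point in MOSAIC_POINTS:
                slew_to(point)
                check_focus()
                for _ in range(5):          # 5 frames per point
                    expose(filt, minutes=5) # 5-minute subexposures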


So there are 4 points in the mosaic, and I need 5 frames on each point. 25 minutes per point with 4 points means a little over an hour and a half per filter. I'm doing LRGB, so repeat 4 times, for a total of 6+ hours of exposures. Because of the way my filters work, I'll actually shoot L first, followed by G, then R, and finally B. I'll have to do a meridian flip between the R and B frames.

As long as my images are focussed (and my software is checking focus on each frame as it arrives), there's no tweaking required during this process. Nothing I do at the computer playing with color adjustments etc. is going to change the data coming from the CCD with each filter. There's no point trying to consider what the final result will look like until after I've captured with all 4 filters. My software will look after the changes of pointing and focus required between frame locations and between filters. There is no reason for me to sit and watch; it's rather boring - there's only something happening once every 5 minutes.

When I first started doing astrophotography, I thought it would be 'really cool' to have programs that worked with data on the fly as we collected it, and I wrote stuff to do just that - fuss with the frames coming out of our DSLR cameras as we captured them. But then we got pretty good at setting up the gear, framing targets, and getting/keeping focus, and the desire for 'instant gratification' kind of went away. Now we know: if the framing is good, focus is good, seeing is good, and tracking is good, then we'll have good data. Sitting and watching just increases the temptation to 'fiddle' with the gear, and if you fiddle with the gear while it's collecting frames, you get bad frames. We've learned that lesson the hard way.

Watching an astrophotography system collect data, once you get over the initial 'oh, I spent thousands of dollars, let's watch the gear work' phase, is boring. It's very much like watching grass grow. I've been doing this now for almost 4 years, using 'wrote it all myself' software from the beginning. When I first started, we had lots of bells and whistles for fiddling with images as they came from the DSLR. Today, now that we are using CCD cameras, for the most part the images don't even end up on the screen during the capture process. The process of writing software to manage all of this is always a case of 'killing off what annoys me most today' when it comes to the 'next feature to add'. For the last year, it's been a process of adding automation features, so there are fewer and fewer parts of the process that need human intervention in the timespan from 'dark' till the imaging run is complete.


The above was a typical sequence for a 'pretty picture' run, like we are doing tonight. If we are doing a photometry run, which we often do to measure exoplanet transits (our newfound fun thing to do with all this gear), the process is even MORE boring. It boils down to this:

Slew to target.
Select photometric V filter.
Take a check frame, plate solve to sync the mount.
Fine-tune the slew to the target, ensuring the target and reference stars are in frame.
Start guiding.
Take a check frame to validate exposure (the target star should be 1/2 to 2/3 of the full well depth on the CCD - see the sketch after this list).
Shoot frames till meridian.
Stop guiding.
Goto target (causes meridian flip).
Take a check frame, plate solve to sync the mount again.
Fine-tune the slew for the target.
Start guiding.
Shoot frames till sunup.
Stop guiding.
Park scope.
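An illustrative sketch of that exposure check - the 1/2 to 2/3 rule comes straight from the list above, but the function and the 16-bit full-well figure are assumptions, not groz's actual code:

    import numpy as np

    def exposure_ok(frame, full_well=65535):
        """frame: 2-D numpy array of ADU counts from the check frame.
        The target star's peak should sit between 1/2 and 2/3 of full well."""
        peak = float(np.max(frame))
        return 0.5 * full_well <= peak <= (2.0 / 3.0) * full_well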

There is nothing to watch, nothing to see - just a boring star field on screen, no colors to play with - and we are likely slightly defocussed on purpose, so we don't even have to fiddle with focus all night. The data acquisition part of this is just plain boring. What is really 'not boring' is running that batch of frames through the photometry analysis, and ending up with charts on the screen that show definitively that yes, during the night while I was sound asleep, my telescope did indeed record the transit of a planet in front of another star, and my light curve has a 22 mmag dip to prove it.

But hey, if you want to write something that does lots of processing in real time to play with telescope toys, fill your boots. When I first started, that's exactly what I did too. I've long since learned, and now I understand, that when there's discussion of software in the equipment-specific forums, it's more about 'how to automate' than 'how to do whiz-bang in real time'. This is what folks really want, and I'm now part of that crowd too. But I will say this: in the discussion over in the other thread where you asked about alignment algorithms, I dug up some old links with references to documents on them. Well, that got me thinking. A couple of days later I re-read those documents carefully and started writing code based on them, and that's why our process now has the 'plate solve' function in it - I wrote a plate solver based on those documents, and it works rather well, I may add. Today I'm using data from the GSC-1.2 because it's what we have on all our computers, but our little astronomy project here at home has reached the point where I'm starting to consider a much larger catalog on one of our servers in the closet....

Oh, and of course, to get back off the tangent and onto the real topic of this thread: it all works just fine on my netbook too :) We have _almost_ reached the point where a photography run consists of 'enter a target into the dialog box on the netbook, come back and look in the morning to see what data files are on the server in the closet'. Now that I can do plate solutions to fine-tune pointing based on frames from both the imager and the guider, I've gotten a huge step closer to a 'fire and forget' system....

#88 psonice (Apollo)

  • Posts: 1,113
  • Joined: 24 Jul 2009

Posted 25 September 2009 - 05:28 AM

Groz: Yep, I fully understand what you're saying there. That's not really the kind of usage I'm aiming at, though - there's no support for LRGB imaging, and I'm not looking at long exposure times either.

I guess there are two kinds of astronomy - the scientific kind, collecting data and the like, and the 'getting out under the stars and enjoying the sights' kind. My app is made for the latter - it'll help me to see the things my scope isn't capable of showing me unaided. And I don't want to sit there waiting for a picture; I want to be out there actually looking at things and exploring. Besides which, my scope isn't even motorised, so long exposure is impossible.

Apart from that, I'm not convinced that long exposure is a good idea. The CCD will gather more light, but the computer can do this from multiple frames, and the CCD is a 'dumb' piece of hardware - it's not aware of any changes happening in the image, so it'll happily destroy data rather than correcting any issues. The computer, on the other hand, is pretty smart if you tell it what you want. A couple of examples: you knock the scope during an exposure. That shake gets captured, degrading the image, and perhaps the scope goes slightly out of alignment. The computer can detect that, reject the data until the scope settles, and correct the misalignment. Another: you take an image of a dim DSO with bright stars around it. The stars will get clipped heavily, while the DSO will look good. The computer can apply a curve to the image so that the DSO looks good and the stars don't get clipped. You can do things like Fourier analysis to help amplify the data and remove noise sources along the way too (I'd imagine that would be very helpful for your exoplanet stuff).
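A sketch of both of those ideas in Python/numpy (illustrative only, not psonice's app - the function names are invented): kappa-sigma rejection drops per-pixel outliers such as a knocked-scope frame, and an asinh curve lifts the faint DSO without clipping the stars.

    import numpy as np

    def stack_with_rejection(frames, kappa=3.0):
        """Mean-stack frames, rejecting per-pixel outliers (shakes, trails)."""
        cube = np.stack([np.asarray(f, dtype=np.float64) for f in frames])
        med = np.median(cube, axis=0)
        sig = cube.std(axis=0) + 1e-9             # avoid divide-by-zero
        keep = np.abs(cube - med) < kappa * sig   # True where the pixel is sane
        return (cube * keep).sum(axis=0) / np.maximum(keep.sum(axis=0), 1)

    def soft_stretch(img, scale=1000.0):
        """Asinh curve: brightens faint detail while bright stars stay unclipped.
        Assumes img has positive values (e.g. the stack above)."""
        return np.arcsinh(img / scale) / np.arcsinh(img.max() / scale)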

#89 rboe (ISS, topic starter)

  • Posts: 69,657
  • Joined: 16 Mar 2002

Posted 25 September 2009 - 09:30 AM

Dang; still not quite sure if you are answering a question no one has asked - but it sounds very interesting. Once I upgrade my MacBook (don't hold your breath, I've been talking about this for two years already) I'd like to check this out with my webcam.

#90 AlienFirstClass (Vendor)

  • Posts: 1,254
  • Joined: 13 Feb 2009

Posted 26 September 2009 - 12:31 AM

So what netbook has the best screen for daylight operations?

Some of us do solar work.

#91 Peter Argenziano (Watcher of the Skies)

  • Posts: 3,642
  • Joined: 11 Nov 2003

Posted 27 September 2009 - 04:20 PM

Quote (AlienFirstClass):
So what netbook has the best screen for daylight operations?

Some of us do solar work.


IMHO, one that has a matte display instead of the more widely available glossy screen.

#92 psonice (Apollo)

  • Posts: 1,113
  • Joined: 24 Jul 2009

Posted 28 September 2009 - 08:36 AM

Quote (AlienFirstClass):
So what netbook has the best screen for daylight operations?

Some of us do solar work.

Quote (Peter Argenziano):
IMHO, one that has a matte display instead of the more widely available glossy screen.


I agree - although it depends on the exact way you're using it; sometimes the glossy display can be better (well, less bad).

Assuming you'll be facing towards the sun with the laptop in front of you, the laptop screen will be in the shade and your face will be in the sun. A matte screen will be OK here, a glossy screen will have a very bright reflection of your face on the screen.

If the screen is actually in the sun, though, a matte screen will catch the light and be very hard to see. Gloss is probably better here - especially if it's a flat, glass-covered screen (some aren't totally flat and catch reflections from all over the place).

Also note that screen quality varies HUGELY. Look for reviews that praise screen quality, or go see it for yourself.

#93 Peter Argenziano (Watcher of the Skies)

  • Posts: 3,642
  • Joined: 11 Nov 2003

Posted 28 September 2009 - 08:52 PM

I agree that you should see it before purchase.

For my work (non-astro) I prefer matte displays because calibration (profiling) is more accurate.

