
Posts Tagged ‘Andromeda’

M31 – The Andromeda Galaxy

November 14, 2012

After taking the luminance exposures that were intended to be a test for the new STL-11000M, I couldn’t help but dig up some old DSLR exposures of M31 and merge their color data into the image.
This is 18×10 min of luminance taken with an SBIG STL-11000M through an NP-101 at its native f/5.3, combined with color data from 169×4 min (11+ hours!) of exposures taken last November with a Canon 450D through an 80 mm refractor at f/6.
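For anyone curious what “merging color data” involves, the usual LRGB approach replaces the lightness channel of the registered color image with the deeper CCD luminance. Here is a minimal sketch of the idea; the file names and the use of scikit-image are my own assumptions, not a record of my exact workflow.

```python
# Minimal LRGB-style merge: replace the lightness of an RGB image with a
# separately captured luminance frame. File names are hypothetical; assumes
# both are registered, aligned 16-bit TIFFs.
import numpy as np
from skimage import io, color

rgb = io.imread("m31_dslr_color.tif").astype(np.float64) / 65535.0    # color stack
lum = io.imread("m31_ccd_luminance.tif").astype(np.float64) / 65535.0  # L stack

lab = color.rgb2lab(rgb)      # L in [0, 100]; a/b carry the chrominance
lab[..., 0] = lum * 100.0     # swap in the CCD luminance
merged = color.lab2rgb(lab)   # back to RGB for final stretching

io.imsave("m31_lrgb.tif", (np.clip(merged, 0, 1) * 65535).astype(np.uint16))
```

Working in Lab space keeps the DSLR’s color while letting the deeper luminance stack define the detail.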

A quick evaluation of the NP-101 for use with the SBIG STL-11000

November 10, 2012

I recently purchased an SBIG STL-11000, and my biggest concern was whether a camera with such a large sensor would force me to buy a new telescope to cover it.  My main scope is a Televue NP-101 (the non-is version), which means I have to shoot through the 2″ focuser.  While I’d trade up to the NP101-is or even a Tak FSQ-106 at the right price, let’s face it:  those are very expensive upgrades.

Last night was first light with the new camera, and I’m posting this information to help others make a similar decision.  Two important caveats here:

  1. Focus was a little off.  I think I need to recalibrate FocusMax, and this was only meant to be a “first light” test of the camera.  So compare the relative sharpness of the corners to each other, not the overall sharpness.
  2. No polar alignment was done, so there is a little field rotation evident.

First is the question of whether the NP-101 can deliver sharp stars all the way out to the corners.  Let’s have a look at the full image first.

This is a stack of 18 ten-minute exposures taken through a luminance filter, combined in DeepSkyStacker.  When I loaded the files in Photoshop, it was impressive to see their scale:  4000 pixels across!
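(As an aside, the core of what DeepSkyStacker does after registration is just a robust per-pixel average.  Here is a minimal sketch of a sigma-clipped mean stack; the file pattern and the clipping threshold are assumptions for illustration.)

```python
# Per-pixel sigma-clipped mean stack, the core operation of stackers like
# DeepSkyStacker (after frames are registered). File pattern is hypothetical.
import glob
import numpy as np
from astropy.io import fits

frames = np.stack([fits.getdata(f).astype(np.float64)
                   for f in sorted(glob.glob("luminance_*.fits"))])

mean = frames.mean(axis=0)
std = frames.std(axis=0)
mask = np.abs(frames - mean) <= 2.5 * std    # reject outliers (satellites, hot pixels)
stacked = np.where(mask, frames, np.nan)
master = np.nanmean(stacked, axis=0)         # sigma-clipped mean per pixel

fits.writeto("master_luminance.fits", master, overwrite=True)
```

Now, let’s look at the corners: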

This is sharp enough for me, especially with the field rotation evident.  I am really impressed with the edge performance of the NP-101.  At this point, I’m not seeing a need to upgrade to the FSQ or 101-is.

Second, we have to consider the light fall-off.  I braced myself for considerable vignetting.  Here is the master flat with levels on an 8-bit scale marked in green:

Again, to me this is acceptable performance, though less than ideal.  There is about a one-third reduction in light at the corners versus the center.  That’s a lot, but it’s not nearly as bad once you move even a little way inward.  I figure that with the usual cropping and overlapping of frames, this won’t be much of an issue.  It’s almost the same level of vignetting I’ve seen on the ST-8300’s chip when using this scope at f/4.3 via the reducer, and careful processing there proved that the vignetting wasn’t a problem.
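If you want a number rather than an eyeball estimate, the corner-to-center ratio is easy to pull from the master flat.  A minimal sketch (the file name and patch size are assumptions):

```python
# Estimate vignetting from a master flat: compare mean corner brightness to
# the center. File name and 100x100 sample patches are hypothetical.
import numpy as np
from astropy.io import fits

flat = fits.getdata("master_flat.fits").astype(np.float64)
h, w = flat.shape
p = 100  # patch size in pixels

center = flat[h//2 - p//2 : h//2 + p//2, w//2 - p//2 : w//2 + p//2].mean()
corners = [flat[:p, :p], flat[:p, -p:], flat[-p:, :p], flat[-p:, -p:]]
worst = min(c.mean() for c in corners)

print(f"Corner/center illumination: {worst / center:.0%}")
# ~67% would match the roughly one-third fall-off described above.
```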

So what’s the verdict?  The NP-101 is perfectly acceptable for use with the STL-11000.  My guess is that the -is version with its larger focuser would perform better, but until I see a deal on one of those, I think I can happily image with this combination.  If anyone has similar information for the STL-11000 with either of the Televue -is scopes or the Tak FSQ scopes, please post a comment for comparison.

In Defense of Mapped Color Imaging

October 21, 2012

Bob Berman recently wrote a column in Astronomy (October 2012) where he says of mapped color imaging, “It’s Disneyland meets Sagittarius.  But is it ethical?”  He admits that “Emission nebulae are always the same repetitious shade of red,” but describes (accurately, I’d allow) Hubble palette images as “fake colors” and “lovely but unreal.”  All fair points.  And it seems that the Astronomy editorial board generally agrees: in the same issue, they show 100 of the “greatest” pictures of the universe.  Fewer than 20% are mapped color images by my estimate, and nearly all of those are from either ESO or NASA.  Images of emission nebulae from amateurs, with only a couple of exceptions, are limited to true-ish colors.  I take the message to be, “We like our nebulae red, unless you are a professional.  (It’s still okay to turn the saturation knob to 11 for galaxy images, though.)”

The question of ethics in image processing goes back to the beginning of photography, though electronic imaging certainly brings the issue to the fore.  Should we expect an image to exactly represent reality as we would perceive it?  The question comes up most frequently with fashion photography:  when a model’s skin is digitally airbrushed, is that an acceptable improvement or has a line been crossed?  But images have been enhanced for nearly as long as they have existed.  Ansel Adams extensively burned and dodged his prints, not only to increase the perceived dynamic range, but also for purely aesthetic reasons.  These were subjective alterations designed to emphasize the more interesting objects by creating artificial contrast.

The problem with such questions is that they are by definition subjective:  an image is never an accurate rendition of reality.  Cameras and lenses are nothing like our visual system, so photography’s relationship with reality can only be described on a spectrum (pardon the pun) that runs from “slightly inaccurate” to “completely fabricated.”  If we truly limited ourselves to staying close to the way our eye-brain system “sees” things, then nearly everything would be out of bounds.  No wide-angle or telephoto lenses.  No exposures longer than a second, so no images where motion is blurred.  The very fact that we use long exposures in astronomical imaging should disqualify every image, since the human eye integrates photons for less than a second.  Certainly any radio, X-ray, or ultraviolet images could not be allowed, and at best you could argue that they should be monochromatic.

The point is that images are not reality, but aesthetic representations of reality.

It seems that Mr. Berman’s primary issue is with mapped colors in narrowband images.  I see the point.  Not only are we assigning light from specific emission lines to different colors in the final image, but we also equalize these spectral lines so they are approximately equally bright, when in reality one of them (H-alpha) is typically far brighter than the others.  H-alpha, N-II, and S-II are all red.  O-III is blue-green.  That is the extent of our palette for the vast majority of emission objects in the sky, which, rendered as the eye would see it, amounts to red.  And visually, they are all too dim to trigger cone cells anyway, so they are all grey unless you can put an eyepiece in the Keck telescope (which would have too narrow a field, even if you could!).  The diagram below shows how the Hubble palette maps the narrowband emission lines to RGB color.
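In code, the standard Hubble (SHO) palette amounts to assigning S-II to red, H-alpha to green, and O-III to blue, with each line stretched to comparable brightness.  Here is a minimal sketch; the file names and the simple percentile normalization are assumptions, not a full processing recipe.

```python
# Hubble-palette (SHO) channel mapping: S-II -> R, H-alpha -> G, O-III -> B.
# File names are hypothetical; frames are assumed registered and aligned.
import numpy as np
from astropy.io import fits

def normalize(frame):
    """Rescale a frame to [0, 1] so each emission line contributes comparable signal."""
    frame = frame.astype(np.float64)
    lo, hi = np.percentile(frame, (0.1, 99.9))
    return np.clip((frame - lo) / (hi - lo), 0.0, 1.0)

sii = normalize(fits.getdata("sii.fits"))    # red channel
ha = normalize(fits.getdata("ha.fits"))      # green channel
oiii = normalize(fits.getdata("oiii.fits"))  # blue channel

rgb = np.dstack([sii, ha, oiii])  # mapped-color image, ready for stretching
```

The per-channel normalization is exactly the equalization described above: without it, H-alpha would overwhelm the much fainter S-II and O-III signals.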

For me, mapped color narrowband images reflect reality even better than a “true-color” image.  Nearly all of the visual spectrum energy from these objects is emitted in these three or four narrow wavelength slices, each one emitted by a different type of ionized gas.  Emission nebulae are not broad spectrum objects, so to render them the same way we would a galaxy, or for that matter a family photo, seems to fall short of their potential.  We have readily available filters to separate these emission lines, so why would we choose to render them in a way that doesn’t distinguish them in the final image?

I avoid any arbitrary adjustments, additions, or deletions to my images, other than fixing optical defects like dust motes.  (I think Apple’s deletion of M110 from an image of the Andromeda Galaxy for the OS X Lion wallpaper was clearly beyond the pale, but then again, commercial images have different goals.)  I don’t believe that mapping emission lines from ionized gases in a nebula is augmenting reality; it reveals structure that is quite real, but would otherwise be unresolved.  I think it is perfectly acceptable to create contrast in luminance or color that reveals and emphasizes what is actually there.  Just because it’s not visible at the eyepiece doesn’t make it a fake.
