Let me start off by saying that I love DeepSkyStacker (here referred to affectionately as “DSS”). I think Luc Coiffier is a hero to the astroimaging community for creating it and making it available for free. DSS does 95% of everything you could ever want to do when calibrating, aligning, and stacking images. The Drizzle algorithm is terrific. It has the most straightforward, easy-to-use interface around. Lately, I’ve been playing with PixInsight’s stacking tools, but I keep coming back to DSS for its simplicity and its image quality scoring, which makes determining which images to exclude from a stack so much less tedious.
I have been having an issue with it recently that I’d like to highlight to other users, though. This could be user error on my part, and I’d like to hear what I’m doing wrong if that’s the case. Googling around, I found that Jerry Lodriguss appears to have posted a similar issue to the DSS Yahoo board last February.
The issue I’m seeing is this: if you save images (“Save picture to file”) with “Apply adjustments to the saved image” setting selected, the data are limited to a 16-bit resolution, even if the file is saved at 32-bits. There is no problem if you use the autosave.tif file or save the image with “Do not apply adjustments to the saved image” setting selected. I didn’t notice this issue until I started focusing more on narrowband imaging, where even 20-minute exposures are exceedingly dim. I don’t usually use DSS for stretching my images, but at some point I experimented with it, and I left the “Apply adjustments…” box checked. Even though I haven’t used DSS’s histogram tool in months, the setting remained in the “Save picture to file” dialog box, and I was getting a lot of posterization in my final images.
Below is a screenshot from PixInsight of a DSS-stacked image of IC 2177 (12 10-minute exposures taken through an H-alpha filter), saved with the “Do not apply adjustments to the saved image” setting selected as a 32-bit integer FITS. Note the smooth histogram.
And here is a screenshot from PixInsight of the exact same image saved in DSS with the “Apply adjustments to the saved image” setting selected, also as a 32-bit integer FITS. Nothing was actually done with DSS’s histogram tool (“Apply” was never clicked, though it appears to apply the default stretch without clicking it). Note the posterization and combed histogram.
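That combed histogram is exactly what you’d expect if the pixel values were quantized to 16-bit steps before being written into a 32-bit container. Here is a minimal numpy sketch of the effect (my own illustration, not DSS code):

```python
import numpy as np

# Illustration only (not DSS code): smooth floating-point data versus the
# same data quantized to 65,536 levels but stored at 32-bit precision.
rng = np.random.default_rng(0)
smooth = rng.random(200_000)                   # stand-in for well-dithered stacked data

quantized = np.round(smooth * 65535) / 65535   # 16-bit steps in a 32-bit container

# The quantized data can take at most 65,536 distinct values, so any
# histogram with finer bins shows gaps -- the "combed" look described above.
print(len(np.unique(smooth)))     # ~200,000 distinct values
print(len(np.unique(quantized)))  # at most 65,536 distinct values
```

With real data the quantization also shows up as posterization: smooth gradients in the dim nebulosity collapse into visible steps once you stretch them hard.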
Once I get a free weekend, I need to go back through some of my recent images that show significant posterization and re-process them.
Again, DSS is amazing software that deserves to be on every imager’s computer, but this post is just a word of caution about what settings you use when you save your files.
This is a false color image of the region of Orion that would be his sword, hanging from the leftmost star of the belt, Alnitak. The concentration of nebulosity here makes this image almost an astroimaging cliche, but it’s certainly a cliche worth capturing. The image is about six degrees tall, but with the STL-11000M, such a wide field barely took two overlapping frames to cover. This was another test shot that turned out to be good enough to process. Ideally, I’d capture more data, but it’s been cloudy (or I’ve been traveling for work) for over two months since the original data were captured.
At the top are M42 and M43, possibly the most imaged nebula complex in the night sky. This makes sense: they are so bright you can see them with your naked eyes, and even a small telescope reveals detail. At the bottom are the much dimmer Horsehead Nebula and Flame Nebula. The whole region contains significant ionized gas. This image represents about nine hours of exposures taken over two nights. As usual, SII is mapped to red, H-alpha to green, and OIII to blue, with some further modification in Photoshop.

Image data:
Exposures: 15 x 600s Ha, 21 x 600s O-III, 22 x 600s S-II
Software: guiding by PHD, stacking in DeepSkyStacker
Processing: PixInsight, Photoshop CS3, modified Hubble palette
Telescope: Takahashi FSQ-106ED
Camera: SBIG STL-11000M with Astrodon 6nm narrowband filters, 2×2 binned
Mount: CGEM
January 7-8, 2013
After three years of hard work, I am proud to release my book on the process of astronomical imaging, The Deep-sky Imaging Primer. If you’d like to learn more about how to create deep-sky images, this book covers everything from optics and sensors to Photoshop and autoguiding. It’s packed with great information, starting with the fundamentals like signal and noise, moving through equipment considerations, and wrapping up with the details of image processing, including some start-to-finish examples. It is full-color throughout, with over 90,000 words and almost 200 illustrations.
The book is available for $40 on Amazon.com or through this blog for $35. Click here or on the thumbnail at right to learn more!
After taking the luminance exposures that were intended to be a test for the new STL-11000M, I couldn’t help but dig up some old DSLR exposures of M31 and merge their color data into the image.
This is 18×10 min luminance with an SBIG STL-11000M through an NP-101 at the native f/5.3, combined with color data from 169×4 min (11+ hrs!) taken with a Canon 450D through an 80 mm refractor at f/6 last November.
I recently purchased an SBIG STL-11000, and my biggest concern was whether buying a camera with such a large sensor was going to require me to buy a new telescope to cover that sensor. My main scope is a Televue NP-101 (the non-is version), which means I have to shoot through the 2″ focuser. While I’d trade up to the NP101-is or even a Tak FSQ-106 at the right price, let’s face it: those are very expensive upgrades.
Last night was first light with the new camera, and I’m posting this information to help others make a similar decision. Two important caveats here:
- Focus was a little off. I think I need to recalibrate FocusMax, and this was just meant to be a “first light” test of the camera. So compare the relative sharpness of the corners, not the overall sharpness.
- No polar alignment was done, so there is a little field rotation evident.
First is the question of whether the NP-101 can deliver sharp stars all the way out to the corners. Let’s have a look at the full image first.
This is 18 10-minute exposures taken through a luminance filter, stacked in DeepSkyStacker. When I loaded up the files in Photoshop, it was impressive to see their scale: 4000 pixels across! Now, let’s look at the corners:
This is sharp enough for me, especially with the field rotation evident. I am really impressed with the edge performance of the NP-101. At this point, I’m not seeing a need to upgrade to the FSQ or 101-is.
Second, we have to consider the light fall-off. I braced myself for considerable vignetting. Here is the master flat with levels on an 8-bit scale marked in green:
Again, to me this is acceptable performance, though less than ideal. There is about a one-third reduction in light at the corners versus the center. That’s a lot, but it’s not nearly as bad when you move just a little bit inward. I figure with the usual cropping and overlapping of frames, this won’t be much of an issue. It’s about the same level of vignetting I’ve seen on the ST-8300’s chip when using this scope at f/4.3 via the reducer. Careful processing there proved that the vignetting wasn’t a problem.
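To put numbers on that, and to show why the fall-off is recoverable, here is a quick sketch with illustrative values only (the levels and toy frames below are made up to roughly match the fall-off described above, not measured from my flat):

```python
import numpy as np

# Illustrative numbers only: levels on an 8-bit scale at the center and
# extreme corners of a master flat, roughly matching a one-third fall-off.
center_level = 210.0
corner_level = 140.0
print(f"corner fall-off: {1 - corner_level / center_level:.0%}")  # about 33%

# Flat-field correction removes the fall-off: divide each light frame by
# the master flat normalized to its mean. Toy 2x2 frames for illustration.
flat  = np.array([[140.0, 210.0],
                  [210.0, 140.0]])
light = np.array([[ 70.0, 105.0],
                  [105.0,  70.0]])   # same relative fall-off as the flat
corrected = light / (flat / flat.mean())
print(corrected)                     # all pixels equalized
```

The catch, of course, is that dividing by the flat amplifies noise in the dimmest corners along with the signal, which is why heavy vignetting still costs you some signal-to-noise even after a perfect correction.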
So what’s the verdict? The NP-101 is perfectly acceptable for use with the STL-11000. My guess is that the -is version with its larger focuser would perform better, but until I see a deal on one of those, I think I can happily image with this combination. If anyone has similar information for the STL-11000 with either of the Televue -is scopes or the Tak FSQ scopes, please post a comment for comparison.
Since the weather isn’t cooperating enough for me to capture more images of NGC 7000, let’s use the H-alpha channel image to show a few ways to create contrast in otherwise “flat” looking nebula images.
Here is the image basically right out of calibration, with only a single quick curve applied to stretch it a little. This is six hours of total integration time (20 minute subexposures).
It’s okay, but nothing stands out. The nebulosity has no definition. It’s more like fog than a distant cloud. Before anything else, the first step is to shrink the stars, because the unsharp mask filter I’ll use in subsequent steps will exaggerate them. You could move the stars to a separate layer, but this is a short example, and that would require another tutorial entirely. Here, all we do is run the Minimum filter (Filter > Other > Minimum) with a radius of 1. The original image size is about 3100 x 2400, so even the small stars can withstand the Minimum filter at this setting. Smaller images may need to be resampled to avoid the destruction of smaller stars.
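If you’d rather script this step, the Minimum filter is a grayscale erosion; here is a rough scipy equivalent (my own stand-in for the Photoshop filter, where radius 1 corresponds to a 3×3 window):

```python
import numpy as np
from scipy.ndimage import minimum_filter

# Photoshop's Minimum filter with radius 1 replaces each pixel with the
# minimum of its 3x3 neighborhood, eroding star edges by about one pixel.
img = np.zeros((7, 7))
img[2:5, 2:5] = 1.0                   # a 3x3-pixel "star"
shrunk = minimum_filter(img, size=3)

# Only the star's center survives; small stars shrink, tiny ones vanish.
print(int(shrunk.sum()))              # 1
```

This also shows why the warning about small images matters: a star only one or two pixels across is erased entirely by the erosion, which is why you may need to resample smaller images first.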
A look at the histogram reveals that the image isn’t taking up the full dynamic range, so we’ll use Curves to bring the white point down. We’ll also subtly shape the bottom end of the curve so we bring up the darkest parts of the nebula at the same time. This is why I always use Curves instead of Levels, even for simple adjustments.
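For the scripting-inclined, a curve is just a monotonic input-to-output mapping. Here is a sketch with made-up anchor points (mine, not the ones used on this image) that pulls the white point down and gently lifts the deep shadows:

```python
import numpy as np

# Hypothetical anchor points on a 0-1 scale: the white point is pulled
# down to 0.8 (everything above maps to full white), and the bottom of
# the curve is shaped to lift the faintest nebulosity slightly.
anchors_in  = [0.0, 0.10, 0.40, 0.80, 1.0]
anchors_out = [0.0, 0.14, 0.45, 1.00, 1.0]

def apply_curve(img):
    # np.interp is linear between anchors; Photoshop draws a smooth
    # spline through its points, but the idea is the same.
    return np.interp(img, anchors_in, anchors_out)

print(apply_curve(np.linspace(0.0, 1.0, 5)))
```

A Levels adjustment can only set the black point, white point, and a single gamma; a curve like this lets you shape the shadows and midtones independently, which is the reason given above for always preferring Curves.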
Now, let’s create contrast between the smaller structures in the image. The next image shows the results after unsharp mask (Filter > Sharpen > Unsharp Mask) with a radius of about 10 pixels, a threshold of 3, and an amount of about 150%. This was done to a layer containing a duplicate of the original image, and the layer was then set to the Soft Light blending mode. Every image will require different settings, but the beauty of the filter is that you can preview the results as you make adjustments to highlight the details you want. Because this filter reduced some of the darkest areas around the periphery more than I’d like, I used a layer mask to shield those areas from the effects of the filter.
Things are looking much better. The edges are better defined now, but I’d still like to create more contrast within the core of the nebula. So I’ll duplicate the image again to a new layer and apply the unsharp mask again, this time with a radius of 90 pixels. This creates contrast across larger structures.
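Both sharpening passes rely on the same trick: unsharp masking amplifies the difference between the image and a blurred copy of itself, and the blur radius sets the scale of the structures that get the contrast boost. A rough sketch (scipy here is my stand-in for Photoshop’s filter, without its threshold control):

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def unsharp_mask(img, radius, amount):
    """Boost local contrast at the given scale: a small radius (~10 px)
    sharpens fine detail; a large radius (~90 px) separates big structures."""
    blurred = gaussian_filter(img, sigma=radius)
    return np.clip(img + amount * (img - blurred), 0.0, 1.0)

# A soft edge between two brightness levels gets steeper after the filter:
img = np.full((1, 21), 0.3)
img[:, 10:] = 0.7
sharp = unsharp_mask(img, radius=2, amount=1.0)
```

The bright side of the edge gets brighter and the dark side gets darker, which is exactly the edge-definition effect visible in the screenshots, and also why the stars needed shrinking first.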
Finally, it’s time for one more application of curves. I’m still not happy with the contrast across the body of the nebula. A subtly shaped curve will allow me to choose the areas I want to differentiate. By holding down Ctrl while clicking the image, I can create anchor points, shown below as A, B, and C.
The image is in pretty good shape now, at least for a monochrome image:
Now, let’s hope the skies clear so I can capture some S-II and O-III images to combine with this one!
This is most of the nebula IC1396. IC1396 is huge, about the width of 10 full moons, and it lies in the constellation Cepheus. In astronomical terms, it’s pretty close at only 2000 light years away. The structure in the bottom right is the famous “Elephant’s Trunk,” (vdB 142) which is thought to be an active star-forming region. The largest black dust lane in the middle is Barnard 161.
[ASIDE: There is a tiny blue blotch directly below Mu Cephei, the big yellow star on the left. It sticks out like a sore thumb in the O-III images, is visible in the H-alpha images, but it's absent from the S-II, hence its blue color here. After a good bit of research, it appears that this semi-anonymous planetary nebula is known as PN G100.4+04.6. I can find very little else about it other than the basic stats in SIMBAD. If anyone knows more, let me know.]
As with most of my recent images, this one combines images taken through narrowband filters into a mapped color image where the Sulfur-II emission line is mapped to red, Hydrogen-alpha (+Nitrogen-II) to green, and Oxygen-III to blue.
Exposures: 13 x 1200s Ha, 22 x 1200s O-III, 12 x 1200s S-II (15h 40m total)
Software: guiding by PHD, stacking in DeepSkyStacker
Processing: Photoshop CS3, modified Hubble palette
Telescope: Televue NP101 with 0.8x reducer (at about f/4.3)
Camera: SBIG ST-8300M with Baader standard narrowband filters, 2×2 binned
October 17, 20, and 21, 2012