## The image that wasn’t to be: Comet 21P

After two years in the UK, I’m living in the US again and, more importantly, *imaging* again. I don’t know how the UK imagers do it: not once in my two years there did I get a chance to set up.

I made the mistake of introducing too many new variables on my first night back out. A new NUC computer is controlling things now, and I was trying new software, APT, for the first time. So of course things went wrong. It’s a shame, because the alignment I was trying to catch only existed for about 2 hours. Comet 21P/Giacobini–Zinner would be passing right through a triangle of objects that consisted of M35, IC443 (the Jellyfish Nebula), and Sh2-252 (the Monkey Head Nebula).

Long story short, I got three 4-minute exposures before the mount refused to track anymore. I was so despondent that I didn’t take any flats or darks, so this is a pretty raw process, but for what it’s worth, here’s what I got. Oh, the image this could have been with more time… I hope someone out there captured this alignment better than I did.

Three 4-minute exposures. Modified Canon 6D, William Optics Star71, Takahashi EM-200.

## The SECOND EDITION is finally here!

I’m proud to announce that the second edition of *The Deep-sky Imaging Primer* is now available. It is a substantial update to the first edition, with revised and expanded text and over 325 illustrations. Printed in full color, it covers everything you need to know to capture stunning images of deep-sky objects with a DSLR or CCD camera:

- The fundamental concepts of imaging and their impact on the final image
- How to pick a telescope and camera
- How to get set up and take the images
- Where and when to find the best objects in the night sky
- How to process images using Adobe Photoshop® and PixInsight®
- Start-to-finish examples of image processing

Complete coverage of PixInsight® has been added, with workflows for both PixInsight and Photoshop®. There are also two new start-to-finish processing examples. The early chapters have been revised, with clearer explanations of important topics like noise and resolution. And a new appendix provides details of the best imaging targets for northern hemisphere imagers. Overall, it’s a bigger, more readable book, with a lot more content.

Click here or the image below to order the book from Amazon.

## Supernova Remnant G65.3+5.7

This image of the huge supernova remnant known as G65.3+5.7 in southern Cygnus, just north of Albireo, is the hardest imaging project I’ve undertaken. It is an eight-panel mosaic of over 32 megapixels, representing more than 92 hours of exposure time. The field of view shown is over 5° across. G65.3 is very dim and emits mostly in OIII. As crazy as it sounds after 92 hours of imaging, I wish I’d gone deeper, since that’s only about 11 hours per panel.

I used OIII, Ha, and green filters. The green exposures were only three hours of the total, but they helped to capture star color and preserve the violets where the OIII and Ha areas overlap.

The star field in this part of the Milky Way is insanely dense, and the background also has a thin soup of H-alpha emission everywhere. Both of these, in addition to bringing together eight panels in three channels, made processing a real challenge. I processed it four separate times before I got a result I was pleased with.

The Sharpless catalog lists three of the brightest (relatively speaking) areas in the Ha channel: Sh2-91 in the lower left, Sh2-94 at center right, and the faint Sh2-96 above it. The OIII filaments connect these, but to my knowledge they are not individually cataloged.

**Exposures**: 8-panel mosaic (cropped here) consisting of 42 hours of H-alpha, 47 hours of OIII, and 3 hours of green exposures, all binned 2×2. Total exposure time: 92 hours 4 minutes
**Telescope**: Two William Optics Star71s (360mm f/5)
**Cameras**: SBIG ST-8300M and QSI583
**Mount**: Takahashi EM-200
**Guiding**: QHY 5L-II mono, guided using PHD2
**Processing**: PixInsight and Photoshop

Clicking the image below will open the full-resolution 32-megapixel version (15 MB):

## The Matterhorn at Night

While I’m still working on processing multiple deep-sky images as well as the new book, let’s take a brief detour to Switzerland. My vacation to Zermatt was unfortunately timed with the full moon, so the skies were not as dark as I’d like, but the rising moon did nicely light up the east face.

Samyang 135mm on Canon 6D (2 seconds at f/2.0, ISO 3200)

I had hoped to capture the full moon *with* the mountain, but to get the right angle, I would have had to hike into the mountains around 3 am. Needless to say, that did not happen.

## What part of the sky spends the most time in darkness?

*SUMMARY: I set out to answer a question: considering the perspective of everyone on Earth and accounting for the seasonal changes in daylight, which spots in the night sky are least and most visible? The incremental steps to get there illustrate the effect of latitude and the seasons on the night sky. The final map is shown above, but for those interested, a full explanation with code snippets follows.*

When I moved to the London area from the US last year, I noticed that not only were the summer nights shorter than in New Jersey, but twilight seemed to go on forever. London is at about 52° N, so for a few days around the summer solstice, the sky never gets fully dark. (Astronomical darkness formally begins when the sun is 18° below the horizon; throughout this piece I use 15°, which in my experience is dark enough for imaging.)
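
To see why 52° N sits right on the edge, here is a back-of-the-envelope check (my own approximation, separate from the mapping code below): at local midnight the sun’s altitude is roughly its declination minus (90° − latitude), and at the June solstice the sun’s declination is about +23.4°.

```r
# Approximate solar altitude at local midnight, in degrees.
# sunDecDeg defaults to the sun's declination at the June solstice.
midnight_sun_alt <- function(latDeg, sunDecDeg = 23.44) {
  sunDecDeg - (90 - latDeg)
}
midnight_sun_alt(52)  # about -14.6: the sun never quite reaches 15 degrees down
midnight_sun_alt(40)  # about -26.6: fully dark at mid-northern US latitudes
```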

So I wondered how much imaging time I get each year for each area of the sky (ignoring weather, of course). And beyond my own view of the sky, what does the picture look like when you consider the view from everywhere on the planet? Obviously, the summer sky isn’t in darkness as long as the winter sky, but how big is the difference? And to take the question a level further, what happens when you consider where people live on Earth?

I’m sure someone really talented in spherical trigonometry could solve this analytically, but I set out to solve it numerically. I had a lot of R code that I’d written to create the maps in *The Astrophotography Sky Atlas*, so how hard could it be? It turns out that it was a good bit of effort for such an unremarkable question. (At least 30 hours, including this write-up.)

Let’s start with my assumptions. A point in the sky is considered “image-able” when:

- It’s fully dark: the sun is at least 15° below the horizon
- The point is at least 20° above the horizon. Lower than that, and you’re looking through nearly three airmasses.
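
The 20° cutoff in airmass terms, using the simple plane-parallel approximation airmass = sec(zenith distance). (Refraction-aware formulas differ slightly near the horizon, but not enough to matter here.)

```r
# Airmass from altitude, via the secant of the zenith distance.
airmass <- function(altDeg) {
  1 / cos((90 - altDeg) * pi / 180)
}
airmass(20)  # about 2.9 airmasses at the cutoff
airmass(90)  # 1 airmass at the zenith
```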

First I had to write and test R code to do all the positional astronomy calculations. This turned out to be non-trivial, but after a couple of evenings of reading and testing, I had functions that returned values comparable to online ephemerides. The sunrise and sunset values I get differ by a few minutes from various online sources, so there are probably more precise equations available, but let’s call these good enough. I’ve posted snippets of them here for critique and transparency. These are all of the helper functions I wrote.

```r
SunCoords <- read.csv("SunRADec2017.csv")  # sun coordinates for the year, expressed in decimal degrees

# Calculates the time the sun is at a given altitude. Default is sunset = descending sun.
CalcSunRiseSetTime <- function(lat, day, altitude, sunset = TRUE) {
  d <- day + 6208.5  # 6208.5 days from Jan 1 2000 noon to Jan 1 2017 0:00, the year I'm using here
  ST0 <- GMT2GMSThours(d) * 15  # sidereal time at long=0 at midnight, expressed in degrees
  # modulus handles discontinuity near equinox
  EoT <- ((SunCoords[day, ]$RADegrees - ST0 - 0) / 15) %% 24  # "Equation of time" at long=0, in hours
  # semi-diurnal arc
  cosH <- (sin(altitude / 57.2957795) -
           sin(lat / 57.2957795) * sin(SunCoords[day, ]$DecDegrees / 57.2957795)) /
          (cos(lat / 57.2957795) * cos(SunCoords[day, ]$DecDegrees / 57.2957795))
  # handle cases near poles: if cosH > 1, the sun never rises; if cosH < -1, it never sets
  # use ifelse so the function is vectorized
  if (sunset) {
    r <- ifelse(cosH > 1, 0.001,
         ifelse(cosH < -1, 23.999, EoT + (acos(cosH) * 57.2957795 / 15)))
  } else {
    r <- ifelse(cosH > 1, 23.999,
         ifelse(cosH < -1, 0.001, EoT - (acos(cosH) * 57.2957795 / 15)))
  }
  r  # time in decimal hours
}

CalcObjectRiseSetTime <- function(lat, day, RA, Dec, altitude, set = TRUE) {
  # RA and Dec of object in degrees
  cosLHA <- (sin(altitude / 57.2957795) - sin(lat / 57.2957795) * sin(Dec / 57.2957795)) /
            (cos(lat / 57.2957795) * cos(Dec / 57.2957795))
  if (set) {
    if (cosLHA > 1) return(0.001)
    if (cosLHA < -1) return(23.999)
    SiderealAnswer <- RA + (acos(cosLHA) * 57.2957795)  # in degrees
  } else {
    if (cosLHA > 1) return(23.999)
    if (cosLHA < -1) return(0.001)
    SiderealAnswer <- RA - (acos(cosLHA) * 57.2957795)  # in degrees
  }
  # convert Greenwich sidereal time to GMT
  GMST2GMT(day, SiderealAnswer / 360 * 24)
}

GMST2GMT <- function(day, ST) {  # input day of 2017 and sidereal time in hours ***why is it up to 8 minutes off?
  # return value in hours; 2017 starts at 0.277 day sidereal time, declines by 3m56s each day
  ((ST / 24 - 0.277 - day * 0.00273042) * 24) %% 24
}

GMT2GMSThours <- function(GMTday) {  # input days and fraction since J2000.0
  (18.697374558 + 24.06570982441908 * GMTday) %% 24  # sidereal time at long=0, expressed in hours
}

TimeInNightSky <- function(lat, day, RA, Dec, MinObjectAltitude, SunAltitude) {
  ObjectRise <- CalcObjectRiseSetTime(lat, day, RA, Dec, MinObjectAltitude, FALSE)
  ObjectSet  <- CalcObjectRiseSetTime(lat, day, RA, Dec, MinObjectAltitude, TRUE)
  SunRise <- CalcSunRiseSetTime(lat, day, SunAltitude, sunset = FALSE)
  SunSet  <- CalcSunRiseSetTime(lat, day, SunAltitude, sunset = TRUE)
  if (ObjectRise > ObjectSet) {
    time <- min(ObjectSet, SunRise) + 24 - max(ObjectRise, SunSet)
  } else {
    time <- max(0, SunRise - ObjectRise) + max(0, ObjectSet - SunSet)
  }
  time
}

CalcEoT <- function(day) {
  d <- day + 6208.5  # 6208.5 days from Jan 1 2000 noon to Jan 1 2017 0:00
  ST0 <- GMT2GMSThours(d) * 15  # sidereal time at long=0 at midnight, expressed in degrees
  EoT <- ((SunCoords[day, ]$RADegrees - ST0 - 0) / 15) %% 24  # "Equation of time" at long=0, in hours
  # EoT <- ifelse(abs(EoT) > 12, EoT + 24, EoT)  # handle discontinuity near vernal equinox
  EoT
}
```

With these functions, R can easily plot a few interesting things to verify the code makes sense. One is the Equation of Time.
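
For anyone who wants to check CalcEoT without the SunCoords table, a standard closed-form approximation (a textbook formula, not part of my code) gives the Equation of Time in minutes:

```r
# Low-precision approximation of the Equation of Time, in minutes.
# day 1 = January 1. Good to about a minute; enough to verify the plot's shape.
eot_minutes <- function(day) {
  b <- 2 * pi * (day - 81) / 365
  9.87 * sin(2 * b) - 7.53 * cos(b) - 1.5 * sin(b)
}
eot_minutes(42)   # mid-February minimum, around -14 minutes
eot_minutes(307)  # early-November maximum, around +16 minutes
```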

Sunrise and sunset times (astronomical twilight times, actually) for a year look pretty good, too. This is sunrise and sunset for every day of the year at 25° N, not accounting for Daylight Saving Time, of course.

The functions appear to work well, so I iterated over them to calculate a single night’s sky (April 15th) from London, to see what parts get the most image-able hours.

```r
# calculate map data for one night at one location
lati <- 52     # latitude to use (London)
minAlt <- 20   # lowest imageable altitude
# only compute values for Dec within range of imageability for this latitude
lowDec  <- max(-89, lati - 90 + minAlt)
highDec <- min(89, 90 + lati - minAlt)
decSequence <- seq(lowDec, highDec, 2)
RASequence  <- seq(1, 365, 2)
timegrid <- data.frame(matrix(0, nrow = 180, ncol = 365))
for (r in RASequence) {
  for (d in decSequence) {
    # day 105 = April 15; -15 is the position of the sun at full darkness
    timegrid[d + 90, r] <- timegrid[d + 90, r] + TimeInNightSky(lati, 105, r, d, minAlt, -15)
  }
}

# now convert the dataframe above into a format readable by the plotting code
timemap2 <- data.frame(RA = numeric(), Dec = numeric(), time = numeric())
for (x in RASequence) {          # only look where values were loaded into the matrix
  for (y in decSequence + 90) {  # the +/-90 handles negative Dec values, which are invalid matrix indexes
    temp <- data.frame(x, y - 90, timegrid[y, x])
    colnames(temp) <- c("RA", "Dec", "time")
    timemap2 <- rbind(timemap2, temp)
  }
}
# flip the RA axis
timemap2$RA <- 365 - timemap2$RA
```

Then I plotted the resulting data (using code not shown):

For comparison, I also plotted London’s sky at the winter and summer solstices:

You can see that in the winter, London provides plenty of dark sky time (though it’s rarely clear), but near the summer solstice the sun never gets 15° below the horizon!

For comparison, let’s calculate the same chart for Miami (at 25° N) on April 15th to see how things change. (Because the Earth rotates, longitude is irrelevant to these calculations.)

You can see how the longer spring nights in Miami, compared to London, give you more dark sky time this time of year. The opposite is true for winter, where the nights are looooong in London.

Now that I had one night, it was time to sum all of the nights for a year. Even on a fairly fast computer, this took some time, so I had to go back and see if I could make the code more efficient. R is an interpreted language, but highly optimized for vectors. Unfortunately, this code required several “for” loops, which R does not handle quickly. There is probably an elegant and fast solution with dplyr or something, but I couldn’t make it work. Using a resolution of 2×2° squares, it still took 40-80 minutes to crunch the numbers for each map, depending on latitude.
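
A toy example of the difference (not my actual mapping code): summing a per-day function with a loop versus one vectorized call gives identical results, and the vectorized form is the kind of thing R executes quickly.

```r
# Stand-in for a per-day "hours in darkness" function
f <- function(day) pmax(0, sin(2 * pi * day / 365))

loop_sum <- 0
for (d in 1:365) loop_sum <- loop_sum + f(d)  # slow: 365 interpreted iterations

vec_sum <- sum(f(1:365))                      # fast: one vectorized call

all.equal(loop_sum, vec_sum)                  # identical answers
```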

Here is the map from 50° N.

How do the maps look near the equator? And what about near the poles? Here are maps at 70°N, 20°N, and the equator.

You can see that the seasonal changes are greater the further you move away from the equator.

Now that everything was working, I generated full-sky data for latitudes of 80° N down to 80° S in 2.5° increments. This took about 60 hours of computer time (mostly while I was sleeping or at work). I’m sure there is a more elegant and faster way to do it in R, but again, I used for loops. This gave me the data I needed to calculate the “average” view from everywhere on Earth.

```r
### avoid rbind by using a matrix, then re-shape the matrix back into a
### dataframe the plotting code can read
for (lati in seq(-80, 80, 2.5)) {
  minAlt <- 20  # lowest imageable altitude
  # only compute values for Dec within range of imageability for this latitude
  lowDec  <- max(-89, lati - 90 + minAlt)
  highDec <- min(89, 90 + lati - minAlt)
  decSequence <- seq(lowDec, highDec, 2)
  RASequence  <- seq(1, 365, 2)
  timegrid <- data.frame(matrix(0, nrow = 180, ncol = 365))
  for (r in RASequence) {
    for (d in decSequence) {
      for (date in 1:365) {
        # -15 is the position of the sun at astronomical twilight
        timegrid[d + 90, r] <- timegrid[d + 90, r] + TimeInNightSky(lati, date, r, d, minAlt, -15)
      }
    }
  }
  timegrid <- timegrid / 365  # get the average

  # convert the grid into a dataframe of the right format for the mapping code
  timemap2 <- data.frame(RA = numeric(), Dec = numeric(), time = numeric())
  for (x in RASequence) {          # only look where values were loaded into the matrix
    for (y in decSequence + 90) {  # the +/-90 handles negative Dec values
      temp <- data.frame(x, y - 90, timegrid[y, x])
      colnames(temp) <- c("RA", "Dec", "time")
      timemap2 <- rbind(timemap2, temp)
    }
  }
  # flip the RA axis, since the map is printed from 24 to 0 hrs
  timemap2$RA <- 365 - timemap2$RA
  # save each latitude as a separate file
  fname <- paste("NightMap Lat=", lati, ".Rda", sep = "")
  save(timemap2, file = fname)
}
```

Bringing all of my maps together, this is the view from everywhere on earth (technically, everywhere between latitudes 80° N and 80° S):

Two things stand out:

- The contours are a little jagged. This is a result of running the calculations for latitudes at 2.5° increments. Increasing the “latitude resolution” would further smooth them.
- The hemispheres aren’t completely symmetrical. Summer in the southern hemisphere seems to have a slightly greater impact on the southern night sky than the northern hemisphere’s summer does on the northern sky. At first I thought it could be an error in my equations (and it could indeed be). But I think it’s the effect of Earth’s elliptical orbit. The changes in sunlight on the days near the solstices are not symmetrical (see the shape of the analemma for proof). Someone more skilled in positional astronomy could surely offer a clearer explanation or show me I’m wrong.

All in all, I think it’s a pretty neat map. The winter sky clearly gets more time in darkness, and the southern hemisphere has more extreme differences between summer and winter. Surprisingly, the area of sky with the least good imaging time on average is just south of the most-imaged object in the night sky: M42 in Orion. And the area with the most imaging time available on average is the stinger of Scorpius, where there is a slew of beautiful objects.

But we can take this even further. You may object that people are not evenly distributed by latitude. Indeed, that is why you see so many images of M42 compared to the Cat’s Paw Nebula, which gets more hours in darkness. A proper map of “image-able” time for the night sky would incorporate where people actually live. First, I had to get the population data, which had to be distilled from a much larger data set.

The population data from SEDAC* come in raster format, so it took a little more code to get a simple percentage of the population at each latitude. (Note that in the code here, I cut the data slightly differently, so the bins would be centered on 5° increments.)

```r
library(rgdal)
library(sp)
library(raster)

GPW <- raster("gpw-v4-population-count-adjusted-to-2015-unwpp-country-totals_2015.tif")
# aggregate data across longitudes and into groups by latitude
popPerLatDegree <- aggregate(GPW, c(43200, 1200), fun = sum)
popPerLatDegree <- as.data.frame(popPerLatDegree, xy = TRUE)
names(popPerLatDegree)[3] <- "pop"  # shorten column name
popPerLatDegree$x <- NULL           # longitude is no longer needed; there's only one value now
popPerLatDegree[is.na(popPerLatDegree$pop), ]$pop <- 0  # replace the NAs with zeros
# trim the top two degrees off so the center of each group of five degrees
# (approximately) aligns with 0, 5, 10, etc.
popPerLatDegreeTrimmed <- popPerLatDegree[3:142, ]

# sum every five degrees into one value
popPerFiveDegree <- data.frame(centralLat = numeric(), pop = numeric())
for (n in seq(1, nrow(popPerLatDegreeTrimmed), 5)) {
  total <- 0
  for (y in 0:4) {
    total <- total + popPerLatDegreeTrimmed[n + y, ]$pop
  }
  tmp <- data.frame(popPerLatDegreeTrimmed[n + 2, ]$y, total)
  colnames(tmp) <- c("centralLat", "pop")
  popPerFiveDegree <- rbind(popPerFiveDegree, tmp)
}

# calculate the proportion in each band
worldPop <- sum(popPerFiveDegree$pop)
popPerFiveDegree$pop <- popPerFiveDegree$pop / worldPop
```

The following table shows the percent of the world’s population by latitude:

A couple of things are immediately apparent:

- Only 13% of the world’s population lives in the southern hemisphere.
- 49% of the world’s people live in the range from 20 to 40° N.

Okay, now that we know where people live, we can re-create the sky map weighted by population.
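
The weighting step itself is just a population-weighted average of the per-latitude grids. A minimal sketch with toy 2×2 matrices standing in for the real RA/Dec grids (the real code loads the saved .Rda files and uses popPerFiveDegree$pop as the weights):

```r
# Per-latitude time grids (toy values in place of the real loaded maps)
maps <- list(matrix(c(1, 2, 3, 4), nrow = 2),
             matrix(c(2, 2, 2, 2), nrow = 2),
             matrix(c(0, 1, 0, 1), nrow = 2))
weights <- c(0.2, 0.5, 0.3)  # population share per latitude band; sums to 1

# Scale each grid by its weight, then sum them element-wise
weighted <- Reduce(`+`, Map(`*`, maps, weights))
```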

Sadly, it turned out to be pretty uninteresting. The results are simpler than I expected because the distribution of human population is so skewed that its effect swamps any seasonality. The final plot mostly reflects the mid-northern latitudes, with only a tiny contribution from other latitudes. The further south you go in declination, the fewer people there are to see it. The short answer to my original question is that the most and least image-able parts of the sky are, basically, near the north and south celestial poles, respectively.

* Population data from: Center for International Earth Science Information Network – CIESIN – Columbia University. 2016. Gridded Population of the World, Version 4 (GPWv4): Population Count Adjusted to Match 2015 Revision of UN WPP Country Totals. Palisades, NY: NASA Socioeconomic Data and Applications Center (SEDAC). http://dx.doi.org/10.7927/H4SF2T42. Accessed 20 April 2017.

## The Insight Astronomy Photographer of the Year Exhibition

It’s rare that you see much about astrophotography in the popular press or in museums, but this weekend I had the chance to see the exhibit for The Insight Astronomy Photographer of the Year contest. This contest is in its fifth year, drawing entries from around the world. The winners are not only published in a beautiful yearbook, but they are also exhibited at the Royal Observatory in Greenwich, UK. The exhibit is part of the free area, though if you are into astronomy, you may want to pay the fee to see the nearby observatory museum, where luminaries like Airy and Flamsteed worked. (Of course, you also get to stand on the prime meridian line, which is pretty cool too.)

As you can see, the exhibit was popular with the public, which is heartening.

The winning images were printed and displayed on lightboxes, which really made them look fantastic. They also had a few video clips with interviews of the photographers playing. A few close-ups are below.

Overall, there seemed to be a bias toward nightscape images over deep-sky images, which makes sense, as nightscapes give people a sense of their place in the universe, but I would personally have liked to see more deep-sky work. Two deep-sky images (not shown above), by Pavel Pech and Rolf Wahl Olsen, were particularly impressive.

Each image listed the technical details and explained what you were looking at. Like the book, the exhibit quality was very high, and I know I came away inspired to get out with my camera more.

## Lecture: Understanding Signal, Noise, and Resolution

Last night, I was honored to present a lecture to the New Jersey Astronomical Association’s imaging meeting. It’s a great group of people, and I’m sorry that I’m only connecting with them now. Special thanks to Mike Franzyshen and Jim Roselli for the invitation.

In about an hour, I cover a broad range of related topics:

- The statistics of shot noise, including the connections between Siméon-Denis Poisson and the particle theory of light
- Signal-to-noise ratio, with examples
- The effects of skyglow, with examples
- Resolution and sampling
- Aperture, focal length, and focal ratio

It was a lot of fun to put this lecture together, and during the research for it, I uncovered how Poisson’s life led to multiple insights about photons, none of which he was able to appreciate during his lifetime.

NOTE: At 40:30, I second-guessed myself, but the math on the slide is correct as written. The additional read noise is indeed 32. Since read noise is 10 per exposure (per the previous slide), ten exposures add in quadrature: SQRT(10 × 10²) = SQRT(1000) ≈ 31.6, which rounds to 32.
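
The same arithmetic in one line, for anyone who wants to check it:

```r
# Read noise adds in quadrature: ten exposures at 10 electrons each.
read_noise <- 10
n <- 10
total_read_noise <- sqrt(n * read_noise^2)  # sqrt(1000), about 31.6, which rounds to 32
```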