SkyShot - Volume 1, Issue 1: Autumn 2020
The inaugural issue of SkyShot, an online publication promoting understanding and appreciation of outer space. As an international community, we share the work of undergraduate and high school students through a multidisciplinary, multimedia approach. Features research papers, astrophotography, informative articles, guides, and poetry in astronomy, astrophysics, and aerospace.
Autumn 2020
Volume I, Issue I
skyshotonline.com
SkyShot
Astronomy Made Accessible
The History and Potentially Imminent Explosion of Betelgeuse p. 10
Research: Analysis of Supernova SN 2019hyk p. 12
Computational Astrophysics Advancements of 2020 p. 39
Advancements in Aerospace p. 44
Feature: Comet NEOWISE Recap and Astrophotography p. 8
Guide: Astrophotography on the Cheap p. 20
Poetry: “method of gauss” and “starry dreams” p. 51
SkyShot Autumn 2020
Letter from the Editing Team
On behalf of the editing team, welcome to the first publication of SkyShot! Our
goal with this magazine is to launch a platform where students can share their space-related
works with a wide audience. In this inaugural issue, we are excited to spotlight
some great work that our contributors have created, with a wide range of topics
covering research, articles, poetry, astrophotography, and more. We sincerely thank all
contributors for sharing their passion and hard work with us. We hope you enjoy this
season’s publication, and we look forward to growing this platform with all of you!
Happy reading!
The SkyShot Team
Editing Team
Founder: Priti Rangnekar
President: Rutvik Marathe
Executive Directors:
Naunet Leonhardes-Barboza
Victoria Lu
Carter Moyer
Vighnesh Nagpal
Ezgi Zeren
Head of Astrophysics: Andrew Tran
Head of Content Management: Anavi Uppal
Head of Peer Review: Alexandra Masegian
Peer Reviewer: Feli X
Front and back cover photo: Rosette Nebula (NGC 2244) © Jonah Rolfness
Founder’s Note
Against all odds amidst a pandemic, 2020 has been a remarkable year for
astronomy, astrophysics, and aerospace alike. As the first crewed launch from U.S.
soil in nearly a decade, the Crew Dragon Demo-2 ushered in high spirits. Around
the world, astronomers embarked on a quest to capture the magnificence of Comet
NEOWISE. During the autumn, three phenomenal scientists earned the Nobel
Prize in Physics for their research in black holes and cosmology. Time and time
again, the space sciences prove their unique worth as a force for uniting humanity.
Simultaneously, I was inspired by the sheer diversity of space-related endeavors
pursued by our young generation. As my exoplanet research group pored
over online transit data, the essence of teamwork transcended boundaries of
time zones or location. I saw sparks of joy whenever fellow Summer Science
Program alumni discussed advice about imaging nebulas or shared their space-inspired
arts. It was clear that students deserved a platform that showcased their
efforts in space science while serving as a welcoming, nurturing community.
An initiative consisting of five projects, “Science Connect” was founded with
the mission of amplifying multidisciplinary education through opportunities in
hands-on problem-solving and communication. SkyShot became our flagship
project as a stunning paragon of collaboration and fusion of disciplines.
After all, the universe is not based upon arbitrary divisions in academic fields - it
simply exists, and we strive towards appreciation and understanding. Outer space
presents an unparalleled juxtaposition of “science” and “art.” On one hand, the
night skies inspire humility and awe at the vast expanse above. On the other, we
are compelled to ask fundamental questions regarding astrophysical processes
and embrace the final frontier through engineering advances. Above all, our innate
curiosity and connection with outer space drive humankind to reach its full potential.
As you read SkyShot, immerse yourself in the multifaceted nature of the cosmos.
Engaging with the wonders of outer space is for everyone, regardless of demographics,
background, or academic interest. I hope you can find and reflect upon your unique
avenues of doing so. Together, we will launch into a new era of unbridled intellectual
growth and exploration.
- Priti Rangnekar
6 Astronomical Sightseeing in 2020
Abby Kinney
8 Comet NEOWISE and NEOWISE Astrophotography
Victoria Lu, Anavi Uppal, and Owen Mitchell
10 The History and Potentially Imminent Explosion of Betelgeuse
Alexandra Masegian
12 Photometric and Spectroscopic Analysis of the Type II-P Supernova SN 2019hyk
Sofia Fausone, Timothy Francis Hein, Anavi Uppal, and Zihang Wang
16 On Methods of Discovering Exoplanets
Vighnesh Nagpal
20 Astrophotography on the Cheap: A Guide
Cameron Woo
22 Astrophotography Gallery
Ryan Caputo, Owen Mitchell, Jonah Rolfness, Nathan Sunbury, Anavi Uppal, Cameron Woo, Wilson Zheng
28 Removing Noise and Improving the Quality of Astronomical Observations with Fourier Transforms
Ezgi Zeren
39 Computational Astrophysics Advancements of 2020
Priti Rangnekar
44 Advancements in Aerospace
Rutvik Marathe
46 Access to Space
Carter Moyer
In This Issue
48 Understanding the Chronology of the Universe
Andrew Tran
51 2002 KM6 (99795)
Naunet Leonhardes-Barboza
51 method of gauss
Naunet Leonhardes-Barboza
52 tessellated constellations
Alex Dong
53 starry dreams
Alex Dong
54 images of the past
Naunet Leonhardes-Barboza
55 unseen skies
Alex Dong
56 Astrophotography Index
59 Educational Opportunities
60 Contributor Biographies
Astronomical Sightseeing in 2020
Abby Kinney
Mars at its Best
This autumn has been a great time to view Mars. On October 13th, the planet reached opposition, meaning it appeared opposite the Sun in the sky and reached its highest point at midnight.
However, Mars was a spectacular sight earlier this season as
well. Mars dramatically increased in brightness, eventually surpassing Jupiter’s on September 24th, and reached a magnitude
of -2.6 at opposition. Additionally, the apparent size of
Mars’ disk was approximately 22 arcseconds [2].
Perhaps you’re reading this in disappointment remembering
the last Mars opposition in 2018 when Mars reached a magnitude
of -2.8 and disk size of about 24 arcseconds [1].
But for Northern Hemisphere observers this 2020 opposition
had something else going for it: Mars’s altitude in the sky was
much higher. In 2018, Mars appeared in the constellation of Capricornus.
But this year, Mars appeared in Pisces, which is farther north in the sky [2].
Whether you were viewing the red planet with your eyes, binoculars,
or telescopes, it fulfilled its promise of being a breathtaking
fall sight.
Source: EarthSky Community Photos.
Citations
[1] Dickinson, David. “Enter the Red Planet: Our Guide to
Mars Opposition 2018.” Universe Today, 18 July 2018, www.universetoday.com/139420/enter-the-red-planet-our-guide-to-marsopposition-2018/.
[2] Lawrence, Pete. “Mars Opposition 2020: Get Ready to Observe
the Red Planet.” Skyatnightmagazine, 24 Aug. 2020, www.
skyatnightmagazine.com/advice/skills/how-to-observe-mars/.
[3] Pasachoff, Jay M. Stars and Planets. Houghton Mifflin, 2000.
Halloween’s Blue Moon
Everyone’s heard the phrase “once in a blue moon,” but
this Halloween, we had a chance to see such a moon. However,
the term “blue moon” is actually quite misleading.
First, blue moons are not necessarily blue. Rather than indicating
a color, a “blue” moon usually refers to the second
full moon of a calendar month [1].
But this now-common definition was actually a mistake
created by the misinterpretation of the Maine Farmer’s Almanac
in a 1946 Sky and Telescope article titled “Once in a
Blue Moon.” The mistake was then popularized by a reading
of this article on the radio program StarDate. Eventually,
this new definition even found its way into the popular
game Trivial Pursuit [2].
The Maine Farmer’s Almanac defined a “blue moon” as
the third full moon of a season with four full moons. This
may seem like a distinction without a difference, but if we
look at this Halloween blue moon, it was actually the second
full moon of Autumn. Thus, by the almanac’s definition, it would not be a blue moon. In fact, the next seasonal
blue moon is in August of 2021, despite that month having
only one full moon.
In addition to lacking their eponymous blue hue, blue
moons are not as uncommon as the ubiquitous phrase
would suggest. Because most calendar months are longer
than the moon’s approximately 29.5 day synodic period
(the period of its phases), a blue moon can occur if the first
full moon was sufficiently early in the month. Thus, a blue
moon typically happens every two to three years and will always
occur on either the 30th or 31st [1]. While blue moons
in general are fairly common, a blue moon on Halloween is
more special: the last blue moon to fall on October 31st was
in 1974, and it won’t happen again until 2039 [4].
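The timing argument above can be sketched numerically: step forward from a known full moon by the Moon's mean synodic period (29.530588 days) and count full moons per calendar month; any month with two is a "blue moon" month by the common definition. This is only a rough sketch, since real lunations vary by several hours, so it is reliable only within a few months of the reference date. The reference full moon used here (October 1, 2020, roughly 21:05 UTC) is taken from published almanac data.

```python
from datetime import datetime, timedelta
from collections import Counter

SYNODIC_DAYS = 29.530588  # mean lunar synodic period, in days

def full_moons(reference: datetime, count: int):
    """Approximate full-moon datetimes by stepping the mean synodic
    period forward from a known reference full moon."""
    return [reference + timedelta(days=SYNODIC_DAYS * n) for n in range(count)]

def blue_moon_months(reference: datetime, count: int):
    """(year, month) pairs containing two full moons: the common 'blue moon' rule."""
    per_month = Counter((fm.year, fm.month) for fm in full_moons(reference, count))
    return [ym for ym, n in per_month.items() if n >= 2]

# Reference: the full moon of 2020-10-01 precedes the Halloween blue moon.
print(blue_moon_months(datetime(2020, 10, 1, 21, 5), 4))  # → [(2020, 10)]
```

With the first October full moon landing on the 1st, the next lunation fits inside the month, reproducing the Halloween blue moon described above.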
In light of the surprising frequency of blue moons, you
may be wondering how the common refrain “once in a blue
moon” came to indicate something rarely occurring; however,
the refrain actually predates our modern usage. It was
used to indicate something that was as absurd as the Moon
turning blue. Over time, the phrase came to mean something
similar to never or very rarely [2].
But the moon can actually appear bluish. While the light
from the moon is reflected from the Sun and, therefore,
virtually identical to daylight, atmospheric conditions can
change the appearance of the moon [3]. For instance, after
the eruption of the volcano Krakatoa in 1883, volcanic ash
in the air was the perfect size to scatter the redder colors of
the moonlight, giving the moon a bluer tint [1].
So whether you were handing out candy or trick-or-treating
yourself this year, that not-so-blue blue moon was
definitely worth looking at.
6
SkyShot Autumn 2020
The Great Conjunction
of 2020
The full moon on October 31, 2020, Hungary. Source: USA
Today.
Citations
[1] Dunbar, Brian. “Blue Moon.” NASA, NASA, 7 July 2004,
www.nasa.gov/vision/universe/watchtheskies/07jul_bluemoon.html.
[2] Hiscock, Philip. “Blue Moons - Origins and History of the
Phrase.” Sky & Telescope, 20 Apr. 2020, skyandtelescope.org/
observing/celestial-objects-to-watch/once-in-a-blue-moon/.
[3] O’Meara, Stephen James. “The Blue Moons.” Astronomy.com,
12 Sept. 2016, astronomy.com/magazine/stephen-omeara/2016/09/the-blue-moons.
[4] Sky Events Calendar by Fred Espenak and Sumit Dutta
(NASA’s GSFC)
A Moonless Meteor
Shower
Every year, around mid-December, stargazers are treated
to the Geminid Meteor Shower. This meteor shower is
thought to be caused by the Earth crossing a swath of small
particles left over from comet 3200 Phaethon. The particles
entering the atmosphere comprise the wonderful meteor
shower that we witness each year [2].
This year, however, offers a special experience, because the
maximum of the 2020 Geminid Meteor Shower falls on December
14th, which nearly coincides with the new moon.
This means that there will be no moon in the night sky to
interfere with seeing the fainter meteors. During the maximum,
60 to 120 meteors may be seen each hour. As a general
rule, the higher the two bright stars, Castor and Pollux of
Gemini, are in the sky, the more meteors will be seen. This
is because these stars in Gemini are near the radiant of the
Geminid Meteor Shower, which is the point in the sky that
appears to be the origin of the meteors [2].
This year’s moonless Geminid Meteor Shower has the potential
to be a spectacular way to spend a December night. So
get outside and look up!
Citations
[1] Pasachoff, Jay M. Stars and Planets. Houghton Mifflin,
2000.
[2] McClure, Bruce, and Deborah Byrd. “This Weekend’s Geminid Meteor Shower.” EarthSky, 2019, earthsky.org/space/everything-you-need-to-know-geminid-meteor-shower.
[3] Sky Events Calendar by Fred Espenak and Sumit Dutta (NASA’s GSFC)
All summer long, the two largest planets of our solar system,
Jupiter and Saturn, have shared the evening sky. But
on December 21st, the two planets will reach conjunction,
meaning they will share the same right ascension. While
the actual time of conjunction will occur at 13:22 UTC, the
planets will appear very close all night and can be observed
getting closer over the course of several days leading up to
their conjunction. But at their closest, they will be separated
by only 6 arcminutes, roughly a fifth the apparent diameter
of a full moon.
Conjunctions of Jupiter and Saturn are not rare, per se:
they occur approximately every 20 years due to the difference
in the orbital periods of Saturn and Jupiter. Saturn,
being farther from the Sun, takes about 30 years to orbit,
while Jupiter takes about 12 years. This means that Jupiter
travels about 30 degrees in the sky every year, while Saturn
travels only 12 degrees [2]. This difference in rates means
every year Jupiter gains about 18 degrees on Saturn. Consequently,
it takes approximately 20 years for Jupiter to “lap”
Saturn. Thus, there are approximately 20 years in between
conjunctions of Saturn and Jupiter. However, this year’s
conjunction is truly special: Jupiter and Saturn have not appeared
so close in the sky since 1623 and will not again until
2080.
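The lap arithmetic above reduces to the standard synodic-period formula: the interval between conjunctions is 1/(1/P_Jupiter − 1/P_Saturn). A quick sketch, first with the rounded 12- and 30-year periods from the text, then with more precise sidereal periods:

```python
def conjunction_interval(p_inner: float, p_outer: float) -> float:
    """Synodic (lap) period of two orbiting bodies, in the same units as
    the inputs: the time for the faster body to gain a full 360 degrees."""
    return 1.0 / (1.0 / p_inner - 1.0 / p_outer)

# Rounded periods from the text: Jupiter ~12 yr, Saturn ~30 yr.
print(conjunction_interval(12, 30))  # exactly 20.0 years with these round numbers

# More precise sidereal periods give roughly 19.9 years.
print(round(conjunction_interval(11.862, 29.457), 2))
```

Either way, a great conjunction recurs about every two decades, matching the 20-year spacing described above.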
With the naked eye, Jupiter and Saturn may appear to be
one point of light, but with just a small pair of binoculars
they can be resolved into two small disks with Jupiter being
the brighter of the two. With a telescope, many more
features can be seen, including cloud bands on Jupiter, the
rings of Saturn, Jupiter’s Galilean moons (Io, Europa, Callisto,
and Ganymede), and Saturn’s largest moon, Titan [1].
The location of Jupiter and Saturn low in the southwestern
sky may make viewing difficult, especially if there are
trees or other obstructions in that region. Additionally, the
planets will only be visible for a short time after sunset [1].
In spite of these challenges, this year’s great conjunction is
an historic one and the meeting of these two majestic planets
will be a sight worth seeking out.
Citations
[1] Money, Paul. “Upcoming Conjunctions in the Night
Sky, and How to See Them.” Skyatnightmagazine, 26 Aug.
2020, www.skyatnightmagazine.com/advice/skills/conjunctions-in-night-sky-how-see/.
[2] Etz, Donald V. “Conjunction of Jupiter and Saturn.”
Journal of the Royal Astronomical Society of Canada, vol.
94, Aug. 2000, pp. 174–178.
[3] Pasachoff, Jay M. Stars and Planets. Houghton Mifflin,
2000.
Comet NEOWISE
Victoria Lu
July of 2020 was a notable month for stargazers as Comet
NEOWISE appeared in night skies. The comet was the brightest
to appear in the Northern Hemisphere in decades and was
visible to the naked eye.
Comet NEOWISE was discovered on March 27, 2020 by
astronomers using NASA’s Wide-field Infrared Survey Explorer
(WISE) telescope. WISE launched in 2009 and utilizes infrared
wavelengths to search for galaxies, cold stars, and near-earth
objects such as asteroids and comets. The telescope’s infrared
channels detected the heat signatures of the comet. From its
signature, scientists determined that the comet was about 5 kilometers
across and covered with sooty particles left over from
the birth of the solar system [2].
Comets are balls of frozen gas, dust, and rock that orbit the
sun [4]. When a comet such as NEOWISE nears the sun, the
increased heat forms a coma around the nucleus. The coma—
which can be considered an atmosphere composed of particles
and gases—is blown to form a tail. Comet NEOWISE has
been observed to have both an ion tail and a dust tail. A dust
tail forms when dust within the nucleus is forced out by solar
radiation pressure. An ion tail, on the other hand, is birthed
when ultraviolet radiation forces electrons to eject from the
coma. The particles ionize and form a plasma, which interacts with the high-speed solar wind to form a tail. Such tails can stretch for
millions of miles [3].
NEOWISE was not considered a “great comet” (an exceptionally
brilliant comet with a long tail), but it dazzled viewers
nonetheless. Its brightness enabled viewers to observe it with
the naked eye in dark locations, or with the assistance of binoculars
and telescopes [1]. Comet NEOWISE’s proximity to the
Big Dipper made it easier for inexperienced viewers to spot.
NEOWISE was difficult to see in areas with high light pollution
but was still visible after some persistence.
On July 22, the comet reached perigee, passing as close to
the Earth as it would come at a distance of 103.5 million km.
Comet NEOWISE had been fading since it reached perihelion
(the closest point to the sun) on July 3rd, but its approach to
Earth made the dimming less noticeable. After reaching perigee,
however, the comet steadily dimmed as it grew further
from Earth. On July 4th the comet had a magnitude of +1.6, but
in subsequent weeks it dimmed, becoming an 8th-magnitude object by the latter part of August. Magnitude is a measure
of brightness, with higher numbers signifying dimmer objects
[5].
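The magnitude scale mentioned above is logarithmic: a difference of 5 magnitudes is defined as a factor of 100 in brightness. Applying that definition to the comet's fade from +1.6 to 8th magnitude:

```python
def brightness_ratio(m_bright: float, m_faint: float) -> float:
    """Flux ratio implied by a magnitude difference: 5 mag = 100x brightness."""
    return 100 ** ((m_faint - m_bright) / 5)

# NEOWISE faded from about +1.6 (early July) to about magnitude 8 (August):
print(round(brightness_ratio(1.6, 8.0)))  # roughly a 360-fold drop in brightness
```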
Once gone, this comet will not appear for another 6,800
years. One thing is certain—Comet NEOWISE put on a dazzling
display for viewers worldwide before bidding goodbye.
Citations
[1] Furfaro, E. (2020, July 14). How to See Comet
NEOWISE. Retrieved August 25, 2020, from https://
www.nasa.gov/feature/how-to-see-comet-neowise
[2] Hartono, N. (2020, July 08). Comet NEOWISE
Sizzles as It Slides by the Sun. Retrieved August 25,
2020, from https://www.nasa.gov/feature/jpl/cometneowise-sizzles-as-it-slides-by-the-sun-providing-atreat-for-observers
[3] Isaacs-Thomas, I. (2020, July 28). How to spot
Comet NEOWISE before it disappears for thousands
of years. Retrieved August 25, 2020, from https://
www.pbs.org/newshour/science/how-to-spot-comet-neowise-before-it-disappears-for-thousands-ofyears
[4] NASA. (n.d.). Comets. Retrieved August 25,
2020, from https://solarsystem.nasa.gov/asteroids-comets-and-meteors/comets/overview/?page=0
[5] Rao, J. (2020, July 24). The curtain is about to
come down on Comet NEOWISE. Retrieved August
25, 2020, from https://www.space.com/comet-neowise-is-dimming.html
Comet NEOWISE, as observed in mid-July
from Bozeman, Montana.
© Owen Mitchell
Comet NEOWISE (Zoomed), as observed on July 19 in Orlando, Florida.
© Anavi Uppal
Comet NEOWISE (wide), as observed on July 19 in Orlando, Florida.
© Anavi Uppal
The History and Potentially
Imminent Explosion of
Betelgeuse
Alexandra Masegian
The changing surface of fading Betelgeuse. Source: NASA, 2020.
Sitting high in the left shoulder of the Orion constellation,
the massive star known as Betelgeuse is
nearing the end of its life. As one of the largest and
most luminous stars in the sky, Betelgeuse has been
identified as an M1-2 type red supergiant star [1].
When it dies, it will undergo a catastrophic process
known as a supernova, flinging its outer layers into
space in a massive explosion and leaving behind a
core so dense that it will likely become a black hole.
The only question is when that supernova will occur
— a question that, over the course of the past
year, astrophysicists thought was on the verge of
being answered.
Betelgeuse is known to be a semi-regular variable
star, meaning that its brightness fluctuates periodically
on multiple timescales. Because it is so bright,
these fluctuations are often visible to the naked eye
and were noticed even by the earliest of astronomers.
In fact, the first recorded instance of Betelgeuse’s
variability dates back to 1836. The event was described by Sir John Herschel in his 1849 Outlines of Astronomy, in which he wrote, “The variations of Alpha
Orionis, which were most striking and unequivocal
in the years 1836-1840, within the years since
elapsed became much less conspicuous” [2]. By
piecing together observations like Herschel’s and
more modern data, astrophysicists have deduced
that Betelgeuse primarily pulsates on a timescale of approximately 425 days, with secondary periods of 100-180
days and 2,153 days (5.9 years) [3]. This means that
slight fluctuations in the star’s brightness are both
commonplace and expected — within the confines
of the expected pattern, of course.
Over the course of the past year, however, the
massive star’s brightness has been fluctuating in
ways that defy this usual pattern. In October of
2019, Betelgeuse began to dim at a point in its cycle
where it normally would have been bright. Though
the change was not noticeable at first, by December
of 2019 the supergiant had lost over two-thirds
of its usual brilliance, enough to reduce it from
one of the top ten brightest stars in the sky to the
twenty-first. [3] The star’s dimming was the most
severe since precise monitoring of its atmosphere
began nearly a century ago, and it lasted much longer
than would be normal for the star’s typical cycle.
(If the dimming was just a product of two of the
star’s cycles overlapping at their minimum points,
for instance, it would have only lasted a few weeks
rather than several months.) Astronomers around
the world began to take notice. Was Betelgeuse
on the verge of supernova?
Little is known about how massive
stars behave in the years leading up to
their explosive deaths. Though historical
evidence indicates that humanity has
witnessed supernovae before, the last
such event to occur in our galaxy was in
1604, long before the advent of modern
telescopes and observing technology. [4]
Even though we have never been able to
observe a star during its final moments
before the supernova, astrophysicists
have made predictions as to what some
of the early warning signs could be. One
of those possible signs is what astrophysicist
Sarafina Nance calls “insane and violent
mass loss.” [5] In theory, a dying
star will shoot a large portion of its mass
into space right before its death, which
could cause the star to appear dimmer
as clouds of ejected dust block its light
from reaching the Earth. Betelgeuse’s
sudden and significant dimming, therefore,
seemed to be a sign that the red supergiant
was in the throes of death.
As astronomers scrambled to develop
theories for what could be causing the
star’s abrupt change, Harvard-Smithsonian
astrophysicist Andrea Dupree and
her team turned to the Hubble Space
Telescope, which they started using to
monitor Betelgeuse in early January of
2019. They were able to isolate several
months of ultraviolet-light spectroscopic
observations of the star in the time
leading up to its dimming, and analysis
of the data revealed signs of dense, hot
material moving through the star’s atmosphere
in the months of September,
October, and November 2019. [6] Traveling
at nearly 200,000 miles per hour, the
material continued beyond Betelgeuse’s
visible surface and out into space at
around the same time that the star underwent
its most significant dimming.
Dupree theorizes that once the material
had separated from the incredibly hot
stellar surface, it was able to cool enough
to form a large gas cloud, which would
have blocked a large portion of Betelgeuse’s
light and made it appear much
dimmer to us here on Earth. [6] Meanwhile,
the star’s normal pulsational cycle
continued as usual, the behavior of
the star’s outer atmosphere returning to
normal even as the dust lingered.
It is not yet understood what caused
the stellar outburst, though Dupree and
her colleague Klaus Strassmeier of the
Leibniz Institute for Astrophysics in
Potsdam think that it may have been a
result of the star expanding in its pulsation
cycle at the same time that a pocket
of material underneath the star’s surface
was experiencing an upwelling. The
surge of force accompanying Betelgeuse’s
expansion could have propelled the hot
plasma in the convection cell outward,
providing it with enough momentum
to escape the star’s atmosphere. Though
the star has since stabilized and seems to
have returned to its normal pulsational
cycle and brightness, the mass ejection
that Dupree observed has been found to
contain nearly twice the amount of material
that is typical of a Betelgeuse outburst.
[6] The question of whether or not
it is on the brink of supernova remains
an open one.
When Betelgeuse eventually does
reach the end of its life, the resulting explosion
will be bright enough to be visible
during the day and cast shadows at
night here on Earth. Though our planet
is far enough away from the massive
star to avoid the majority of the radiation
that the supernova will produce, the
afterimage of the supergiant’s death will
linger in the sky for months, serving as a
stark reminder of the vastness and beauty
of our universe. The star’s odd behavior
this past year might not be the signal
we’re looking for to indicate Betelgeuse’s
imminent death, but it certainly is a sign
that the star is growing more unstable
— and, whenever it does finally explode,
its death will undoubtedly be one of the
most exciting astrophysical events of the
millennium.
References
[1] Keenan, Philip C.; McNeil, Raymond
C. (1989). “The Perkins catalog of
revised MK types for the cooler stars”.
Astrophysical Journal Supplement Series.
71:245. DOI: 10.1086/191373.
[2] Davis, Kate. Alpha Orionis (Betelgeuse).
American Association of Variable
Star Observers. <https://www.aavso.
org/vsots_alphaori>
[3] King, Bob. (2019 December 21). Betelgeuse
is Dimming… Why? Sky & Telescope.
<https://skyandtelescope.org/observing/fainting-betelgeuse/>
[4] Vink J. (2016) “Supernova 1604,
Kepler’s Supernova, and Its Remnant.”
In: Alsabti A., Murdin P. (eds) Handbook
of Supernovae. Springer, Cham.
DOI:10.1007/978-3-319-20794-0_49-1
[5] Drake, Nadia. (2019 December 26).
A giant star is acting strange, and astronomers
are buzzing. National Geographic.
<https://www.nationalgeographic.
com/science/2019/12/betelgeuse-is-acting-strange-astronomers-are-buzzing-about-supernova/>
[6] Dupree, Andrea et al. (2020). “Spatially
Resolved Ultraviolet Spectroscopy
of the Great Dimming of Betelgeuse.”
The Astrophysical Journal. 899:1. DOI:
10.3847/1538-4357/aba516
Photometric and Spectroscopic
Analysis of the Type II-P Supernova
SN 2019hyk
Sofia Fausone
Healdsburg High School
Timothy Francis Hein
Los Altos High School
Anavi Uppal
Windermere Preparatory School
Zihang Wang
Webb School of California
Dated: August 2, 2019
Abstract
Over a period of four weeks, we performed spectroscopy and BVRI color photometry on SN 2019hyk, a
supernova located in galaxy IC 4397. Using the telescopes at Yale University’s Leitner Family Observatory
and on iTelescope.net, we took images throughout July 2019 to generate a light curve of the supernova.
After evaluating the shape of the light curve and the emission spectrum, we conclude that SN 2019hyk is a
type II-P supernova.
The discovery, observation, and analysis of supernovae
provide us with valuable insight into the processes
and physics behind these events. Supernovae are largely
responsible for distributing heavy metals throughout
space, and are used as standard candles for measuring
cosmological distances. Investigations into supernovae
not only deepen our understanding of stellar physics, but
also reveal information about the broader structure of
the universe.
SN 2019hyk is located in the galaxy IC 4397, a type Sbc galaxy with an apparent magnitude of 13.2 [2]. Discovered
in 1889 in the Bootes constellation, the galaxy is about
203 million light years away from Earth [3].
We used both spectroscopy and photometry to
study the supernova. The former provides information
on emission and absorption lines, and the latter provides
instrumental magnitudes. After color-correcting instrumental
magnitudes to apparent magnitudes, we created
a light curve which provides information on the supernova’s
classification when compared to models of known
supernovae.
The current system for the classification of supernovae was established by Fritz Zwicky and Walter Baade in the 1940s, and it is beginning to show its age. The discovery of superluminous supernovae (SLSNe) and their awkward subdivisions under Zwicky and Baade’s system suggest that it is outdated.
The different classifications of supernovae do not have
clear relations with each other. In order to remake the
current supernova classification system, it is imperative
that we study more supernovae [5]. To this end, our team
decided to study SN 2019hyk.
Figure 1: SN 2019hyk
II. Methodology
A. Photometry and Observations
The photometric analysis of SN 2019hyk consists of three
steps: telescopic observing, raw image processing, and color calibration via least-squares fitting.
We took images with the Leitner Family Observatory’s
16-inch Ritchey-Chretien Telescope, as well as remote telescopes
from iTelescope.net. We used a STL1001E CCD camera
with the 16 inch telescope for all our data except on our
last day of observing, when we used the STT1603ME CCD
camera. To slew the telescope, calibrate, focus, and take images,
we used the SkyX program.
Arcturus was our pointing calibration star throughout the
observations, and we used Tpoint pointing models, with at
least two stars, each session. After checking the position of
Arcturus and making a new pointing model if needed, we
slewed to the supernova. Here, we checked that the star field
was correct and the telescope focused. If we needed to adjust
the focus, we used the RCOS TCC control app to move the
secondary mirror. We took series of images using Johnson V
and Johnson R filters with 1x1 binning and exposure times
of 120 seconds each. We also used remote telescope T21 in
New Mexico to take images of SN 2019hyk in Johnson V and
Johnson R filters with 2x2 binning, for exposure times of 120
seconds.
We processed our images using the software MaxIm DL.
We first flat-fielded our images, and then aligned our images
using auto star matching with bicubic resampling enabled.
We then median combined the images taken in the same filter
and adjusted their light histograms in order to make SN
2019hyk appear with the greatest resolution and contrast.
Using software from Astrometry.net, we plate-solved our
combined V and R filter images to acquire sky coordinates.
Considering that the supernova is visibly distinct from
its galaxy, and little light from the galaxy interferes with the
light captured by our supernova apertures, we elected not to
perform galaxy subtraction. Doing so would vertically shift
our light curve, likely by a marginal amount, and not affect
the shape or size of the curve.
B. Photometric Analysis
We programmed an automated pipeline that extracts
useful information from the combined images to calculate
the standard magnitude of SN 2019hyk. The program first
measures the flux of the supernova by summing up the
pixel values passing through a 14-pixel circular aperture,
and subtracting the average background noise collected by
a 20-pixel annulus. The flux of the supernova is related to
its instrumental magnitude through:
where m is the instrumental magnitude, and b is the
flux. Using this relationship, we can calculate the instrumental
magnitude of the supernova in each combined
image. However, to acquire the supernova’s standard magnitude,
we must also perform a color calibration on the instrumental
magnitudes. After a cross-examination of the
supernova’s magnitudes in both V filter and R filter, we can
correct for the linear shift in magnitudes caused by the
CCD’s color biases.
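A minimal sketch of the aperture measurement described above, assuming the image is a plain NumPy array (the paper does not reproduce the pipeline's code, so the function and variable names here are illustrative): sum the pixels inside a 14-pixel circular aperture, subtract the mean background estimated from a surrounding annulus out to 20 pixels, and convert the net flux b to an instrumental magnitude via m = -2.5 log10(b).

```python
import numpy as np

def instrumental_magnitude(image, x, y, r_aperture=14, r_annulus=20):
    """Sum pixel values inside a circular aperture, subtract the mean
    background estimated in a surrounding annulus, and convert the net
    flux to an instrumental magnitude m = -2.5 * log10(flux)."""
    yy, xx = np.indices(image.shape)
    dist = np.hypot(xx - x, yy - y)
    in_aperture = dist <= r_aperture
    in_annulus = (dist > r_aperture) & (dist <= r_annulus)
    background = image[in_annulus].mean()           # per-pixel sky level
    flux = (image[in_aperture] - background).sum()  # background-subtracted counts
    return -2.5 * np.log10(flux)

# Synthetic check: a flat sky of 10 counts plus a 1000-count point source.
frame = np.full((101, 101), 10.0)
frame[50, 50] += 1000.0
print(instrumental_magnitude(frame, 50, 50))  # -2.5 * log10(1000) = -7.5
```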
To perform a linear transform on our instrumental magnitudes,
we need the instrumental magnitudes of the supernova
in both V and R filters. After comparing them to
the standard magnitudes, we perform two least-squares
fittings to acquire the transformation coefficients

V - R = Tvr(v - r) + Cvr,
V - v = Tv(V - R) + Cv,

where v and r are instrumental magnitudes, V and R are
standard magnitudes, and Tvr, Cvr, Tv, and Cv are the coefficients
we are interested in. The pipeline was fed with
the coordinates, as well as V and R standard magnitudes of
30 calibration stars from APASS (The AAVSO Photometric
All-Sky Survey), as shown in Figure 3.
These calibration stars provide a general correlation
between color and instrumental magnitude. The pipeline
locates these calibration stars using the WCS files of
solve-fielded images and then measures their instrumental
magnitudes with the same aperture and annulus. It then
uses least-squares fitting to calculate the values of transformation
coefficients by making a V - R vs v - r plot and a
V - v vs V - R plot. The color calibration coefficients enabled
us to derive the standard magnitude of SN 2019hyk and
generate a light curve, upon which we could determine the
supernova’s type through a model fitting.
Figure 2: Observation Schedule
SkyShot Autumn 2020
III. Results
The instrumental magnitudes v and r, color-calibrated
apparent magnitudes V and R, and errors of V are included
in the table below.
Figure 4: Instrumental and Apparent Magnitudes
Figure 3: 30 APASS Calibration Stars in the
Vicinity of SN 2019hyk
C. Spectroscopy
We took spectra of the supernova using a DSS7 spectrometer
on the 20-inch reflector at the Hotchkiss School
to perform spectroscopy. To minimize background noise
and light pollution, we subtracted the sky image from the
supernova image. We then fit a calibration spectrum of
Arcturus to an archive spectrum with a linear fit to derive
the corresponding wavelengths. Finally, we transformed
the spectrum of the supernova using the redshift of its host
galaxy. Using the formula

λ_rest = λ_observed / (1 + z),

where z is the redshift of the host galaxy, this derives a plot
of our spectrum that indicates the supernova's composition.

Figure 5 is the color-calibrated light curve for SN
2019hyk in the V band. Six apparent magnitudes are plotted
against days since peak magnitude, which occurred on
June 27, 2019, according to our model. With a steady plateau,
the light curve clearly matches the model of a Type
II-P supernova despite the relatively large uncertainties in
the second and last measurements. The most recent observations
indicate that the supernova will continue to
dim in the following month.
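The rest-frame wavelength transformation used in the spectroscopy step can be sketched as follows (an illustrative helper, not code from the pipeline):

```python
def to_rest_frame(wavelengths_obs, z):
    """Shift observed wavelengths to the host galaxy's rest frame:
    lambda_rest = lambda_obs / (1 + z)."""
    return [w / (1.0 + z) for w in wavelengths_obs]
```

For example, an Hα line observed at 656.28 nm × (1 + z) is mapped back to 656.28 nm in the rest frame.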
D. Error Propagation
We created covariance matrices for each filter’s linear fit
process to calculate the uncertainty of our measurements of
SN 2019hyk’s V and R magnitudes. We then diagonalized
each matrix and took their square roots. From this process,
we obtained two values for each linear fit: the slope uncertainty
and the offset uncertainty. We then plugged these uncertainties
into our linear fit line equations in order to find
the maximum and minimum V and R magnitudes possible
with those uncertainties. These minimum and maximum
magnitudes formed the endpoints of our error bars on our
SN 2019hyk error curve. As the error involved in measuring
the flux of SN 2019hyk and our calibration stars was negligible,
we decided not to include it in our uncertainty calculations.
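This covariance-based uncertainty estimate can be sketched with NumPy's polyfit and its cov=True option (illustrative; the actual pipeline's implementation may differ in detail):

```python
import numpy as np

def fit_with_uncertainty(x, y):
    """Least-squares line fit returning (slope, offset) and their
    1-sigma uncertainties, taken from the square roots of the
    diagonal of the covariance matrix."""
    coeffs, cov = np.polyfit(x, y, 1, cov=True)
    sigma = np.sqrt(np.diag(cov))  # [slope uncertainty, offset uncertainty]
    return coeffs, sigma

def magnitude_bounds(x0, coeffs, sigma):
    """Evaluate the fitted line at x0 for every +/- combination of the
    coefficient uncertainties; return the min and max magnitudes,
    which form the endpoints of the error bars."""
    candidates = [
        (coeffs[0] + s1 * sigma[0]) * x0 + (coeffs[1] + s2 * sigma[1])
        for s1 in (-1, 1) for s2 in (-1, 1)
    ]
    return min(candidates), max(candidates)
```

The returned bounds always bracket the nominal fitted value, since they are evaluated at the extreme coefficient combinations.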
Figure 5: V Band Light Curve with Type II-P Supernova Model
Figure 6 shows both the V and R light curve of SN
2019hyk. The R magnitudes are generally brighter than the
V magnitudes, with the exception of the second and last
measurements. As with most supernovae, the R magnitude
changes follow the trend of the V magnitude.
Figure 7 is a spectrum of SN 2019hyk taken by the 20-
inch reflector at the Hotchkiss School. Due to the limited
aperture size, exposure time, and the supernova’s decreas-
ing brightness, there is a lot of noise in the spectrum, even after
sky subtraction. However, it can still be clearly seen that the spectrum
peaks at 656nm, the wavelength of the Hα emission line.
This spectral identity agrees with the photometric measurements
and further confirms SN 2019hyk to be a Type II supernova.
IV. Conclusions
Our research on SN 2019hyk contributes to a more
complete understanding of supernovae and their extragalactic
significance. More specifically, we can use the spectrum
and light curve to identify aspects of the origin and
composition of SN 2019hyk.
SN 2019hyk is a type II-P supernova. This is evident
from both the Hα emission line in its spectrum and the
prolonged high-luminosity 'plateau' following its peak.
SN 2019hyk is the result of the collapse of a giant
star roughly eight to fifty times as massive as our Sun.
Such a star would have passed the silicon burning process
and released abundant heavy elements as it went supernova,
creating the cradle for newborn stars and planets.
Figure 6: R Band Light Curve with Type II-P Supernova
Going forward, we would like to continue taking V and
R images of SN 2019hyk using remote telescopes to prolong
our light curve. If possible, we would also like to take
B images of SN 2019hyk in order to obtain more accurate
magnitude information. Later on, we will move on to other
newly-discovered supernovae and study their spectra
and light curves. Our data, along with the supernova data
from observers worldwide, will help astronomers to gain
a better understanding of supernova physics and the star
forming process.
Acknowledgements
Thank you to the following people for assisting us with
our research on SN 2019hyk: Dr. Michael Faison, Michael
Warrener, Ava Polzin, Trustin Henderson, and Imad Pasha.
Figure 7: Spectrum of SN 2019hyk with Hα Emission Line

Several of our images were taken on days with poor seeing
and light cloud cover, which increased the noise in our images.
The images taken on July 14th are poorly focused. Additionally,
our images from July 27th were taken with the STT1603ME CCD
camera instead of our usual STL1001E CCD camera. For extraneous
reasons, we could not take flats for these images. This further
increased the noise present in our images and our uncertainties.

References
[1] Stanek, K. Z., 2019, Transient Name Server Discovery
Report 2019-1053
[2] Seligman, 2019, Index Catalog Objects: IC 4350 -
4399
[3] Virtual Telescope, 2019, Supernova SN 2019hyk in
the Spiral Galaxy IC-4397
[4] Weizmann, 2019, SN 2019hyk Transient Name Server
[5] Stevenson, D. S., 2014, Extreme Explosions: Supernovae,
Hypernovae, Magnetars, and Other Unusual Cosmic
Blasts, Springer, New York
On Methods of Discovering Exoplanets

Vighnesh Nagpal

Artist concept of the TRAPPIST-1 planetary system. Source: NASA, 2018
We are currently in the midst of a golden age of planet
discovery, a burst that has come along quite rapidly. Even
as little as twenty-five years ago, the question of whether
there existed planets orbiting other stars like our Sun
remained open. However, in 1995, a team led by Michel
Mayor and Didier Queloz at the University of Geneva
achieved the first definitive detection of an exoplanet
in orbit around a main-sequence star: 51 Pegasi b. In
the years since, exoplanet science has advanced by leaps and
bounds. Through the efforts of scientists worldwide and
large-scale endeavours such as the Kepler Space Telescope,
we now know of over 4000 confirmed exoplanets
in more than 3000 unique stellar systems—and we’ve
barely scratched the surface. But how on Earth do we
find these planets?
Today, many methods for discovering exoplanets
exist, of which two have experienced the most success:
transits and radial velocities. The first of these, the transit
method, works by monitoring the brightness of stars
over time. A planet passing between us and a star will
block a small fraction of its star's light, which manifests
as a dip in the star's brightness as we observe it. Observing
such dips in a star’s lightcurve is a telltale piece
of evidence for the presence of planets orbiting it. The
transit method is most sensitive to short period planets
with large radii, since we are more likely to both catch
such planets in a transit and detect the resultant dip in
the star’s lightcurve. Despite this method’s limitations
with regards to detecting longer period planets, missions
such as the Kepler Space Telescope have phenomenally
exploited this technique. Kepler alone managed to find
2662 exoplanets, more than half of all those currently
known!
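The depth of such a dip is roughly the ratio of the planet's projected area to the star's, (Rp/Rs)^2, which makes the method's bias toward large planets easy to see. A quick illustrative calculation (the radius values in the comment are approximate):

```python
def transit_depth(r_planet, r_star):
    """Fraction of starlight blocked during a transit: (Rp / Rs)^2."""
    return (r_planet / r_star) ** 2

# A Jupiter-size planet (~0.10 solar radii) dims a Sun-like star by ~1%,
# while an Earth-size planet (~0.0092 solar radii) dims it by under 0.01%.
```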
The second of these techniques is called the radial velocity
method. Whenever two bodies form a gravitationally
bound system, each settles into orbit around their common
centre of mass (COM). In most planetary systems,
the host star’s mass far outweighs that of any planets that
may orbit it, causing the COM’s location to lie very close
to the star. In our case, the COM of the Earth-Sun system
lies within the Sun’s outer layers! As a consequence, our
orbital motion around the COM is much more apparent
than the Sun’s, which is perhaps best described as a periodic
wobble.
While such wobbles are invisible to the naked eye and
usually too small to cause a perceptible difference in a
star’s position on an image, we can detect them by examining
a star’s spectrum. Applying the Doppler Effect to
spectral lines (Figure 1) makes it possible to measure the
star’s radial velocity, which is the component of a star’s
velocity pointing directly along our line of sight. Periodic
trends in a star’s radial velocity (RV) can indicate the presence
of an additional object—whether that be a planet
or stellar companion—that causes the observed motion.
These trends allow us to detect planets or faint stellar
companions through their gravitational influence. It is
then possible to use the data contained within the radial
velocity trend to investigate the nature of the companion’s
orbit—a procedure known as orbit fitting. Orbit fitting
using radial velocities can provide useful information on
a companion’s orbital period, eccentricity and even mass.
Figure 1: Doppler shift observable due to the motion of orbiting bodies.

However, since RVs only tell us about the component of
the star's motion aligned with our line of sight, orbit fitting
using RVs alone suffers from a degeneracy between planet
mass (M) and orbital inclination (i), the angle by which the
companion’s orbit is tilted relative to our line of sight. Thus,
it is impossible to determine planet masses using radial velocities
alone. What we can compute is the product of the planet
mass and the sine of its orbital inclination, M sin(i). Much like
the transit method, the radial velocity method works best
for massive, short period planets as these planets generate
a higher amplitude radial velocity signal. Additionally, short
periods make it possible to precisely fit the planet’s orbit and
then subsequently confirm model predictions using future
measurements. However, unlike transits, radial velocities
maintain their usefulness even when applied to searches for
planets located further out from their home star, since radial
velocity measurements taken over a long baseline make it
possible to study longer term trends. Still, extracting information
about the nature of planets with orbital periods of
a few decades (like Uranus and Neptune, the outermost giant
planets of the solar system) remains difficult. It is in this regime
that a technique that may initially seem the most obvious of
them all can help: directly imaging the exoplanet.
Image 1
When considering techniques that could be used to discover
exoplanets, a natural thought is simply
imaging the system in question to look for
planets directly. In practice, however, this sort of direct imaging
remains highly difficult. Stars are usually much brighter
than their planets, a property which results in them simply
drowning out the light we receive from any planets that
may lie in orbit. However, the last few years have seen great
progress in direct imaging endeavors, an exciting result of
which is the image of HR 8799 (Image 1). In the image, the
black circle in the centre is the result of a technique known
as coronagraphy, which blocks the light from the central
star, allowing the light from the four planets in orbit to be
more amenable to detection. The planet visible closest in is
thought to have an orbital period of around 45 years, while the
planet furthest out orbits once every 460 years. These are
the sort of long period planets that transits and radial velocities
struggle with. Direct imaging, on the other hand,
thrives for exactly these kinds of planets!
When we image stars using telescopes, the way their light
spreads out on the detector is determined by the telescope's
point spread function (PSF). The brightness of the PSF
drops off with distance from the centre of the light distribution.
Thus, in trying to image planets widely separated
from their host stars, we have to contend with less of the
star’s contaminating light. This makes long period planets
most amenable to direct imaging. The ability of direct imaging
to look for long period planets complements the effectiveness
of transits and radial velocities at characterising
shorter period planets! Furthermore, direct imaging can be
used synergistically with radial velocities to determine the
actual masses of planets rather than just M sin(i)! The additional
information provided by jointly using radial velocities
and direct imaging can help us pin down the orbits
and physical properties of long period planets to a much
greater precision. While direct imaging is still in its relative
infancy as a technique, the rapidly advancing nature of the
field as well as the planned capabilities of next generation
missions, such as the Nancy Grace Roman Space Telescope, paint
a bright picture for the future of exoplanet science. By
making exoplanets at wide separations amenable for study
and characterisation, direct imaging when used together
with other techniques such as radial velocities, will allow
us to gain a better picture of the population of planets that
exist in our galaxy.
Citations
[1] [@NobelPrize]. (2019, October 8). The method used
by research groups to find a planet is called the radial velocity
method; it measures the movement of the host
star as it is affected by the gravity of its planet. Retrieved
from Twitter at https://twitter.com/NobelPrize/status/1181550417588768768/photo/1.
[3] Kaufman, M. (2017, January 26). A four-planet system
in orbit, directly imaged and remarkable. NASA Exoplanet
Exploration. https://exoplanets.nasa.gov/news/1404/a-four-planet-system-in-orbit-directly-imaged-and-remarkable/.
Astrophotography on the
Cheap: A Guide
Cameron Woo
If you’re like me, you started in normal DSLR photography,
and were drawn into astrophotography by seeing
beautiful landscape shots of the Milky Way and wanted
to do it on your own. Well, you probably can't do those
because light pollution is everywhere (especially on the
east coast of the US). But that doesn’t mean you can’t get
beautiful astrophotography images. So here’s a guide for
astrophotography with just a camera, lens, and tripod.
Equipment
People like to say that it’s not the tool, but how you
use it. But that’s only true to an extent. For astrophotography,
you need to collect a lot of light on your sensor
(because these objects are dim). So you will need a camera
that allows you to shoot long exposures with manual
control. If you’re going from normal photography
to astrophotography, that likely isn’t an issue. It means
using a DSLR or mirrorless camera. (Or a dedicated astronomy
camera, but if you’re already using that then
you probably don’t need this guide).
You will also need a tripod. This is pretty non-negotiable
because you need a stable surface to mount your
camera onto. Any shakes (even the slightest vibrations
from pressing the shutter button) WILL make your image
shaky and give you streaking stars.
You can use whatever lens you want. For beginners,
typically wider and faster is better. Faster, meaning a
lower f-stop or wider aperture. This allows the camera
to collect more light in less time. A wider lens also allows
you to take longer exposures with less noticeable
star trailing. There are online calculators, but I use the
500 rule to estimate an exposure length, then adjust
based on my images. The 500 rule states that the maximum
exposure time, in seconds, is equal to 500 divided by the
lens's focal length in millimeters.
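The 500 rule is simple enough to write as a one-line function. The crop-factor parameter is an addition of ours for non-full-frame sensors (a common refinement, not part of the rule as stated above):

```python
def max_exposure_seconds(focal_length_mm, crop_factor=1.0):
    """500 rule: longest exposure (seconds) before stars visibly trail,
    for a lens of the given focal length. On crop-sensor cameras the
    effective focal length is scaled by the crop factor."""
    return 500.0 / (focal_length_mm * crop_factor)
```

For example, a 20 mm lens on a full-frame camera allows about 25 seconds, while a 50 mm lens on a 1.5x crop sensor allows only about 6.7 seconds.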
For “cheap” this equipment may run you around $300
for a camera, $100 for a tripod, and $300 for a lens, if you
buy all new. So the used/refurbished market is a good
place. However, many people attempting this likely
already have a camera, lens, and tripod. If you're
starting fresh, you should probably get a star tracker,
but that's a different tutorial.
Imaging
The key to getting good images is to take many photos
of the same object and stack them in a program like
Deep Sky Stacker (in Windows) or SiriL (in MacOS).
There are four types of images (aka frames) that you
should take: lights, darks, flats, and bias/offset frames.
Note that all these images should be taken in a RAW file
format with noise reduction turned off. RAW allows
us to get every piece of data that the camera collects,
unmodified. Noise reduction is achieved through collecting
lots of data and calibration frames, so the camera’s
internal noise reduction is unnecessary and possibly
harmful to your final image.
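What those stacking programs do can be sketched in miniature with NumPy. This is a simplified, illustrative model of calibration and stacking; real tools like Deep Sky Stacker and SiriL also align frames and reject outliers.

```python
import numpy as np

def calibrate_and_stack(lights, darks, flats, biases):
    """Miniature version of what stacking software does: build master
    calibration frames, calibrate each light frame, then
    median-combine the calibrated lights."""
    master_dark = np.median(darks, axis=0)    # thermal + fixed-pattern noise
    master_bias = np.median(biases, axis=0)   # sensor read-out pattern
    master_flat = np.median(flats, axis=0) - master_bias
    master_flat /= master_flat.mean()         # normalise flat to mean 1.0
    calibrated = [(light - master_dark) / master_flat for light in lights]
    return np.median(calibrated, axis=0)
```

On synthetic frames with a known scene, vignette, dark level, and bias level, this recovers the scene exactly, which is the point of taking all four frame types.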
Where to Shoot/Bortle Scale
Although this guide is meant for people living in
heavily light polluted skies, that doesn’t mean you can’t
use similar methods if you are lucky enough to drive to
darker skies. The most common way to describe how
dark your sky is uses the Bortle scale. The scale
ranges from 1 to 9, with 9 being the most light-polluted
city sky and 1 being a truly dark sky, far, far away from light
pollution. There are many light pollution maps that can
give you an idea for your Bortle scale, such as darksitefinder.com.
These can also help you find any close areas
that may be slightly better. Apps like Clear Outside can
give you an actual number, but it can sometimes be inaccurate.
It tells me my town is a Bortle 6, but the town
right over is a Bortle 8 - and a mile difference isn’t going
to drop your class by 2 levels, so be wary. The best way
to assess your conditions is to go outside and look up! In
the end, it won’t matter if the app says you live in Bortle
3 skies if your town just installed fancy, new, bright, blue
LED street lamps in your cul de sac.
Lights

These are actual images of the object. For these, you
use as wide an aperture as you can, a slow shutter speed,
and a high ISO. There are, of course, downsides to these decisions.

A wide aperture can lead to distorted, T-shaped stars in
the corners of the image. These can be avoided by stopping
down the lens, though you sacrifice how much light you
collect. It's up to you to play around and find the balance.
If you need light, you can always crop the corners away.

A slow shutter speed may introduce star trails, which we
absolutely don't want. So you must find a balance, using
the 500 rule.

High ISO creates a very noisy image. Preferably, we'd use
a lower ISO, but since we aren't using a star tracker we
need a high ISO. Once again, find a balance between light
and noise, but know that much noise will be removed in processing.

The more lights you have, the more images you have to
play around with and stack. So take as many lights as possible.
In Bortle 8 skies, I like to take at least 70.

Darks

These are images, but with the lens cap on. These must
be shot at the same ISO and shutter speed as your lights,
and when the sensor is at the same temperature. This means
they should be taken in between lights. You only need about
20 dark frames, though it never hurts to take more. They
must be taken at every imaging session. These dark frames
create a base noise pattern that will be removed from your
stacked image.

Flats

Flat frames are flat white pictures that help remove lens
distortion like vignetting. They must be taken with the
same lens and aperture settings as your lights. One popular
method is the "white T-shirt method", where you take a white
T-shirt, stretch it over the lens, and point it at the sky at
dusk or dawn. We want to make the frame as evenly lit as
possible, so the sky is a nice, large, diffused light source.
Take about 10-20 flats. These frames are most easily taken
in aperture priority mode. This way you know you're
collecting enough light.

Bias

These are similar to dark frames and are meant to remove
the base noise pattern present inherently in the sensor.
These frames should be taken at the fastest shutter speed
possible, and at the same ISO as your lights. Aperture doesn't
matter. Take 10-20 bias frames.

And that's all the frames you need. Image stacking and
processing is a whole other tutorial, so I'm not going to
mention that here.

When and What to Image

Living in light pollution is a real bummer, and may
discourage you from shooting (especially when you take a
photo and see a bright grey sky show up), but we can still
make the most of the night sky. Apps like Clear Outside
and websites like Clear Dark Sky will let you see detailed
cloud cover and seeing/transparency conditions, which can
drastically change the quality of your shots. Shooting high
at the zenith (straight up) will let you shoot through the
least amount of light pollution and atmosphere, so pick
targets high in the sky. Also, choose large, bright targets -
the Orion Nebula is a great example. It's bright and easy
to find in the sword of Orion. And remember that a full
moon will produce a TON of light pollution, so schedule
your shooting around the new moon. Also, use planetarium
software like Stellarium to plan your shots and explore new
targets.

Good luck and clear skies!

An example of a Stellarium view.
Tulip Nebula (Sh2-101)
© Ryan Caputo
The Horsehead Nebula (Barnard 33)
© Wilson Zheng
The Orion Nebula (Messier 42)
© Cameron Woo
Milky Way Galaxy over Etscorn Campus Observatory at New Mexico Tech
© Owen Mitchell
The Moon
© Nathan Sunbury
The Pleiades (Messier 45)
© Cameron Woo
The Sunflower Galaxy (Messier 63)
© Wilson Zheng
The Ring Nebula (Messier 57)
© Nathan Sunbury
The Orion Nebula (Messier 42) and Running Man Nebula (Sh2-279)
© Jonah Rolfness
Milky Way Galaxy, as seen from Kaanapali in Maui
© Cameron Woo
The Pinwheel Galaxy (Messier 101)
© Jonah Rolfness
Sadr Region
© Jonah Rolfness
The Summer Triangle Asterism
(Deneb-Cygnus, Vega-Lyra, Altair-Aquila)
© Cameron Woo
The Hercules Cluster (Abell 2151)
© Ryan Caputo
Star Trails Over Pierson College
© Anavi Uppal
Messier 3 (near Bootes)
© Wilson Zheng
The Heart Nebula (IC 1805) - Fish Head Nebula (IC 1795) and Melotte 15 Mosaic
© Jonah Rolfness
Removing Noise and Improving
the Quality of Astronomical
Observations with Fourier
Transforms
Ezgi Zeren
Dated: 15 September 2019
Abstract
In this research, the quality of astronomical images was improved by eliminating the imperfections possibly
caused by sky conditions or disturbances to a telescope. Using Fourier series and transforms, the images
were processed in MATLAB to remove clouds, blur, smear, and vignetting. After Fourier-transforming the
original images, a filter was multiplied with the Fourier transform of the images. Then, the inverse
Fourier transform process was performed to obtain the filtered images. A high-pass sharp cut-off filter was
used to emphasize the edges of astronomical images and to get rid of blur. In order to remove clouds, smear,
and vignetting, a high-pass Gaussian filter was applied to the images. The resultant filtered images
suggested an improvement in the image quality and displayed more distinguishable celestial objects.
PACS numbers: 02.30.Nw, 06.30.−k, 07.90.+c, 95.85.−e
Keywords: Astronomy, Fourier transforms, high-pass Gaussian filter, high-pass sharp cut-off filter, MATLAB
I. Introduction
High-quality astronomical images are essential to the
discovery of our universe. Astronomers try to make their
images as clear and accurate as possible by using lossless
file formats and processing the images. For more than
four decades, astronomers have been using the Flexible
Image Transport System (FITS) to interchange as much
data as possible with lossless compression. FITS was so
successful that scientists outside astronomy started using it
for digitizing manuscripts and medical imaging [1]. After taking images
with a professional telescope, astronomers use image
processing to improve the quality of their images and the
accuracy of their measurements.
Even though image reduction is a well-developed technique
to increase image quality, there are obstacles that
sometimes prevent astronomers from obtaining the images
they need, such as sky conditions and other disturbances
from the environment. It may not be possible to
change the sky conditions on a given night or prevent
vibrations coming from the floor, but images taken
under these circumstances can be improved to the extent that
they are usable in procedures that need precise
measurements such as the orbit determination of celestial
objects.
The data used in this research includes images taken
with a 20-inch CCD reflecting telescope in the Sommers-Bausch
Observatory at the University of Colorado
Boulder as well as other images from astronomers around
the world. The images included light cloud coverage, blur,
smear, and vignetting. The impurities in the images were
eliminated by applying a high-pass Gaussian filter or a
high-pass sharp cut-off filter created in MATLAB, using
Fourier series and transforms. Since the impurities were
light, they produced a low-frequency noise, which made
it possible to eliminate them without damaging the necessary
data.
II. Data Acquisition
A. Observations and Image Reduction
The image of Saturn, FIG. 9, was obtained using the
PlaneWave Telescopes in the Sommers-Bausch Observatory
at the University of Colorado Boulder [2]. The
observatory was located at a longitude of 105.2630 W, a
of 40.0037 N, and an altitude of 1653 meters. The two
20-inch (508 mm) telescopes, Artemis and Apollo, were CDK20
Corrected Dall-Kirkham Astrograph carbon-fiber truss telescopes
and had a focal length of 3454 mm and a focal ratio of
f/6.8. Without any off-axis coma, astigmatism, and field curvature,
the CDK20 telescopes had a 52 mm field of view. They
also used CCD software for imaging.
A Bahtinov mask [3], a device used to achieve a high level
of focusing accuracy, was attached to focus the
telescope during observations, and the telescope was slewed to
a star of magnitude 2 to 4. After focusing as precisely as possible,
the Bahtinov Mask was removed, and the telescope was
slewed to Saturn. Then, a test image was taken to ensure that
there was no issue with focusing and that the telescope was slewed
to the correct right ascension and declination. In order to do
image reduction, three types of images were taken with the
telescopes: light frames, flat frames, and dark frames [4].
Dark frames were images that contained no light, and
they were used to eliminate the effect of having different
signal readings from the camera sensors. FIG. 3 shows an
example of a dark frame. The exposure time for the images
was 70 seconds in all of the observations, since it was
enough to have a high-quality image.
Figure 3: An example dark frame.
The image reduction was done with AstroImageJ, an
open source image processing software. The dark frames
were subtracted from the light frames and the subtracted
image was divided by a normalized version of the flat
frames. This process made the reduced images of Saturn
almost free of all imperfections that are caused by the telescopes
and the imaging CCD arrays. Thus, the only problem
was the blur caused by the resolution of the telescopes.
Figure 1: An example light frame.
Light frames were the frames that contained the image of the
celestial bodies in the sky. If it is not possible to perform image
reduction, astronomers only use the light frames as their data,
because all the necessary data about the celestial objects is
stored in the light frames. Other frames are only used to further
improve the quality of the images by eliminating possible imperfections
caused by the equipment. FIG. 1 shows an example
of a light frame, an image of stars in the sky.
Flat frames were frames that contained only an image of a
white wall; they were used to eliminate optical imperfections
such as the effect of having dust on the sensors. FIG. 2
shows an example of a flat frame.
Figure 2: An example flat frame.
B. Data from Other Astronomers
The image of the Eagle Nebula, the Pillars of Creation,
FIG. 4, was taken by Professor Peter Coppinger [5], Associate
Professor of Biology at Rose-Hulman Institute of Technology,
who is an amateur astrophotographer.
The image with the light cloud coverage, FIG. 14, was
taken by Rob Pettengil [6] with the Sony RX100V camera
at ISO 400, with a focal ratio of f/2 and a focal length of 24
mm. The exposure time was 2 minutes.
The image with smearing, FIG. 24, was taken by Mike
Dodd, Louise Kurylo, Michelle Dodd and Miranda Dodd at
the Hidden Creek Observatory [7] with an Astro-Tech
AT-130EDT refractor, which has an aperture of 130 mm (5.12"),
a focal length of 910 mm, and a focal ratio of f/7.
The image with vignetting, FIG. 19, was taken by Jerry
Lodriguss [8] with a 12.5 inch (317.5 mm) Dobsonian Telescope
that has a focal ratio of f/6 and a focal length of 75
inches (1905 mm).
III. Methods
A. The Fourier Transform [9]
Every function can be expressed as a waveform that is
composed of sines and cosines. Since humans became very
successful at horology [10], the study of time measurement,
scientists have been trying to use frequencies in their
experiments for better accuracy. The Fourier Transform is
a useful tool that describes a given waveform as the sum of
sinusoids of different frequencies, amplitudes, and phases.
If a function Φ(p) has a Fourier Transform F(x), then Φ(p) is
also the Fourier Transform of F(x). This "Fourier pair," Φ(p)
and F(x), is defined as

F(x) = ∫ Φ(p) e^(2πipx) dp,   Φ(p) = ∫ F(x) e^(-2πipx) dx,

with each integral taken over the entire real line.
B. The Fast Fourier Transform Algorithm (FFT) and
the Discrete Fourier Transform (DFT) [11]

As shown in section III. A., the Fourier transformation
process involves integral calculus, which can cause a problem
when the integrand is an experimental data set instead of an
analytical function. The Fast Fourier Transform algorithm
(FFT) was discovered to deal with complex integrands like
these. This algorithm reads the data into an array and returns
the transformed data points.

The Discrete Fourier Transform (DFT) is the mutual transformation
of a pair of sets, [an] and [Am], such that each contains
N elements. The formulae for DFT are

A_m = Σ_{n=0..N-1} a_n e^(-2πimn/N),   (3)

a_n = (1/N) Σ_{m=0..N-1} A_m e^(2πimn/N).   (4)

If W = e^(-2πi/N), the formulae for DFT can be set out as a
matrix operation, with the transform written as the product of
the N × N matrix [W^(mn)] and the column vector [a_n].
The matrix operation requires N² multiplications. However,
the fast Fourier transform (FFT) reduces the number of
multiplications necessary from N² to about (N/2)log2(N).

C. The Two-Dimensional DFT

In this research, images will be Fourier transformed.
Thus, a two-dimensional Discrete Fourier Transform is
done with the FFT algorithm. Instead of the formulae for
the one-dimensional case, Eq. (3) and Eq. (4), the two-dimensional
DFT has the formulae

G[j,k] = Σ_{m=0..M-1} Σ_{n=0..N-1} g[m,n] e^(-2πi(jm/M + kn/N)),   (5)

g[m,n] = (1/(MN)) Σ_{j=0..M-1} Σ_{k=0..N-1} G[j,k] e^(2πi(jm/M + kn/N)),   (6)

where m and n are the rows and columns of g[m,n], and
j and k are the rows and columns of G[j,k]. The Fourier
Transform matrix G[j,k] is obtained after Fourier
transforming the data matrix g[m,n]. Then, a filter can
be applied to G[j,k], and the filtered g[m,n] matrix can be
found by the inverse DFT synthesis process, where the input
matrix is the filtered G[j,k] as in Eq. (6).

D. Using MATLAB for Fourier Transforming FITS,
png, and jpg Images

As mentioned in section I, FITS images were specifically
tailored for Astronomy, and they have a special but expanding
usage amongst the scientific community. Therefore,
the image processing functions in the Image Processing
Toolbox of MATLAB are not capable of processing FITS
files. However, MATLAB has a special function to read
FITS files, called fitsread() [12]. After reading the FITS files
with fitsread(), the imagesc() function was used to show
the image because imagesc() displays the image with
scaled data. Then, the colormap was set to gray rather
than the default color, blue, for aesthetic purposes.

For png and jpg images, the imread() function was used to
read the files. Then, the images were converted to grayscale
with the rgb2gray() function. For displaying the gray image,
the imshow() function was used.
MATLAB has its own two-dimensional Fast Fourier Transform function, fft2(). This function was used along with fftshift(), which places the low frequencies in the center of the image. Since complex numbers are difficult to visualize, the magnitudes of the matrix elements were displayed instead. Also, concentrating the sum of the values in the center might have caused the other magnitudes to be dwarfed in comparison, which was prevented by taking the logarithm of the values. The logarithm was taken in the form log(1 + matrix element value) to avoid negative results when a matrix element value is less than 1. The subroutine fftshow.m, written by Alasdair McAndrew of Victoria University of Technology, was used to display the two-dimensional DFT; the subroutine can be found in the appendix. The command impixelinfo displays pixel information, such as the pixel value, which is helpful for checking that nothing has gone wrong with the Fourier transform.
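The display pipeline just described (two-dimensional FFT, shift of the zero frequency to the center, log-scaled magnitudes) can be sketched in NumPy; the original work used MATLAB's fft2() and fftshift(), and the function name below is illustrative.

```python
import numpy as np

def log_magnitude_spectrum(img):
    """Two-dimensional FFT, with the low frequencies shifted to
    the center and magnitudes log-scaled as log(1 + |F|), which
    avoids negative values when a magnitude is below 1."""
    F = np.fft.fftshift(np.fft.fft2(img))
    return np.log1p(np.abs(F))

# A flat image has all of its energy at zero frequency, which
# fftshift moves to the center of the spectrum.
img = np.ones((8, 8))
spec = log_magnitude_spectrum(img)
```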
E. Edge Detection of the Images of Nebulae Using MATLAB
The identification of a nebula against the sky can be challenging when the image contains a significant amount of noise, usually caused by sky conditions such as air pollution. There is noise in the image of the Eagle Nebula, the Pillars of Creation, FIG. 4, which makes it difficult to see the nebula's original shape. The image also contains slight vignetting, which causes it to lose detail in the corners and edges of the picture.
Figure 5: Fourier transform of the Eagle Nebula.
The high-pass sharp cut-off filter has a pixel value of 0 in the center and a pixel value of 1 everywhere else, in order to block the low frequencies and pass only the high frequencies when multiplied with the Fourier transform. The filter was created in the shape of a circle in order to block the desired low frequencies in two dimensions. Since the filter is multiplied with the Fourier transform, the size of the meshgrid was set to match the original image, 541x542. The equation of a circle was used to create the filter, and its radius was set to 0.2 pixels because, after trying several radii, 0.2 was determined to work well for edge detection. The code for creating the filter is below.
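A sketch of such a circular sharp cut-off high-pass filter in NumPy (the original code was written in MATLAB; the function name is illustrative, and the demonstration radius of 20 pixels is the one later used for the Saturn image):

```python
import numpy as np

def sharp_highpass(shape, radius):
    """Sharp cut-off high-pass filter: pixel value 0 inside a
    circle of the given radius around the spectrum center,
    pixel value 1 everywhere else."""
    rows, cols = shape
    y, x = np.meshgrid(np.arange(rows) - rows / 2,
                       np.arange(cols) - cols / 2,
                       indexing="ij")
    return np.where(x**2 + y**2 <= radius**2, 0.0, 1.0)

# Meshgrid sized to match the original image, as in the text.
H = sharp_highpass((541, 542), radius=20)
```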
Since the radius of the filter was only 0.2 pixels, the filter appears square rather than circular, and it is hard to see in the filtered Fourier transform, FIG. 7; however, 0.2 pixels was enough for edge detection.
Figure 4: Original image of the Eagle Nebula.
After performing the Fourier transform with the code in section III. D., FIG. 5 was obtained. The Fourier transform shows the low frequencies in the center and the high frequencies away from the center. Since the noise has a low frequency, a high-pass sharp cut-off filter was used for edge detection.
Figure 6: The high-pass sharp cut-off filter for the
nebula.
The filter and the Fourier transform were multiplied in order to eliminate the low frequencies and emphasize the high frequencies. The filtered Fourier transform, FIG. 7, was set to grayscale, since this made it easier to compare the filtered Fourier transform with the original one. After the application of the filter, the edges of the Eagle Nebula became clearer to the observer, meaning that the high-pass sharp cut-off filter successfully performed edge detection on the image of the nebula.
F. Edge Detection of the Rings of Saturn Using MATLAB

Even after processing astronomical images with image reduction and storing them in the highest-quality format, FITS, the images might not be clear enough. There is always a limit to the resolution that a given telescope can reach. The image of Saturn, FIG. 9, was reduced with the method described in section II. A. However, the rings of Saturn were not clear enough because of the limits of resolution of the telescope [13].
Figure 7: The filtered Fourier transform of the nebula.
Finally, the inverse Fourier transform was applied to the filtered Fourier transform, FIG. 7, using MATLAB's ifft2() function. The ifft2() function performs the two-dimensional inverse Fourier transformation, which is necessary to finalize the filtering of the images used in this research.
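Putting the steps together, the filter-and-invert pipeline can be sketched in NumPy (the original used MATLAB's fft2() and ifft2(); the function name here is illustrative):

```python
import numpy as np

def apply_frequency_filter(img, H):
    """FFT -> shift -> multiply by filter -> unshift -> inverse
    FFT. The tiny imaginary residue left by rounding errors is
    dropped with np.real()."""
    F = np.fft.fftshift(np.fft.fft2(img))
    G = F * H
    return np.real(np.fft.ifft2(np.fft.ifftshift(G)))

# An all-ones (all-pass) filter returns the image unchanged.
img = np.arange(16.0).reshape(4, 4)
out = apply_frequency_filter(img, np.ones((4, 4)))
```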
Figure 9: The original image of Saturn.
Since the image of Saturn was stored in a FITS file, the code in section III. D. for FITS images was used, and the Fourier transform was performed in the same way as for the nebula in section III. E.
Figure 8 displays the image of the Eagle Nebula after the
high-pass edge detection filter was applied.
Figure 8: The filtered image of the nebula.
Figure 10: The Fourier transform of Saturn.
Since the determination of the rings of Saturn requires edge detection, a high-pass sharp cut-off filter was used for the image of Saturn as well. However, the original image size and the radius of the filter were much larger.
Figure 13: The filtered image of Saturn.
G. Eliminating Light Cloud Coverage in Astronomical Images Using MATLAB
Figure 11: The high-pass sharp cut-off filter for Saturn.
The high-pass filter was multiplied with the Fourier transform to produce the filtered Fourier transform, in the same way as described in section III. E. Since the radius of the filter was 20 pixels, its circular shape was more distinct this time compared with the high-pass filter used for the Eagle Nebula.
Since the light cloud coverage and the blur caused by sky conditions in astronomical images have a form similar to that of a Gaussian function, they can be eliminated with a Gaussian high-pass filter [14]. FIG. 14 displays light cloud coverage which makes it hard to distinguish the stars in the sky.
Figure 14: Original image of the light cloud coverage.
The image was read and Fourier-transformed with the
code displayed in section III.D.
Figure 12: The filtered Fourier transform of Saturn.
The Fourier-transformed image of Saturn after the high-pass edge detection filter is applied is displayed in FIG. 13. After the application of the sharp cut-off filter to the image of Saturn, the rings became more recognizable. The rings got darker while the background got brighter, making the edges of the rings very clear.
Figure 15: The Fourier transform of the cloud coverage.
The Gaussian function, Eq. (7), creates a Gaussian filter with standard deviation σ. The standard deviation determines the strength of the filter: as the value gets higher, the filter becomes more spread out, and as it gets lower, the filter becomes more concentrated. In order to create the high-pass Gaussian filter, the discretization [15] of the Gaussian function was subtracted from 1, Eq. (8), making the pixel values in the center equal to 0. The MATLAB code for Eq. (8) is provided below.
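A NumPy sketch of the discretized high-pass Gaussian filter, assuming the standard form exp(-(x² + y²)/(2σ²)) for Eq. (7) and 1 minus that Gaussian for Eq. (8) (the original code was written in MATLAB):

```python
import numpy as np

def gaussian_highpass(shape, sigma):
    """High-pass Gaussian filter: 1 minus a discretized Gaussian
    centered on the spectrum, so the center (zero-frequency)
    pixel is 0 and distant pixels approach 1."""
    rows, cols = shape
    y, x = np.meshgrid(np.arange(rows) - rows // 2,
                       np.arange(cols) - cols // 2,
                       indexing="ij")
    g = np.exp(-(x**2 + y**2) / (2.0 * sigma**2))  # Eq. (7)
    return 1.0 - g                                 # Eq. (8)

# sigma controls the strength: a larger sigma blocks a wider
# band of low frequencies around the center.
H = gaussian_highpass((101, 101), sigma=8)
```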
Figure 17: The filtered Fourier transform of the cloud
coverage.
Figure 18 displays the image of the light cloud coverage after the high-pass Gaussian filter was applied. The inverse Fourier transformation process was performed as described in section III. E.
Figure 18: The filtered image with cloud coverage.
Figure 16: The high-pass Gaussian filter for
cloud coverage.
Figure 16 shows a zoomed-in version of the high-pass Gaussian filter. The smooth edges of the Gaussian filter seen in FIG. 16 are very useful for eliminating noise. In addition, using a Gaussian filter prevents the ringing effect, a wavy appearance near the edges of images due to the loss of high-frequency information. For images with cloud coverage, avoiding the ringing effect is crucial, since it might otherwise ruin the images.
As can be seen from the filtered image, FIG. 18, the clouds are no longer visible to the observer. This suggests that the cloud removal with the high-pass Gaussian filter was successful. Moreover, after the removal of the clouds, the stars became more apparent and their shapes more circular, since the noise around the stars was also eliminated.
H. Getting Rid of Vignetting in Astronomical Images Using MATLAB

Vignetting is caused by the design of the sensor and the lens of a camera; therefore, it is present to some degree in every image. The light coming towards the center of a camera sensor strikes at a right angle, but the angle of incidence decreases further from the center of the sensor, which
causes vignetting. However, vignetting can be present in different amounts in different images, which makes its effects nearly invisible in some images.
The image of the sky, Figure 19, has a visible amount of vignetting in the corners and edges, and slight noise in the center, which ruins the clarity of the image. Since the noise in the center is light enough, it can be eliminated using a high-pass Gaussian filter. After the noise is eliminated, the sky will be more visible, and the uneven distribution of light due to vignetting will be removed.
Figure 19: The sky image with vignetting.
The image was Fourier transformed as described in section III. D. As can be seen from the Fourier transform of the sky image, the low frequencies in the center of the image have a small radius. Thus, a Gaussian filter with a standard deviation of 2.5 was used to eliminate the noise.
Figure 21: The high-pass Gaussian filter for vignetting.
Since the standard deviation of the Gaussian filter was small, the filter appears as a black point in the center of the filtered Fourier transform of the image with vignetting, FIG. 22. Even though it seems too small to have a significant effect on the image, the filter cancels out enough of the noise.
Figure 22: The filtered Fourier transform of the image
with vignetting.
Figure 20: The Fourier transform of the image with
vignetting.
Figure 23 displays the image of the sky with vignetting after the high-pass Gaussian filter was applied. The inverse Fourier transformation process was performed as described in section III. E.

The resultant filtered image no longer has an uneven brightness distribution, and the edges of the image are no longer darker than its center. Since the noise in the center of the image was eliminated and the effects of vignetting were canceled out, the application of the high-pass Gaussian filter was successful.
The image with smearing was Fourier transformed as described in section III. D. The Fourier transform of the image with smearing has a small radius of distinguishable low frequencies in the center, but since the smearing was harsher than slight noise, a Gaussian filter with a relatively higher standard deviation was used.
Figure 23: The filtered image with vignetting.
I. Eliminating Smearing in Astronomical Images Using MATLAB

Smearing in astronomical images is usually caused by factors independent of the telescope. If a telescope is disturbed during its exposure time, smearing will probably occur. This can be prevented by removing any disturbances near the telescope; however, vibrations from the floor can be an unavoidable cause of smearing. In situations like this, the images can be recovered by applying a high-pass Gaussian filter to treat the smear as noise around the actual celestial objects.

Figure 24 shows an image of stars with smearing. Instead of being circular, the stars look like ellipses. The image can be recovered with a high-pass Gaussian filter because the smeared parts of the stars have a lower frequency than the actual parts of the stars. If the smear were as light as the stars, then recovery with a filter might not be possible.
Figure 25: The Fourier transform of the sky image with
smearing.
The high-pass Gaussian filter used for the smearing,
FIG. 26, has a standard deviation of 8, which cancels out
a more distinct amount of noise from the original image.
Figure 26: The high-pass Gaussian filter for smearing.
Figure 24: The sky image with smearing.
As can be seen from the filtered Fourier transform of
the smear, Figure 27, the high-pass Gaussian filter used for
smearing is very visible in the center.
Figure 27: The filtered Fourier transform of the smeared
image.
FIG. 28 displays the image with smearing after the high-pass Gaussian filter was applied and the image was inverse Fourier transformed as described in section III. E.

Since the background noise from the sky clearly lessened, the high-pass Gaussian filter was successful at removing the noise around the smeared images of the stars. In addition, the shape of the stars became more circular compared with their initial smeared shape. Thus, the high-pass Gaussian filter improved the quality of the image.
Figure 28: The filtered sky image with smearing.
IV. Conclusion
In this research, astronomical images with blur, smearing, light cloud coverage, and vignetting were processed with MATLAB. Image processing was done with Fourier series and transforms by creating high-pass Gaussian and sharp cut-off filters. After Fourier transforming the images and shifting the zero-frequency components of the discrete Fourier transforms to the center, the transforms were multiplied with the filters and the noise in the center was eliminated.
In order to emphasize the edges of the Eagle Nebula and to make the rings of Saturn clearer, high-pass sharp cut-off filters with different radii were created. The radii of the filters were determined by trial and error.

High-pass Gaussian filters were used to cancel out the effects of vignetting, smearing, and blur. Since the noise created by these factors was similar to Gaussian noise, the high-pass Gaussian filter was successful at eliminating a portion of the noise, making the images clearer. A high-pass Gaussian filter was also used to cancel out the light cloud coverage in an image of the sky. The Gaussian filter was suitable for eliminating the clouds because the clouds had a low frequency and blurry edges. The high-pass Gaussian filters used for different images had different standard deviations, and the best standard deviation value for an image was again determined by trial and error. Since the Fourier transform of the image with light cloud coverage had more low frequencies in the center than the others, the highest standard deviation value was used to eliminate the light cloud coverage.
After multiplying the Fourier transforms of the images
with the filters created in MATLAB, the inverse Fourier
transform process was performed in order to obtain the
filtered image.
After the application of the high-pass sharp cut-off filter, the Eagle Nebula, the Pillars of Creation, originally shown in Figure 4, became more evident. As can be seen from the filtered image of the Eagle Nebula, Figure 8, the edges of the nebula are more emphasized, and the stars are more distinguishable from the background.
The rings of Saturn in the original image, Figure 9, were
blurry before being filtered. However, after the high-pass
sharp cut-off filter was applied, the rings appear in black
around Saturn. The rings are more recognizable in the filtered
image of Saturn, Figure 13.
The image with light cloud coverage, Figure 14, was
covered with noise, and the stars were difficult to see. After
using a high-pass Gaussian filter, the clouds, which
were light enough to be treated like a Gaussian noise, were
eliminated. As can be seen from the filtered image of the
sky, Figure 18, the stars are more distinguishable from the
background after the clouds are removed.
The sky image with vignetting, Figure 19, has uneven
brightness distribution due to vignetting. The effect of
vignetting was eliminated with a high-pass Gaussian filter.
The resultant filtered image, Figure 23, no longer has
uneven brightness distribution, and the stars are more
recognizable.
With the application of the high-pass Gaussian filter, the original sky image with smearing, Figure 24, looks less noisy. The stars in the filtered sky image with smearing, Figure 28, are more circular, and their edges are more distinct.
This research proposed a method for the removal of clouds, blur, smearing, and vignetting from astronomical images with Fourier series and transforms, using MATLAB. Different amounts of noise were eliminated from different images, since the success of the filtering process also depended on how much noise was present in the original image. When compared with the original images, the resultant filtered astronomical images suggest that the filters worked successfully to improve the quality of the astronomical data.
Acknowledgements
This research was guided by the Pioneer Academics Program.
Special thanks to Professor Arthur Western for always
being a source of help and advice.
Appendix: The fftshow() Subroutine

References

[1] Thomas, “Learning from FITS: Limitations in use in modern astronomical research,” Astronomy and Computing 12, 133-145 (2015)
[2] Sommers-Bausch Observatory (2018)
[3] R. Berry and J. Burnell, The Handbook of Astronomical Image Processing (Willmann-Bell, Richmond, VA, 2011)
[4] W. Romanishin, An Introduction to Astronomical Photometry Using CCDs (University of Oklahoma, 2006), pp. 79-80
[5] P. Coppinger, https://www.rose-hulman.edu/academics/faculty/coppinger-jpeter-coppinge.html
[6] R. Pettengill, http://astronomy.robpettengill.org/blog180114.html
[7] M. Dodd, http://astronomy.mdodd.com/flexure.html
[8] J. Lodriguss, http://www.astropix.com/html/jdigit/vignet.html
[9] R. N. Bracewell, The Fourier Transform and Its Applications, 3rd Ed. (McGraw-Hill, International Editions)
[10] “Measurement of Time,” Mathematics, https://www.encyclopedia.com/science-and-technology/mathematics/mathematics/measurement-time
[11] J. F. James, A Student’s Guide to Fourier Transforms: with Applications in Physics and Engineering (Cambridge University Press, Cambridge, 1995)
[12] A. Hayes and J. Bell, http://hosting.astro.cornell.edu/academics/courses/astro3310/Matlabimages.html
[13] S. M. Lea and L. A. Kellar, “An algorithm to smooth and find objects in astronomical images,” Astronomical Journal (1989), pp. 1238-1246
[14] R. A. Hummel, B. Kimia, and S. W. Zucker, “Deblurring Gaussian blur,” Computer Vision, Graphics, and Image Processing, 66-80 (1987)
[15] A. Torralba, http://6.869.csail.mit.edu/fa16/lecture/lecture3linearfilters.pdf
Computational Astrophysics
Advancements of 2020
Priti Rangnekar
Depiction of lower error for the D3M model compared to the earlier 2LPT model [2].
Traditionally, the words “astronomy” and “astrophysics” may conjure images of ancient star charts, telescopes staring into the night sky, or chalkboards filled with Einstein’s equations detailing special and general relativity. However, with the rise of ground- and space-based sky survey projects and citizen science endeavors involving contributions from amateur astronomers worldwide, the field of astronomy is becoming increasingly data-driven and computationally enhanced. Survey projects such as the Large Synoptic Survey Telescope bring data issues such as high volume (nearly 200 petabytes of data), large varieties of data, and rapid speeds of data production and transmission, requiring efficient analysis through statistical computing. [1] As we collect more information about the physical world and develop powerful software and hardware, we gain the ability to methodically find patterns and make large-scale predictions based on what we do know, allowing us to embrace the frontier of what has always been unknown.
In June 2019, researchers from institutions including Carnegie Mellon University and the Flatiron Institute announced the development of the first artificial intelligence simulation of the universe: the Deep Density Displacement Model (D3M). With the ability to complete a simulation in less than 30 milliseconds, the model proved to be both efficient and accurate, with relative errors of less than 10% when compared with both accurate-but-slow and fast-but-less-accurate models. Moreover, it could provide accurate values for certain physical quantities, such as the amount of dark matter, even when tested with parameters, such as gravitational conditions, that it was not originally trained on. This is just one example of how the power of computing techniques can allow us to better understand the universe and its past. [2]
In 2020, research groups from around the world have
further capitalized on artificial intelligence and supercomputing
to analyze specific aspects of the universe, including
exoplanets, galaxies, hypernovae, and neutron star
mergers.
Gaussian Process Classifiers for Exoplanet Validation
University of Warwick scientists Armstrong, Gamper,
and Damoulas recently capitalized on the power of machine
learning to develop a novel algorithm for confirming
the existence of exoplanets, which are planets that orbit
stars outside the Solar System. [3]
Traditionally, exoplanet surveys use large amounts of
telescope data and attempt to find evidence of an exoplanet
transit, or any sign of the planet passing between the telescope
and the star it is orbiting. This typically comes in the
form of a dip in the observed brightness of the target star,
which makes intuitive sense given that the planet would
be obstructing some light. Nevertheless, this analysis can
be prone to false positive errors, given that an observed dip
does not necessarily indicate the presence of an exoplanet;
it could also be caused by camera errors, background object
interference, or binary star systems.[3] In the case of
a binary star system, eclipsing binaries may result, in
which a star’s brightness would vary periodically as one
passes in front of the other, causing the observed dip.
Such a phenomenon would require extended analysis
of the target star’s flux lightcurve, which shows changes
in brightness. In the case of background object interference,
a background eclipsing binary or planet
may blend with the target star, requiring researchers
to observe any offset between the target star and the
transit signal. [4]
As a result, researchers use a planetary validation
process in order to provide the statistical probability
that a transit arose from a false positive, in which
a planet was not present. [5] A common algorithm
used for validating some of the approximately 4,000
known exoplanets has been the vespa algorithm and
open source code library. The procedure, detailed in a
paper by Morton in 2012, accounts for factors such as
features of the signal, target star, follow-up observations,
and assumptions regarding field stars. [6] However,
as Armstrong, Gamper, and Damoulas explain in
their abstract published in August 2020, a catalogue
of known exoplanets should not be dependent on
one method. [5] Previous machine learning strategies
have often generated rankings for potential candidates
based on their relative likelihoods of truly being planets;
however, these approaches have not provided exact
probabilities for any given candidate. For example, in 2017, Shallue and Vanderburg developed such a ranking model; 98.8% of the time, plausible planet signals in its test set were ranked higher than false positive signals. [7]
However, a probabilistic framework is a key component
of the planetary validation process. Thus, by employing
a Gaussian Process Classifier along with other
models, the University of Warwick researchers could
find the exact statistical probability that a specific exoplanet
candidate is a false positive, not merely a relative
ranking. In general, a Gaussian Process generates
a probabilistic prediction, which allows researchers to
incorporate prior knowledge, potentially find confidence
intervals and uncertainty values, and make decisions
about refitting. [8] If the probability of a candidate
being a false positive is less than 1%, it would
be considered a validated planet by their approach.
Trained using samples of confirmed planets and false positives from Kepler, the model was tested on unconfirmed Kepler candidates and confirmed 50 new planets with a wide range of sizes and orbital periods. [3]
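The resulting decision rule is simple to state; the sketch below is an illustration of the 1% threshold described in the text, not code from the paper:

```python
def validation_status(fp_probability, threshold=0.01):
    """Map a candidate's false-positive probability, as produced
    by a probabilistic classifier, to a decision: below the 1%
    threshold, the candidate counts as a validated planet."""
    if not 0.0 <= fp_probability <= 1.0:
        raise ValueError("probability must lie in [0, 1]")
    return "validated planet" if fp_probability < threshold else "candidate"

status = validation_status(0.003)
```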
Although the computational complexity of training the model is higher than that of traditional methods, and certain discrepancies with vespa were found, this approach demonstrates a clear potential for efficient automated techniques to be applied to the classification of future exoplanet candidates, becoming more accurate with each dataset thanks to machine learning. In fact, the researchers aim to apply the technique to data from the PLATO and TESS missions, the latter of which has already identified over 2,000 potential exoplanet candidates. [9]
Machine Learning and Deep Learning for Galaxy Identification and Classification
Another area of artificial intelligence growing in
popularity is image classification and object detection,
with common applications for autonomous vehicles
and medical imaging. A powerful technique in this field
is a convolutional neural network, a form of deep learning
roughly based on the functionalities and structure
of the human brain.

A depiction of an exoplanet transit lightcurve; the Gaussian Process Classifier prioritizes the ingress and egress regions, indicated by the two dotted lines, when classifying exoplanets [5].

An example of data augmentation for galaxy images using rotation and flipping [10].

Each layer of the network serves a unique purpose, such as convolution layers for generating feature maps from the image, pooling layers for extracting key features such as edges, dense layers for combining features, and dropout layers that prevent overfitting to the training set. [10]
This method was applied to galaxy
classification by researchers at the National
Astronomical Observatory of Japan
(NAOJ). The Subaru Telescope, an
8.2-meter optical-infrared telescope at
Maunakea, Hawaii, serves as a robust
source of data and images of galaxies
due to its wide coverage, high resolution,
and high sensitivity. [11] In fact, earlier
this year, astronomers used Subaru Telescope
data to train an algorithm to learn
theoretical galaxy colors and search for
specific spectroscopic signatures, or
light frequency combinations. The algorithm
was used to identify galaxies in the
early stage of formation from data containing
over 40 million objects. Through
this study, a relatively young galaxy HSC
J1631+4426, breaking the previous record
for lowest oxygen abundance, was discovered.
[12]
In addition, NAOJ researchers have
been able to detect nearly 560,000 galaxies
in the images and have had access
to big data from the Subaru/Hyper
Suprime-Cam (HSC) Survey, which
contains deeper band images and has
a higher spatial resolution than images
from the Sloan Digital Sky Survey. Using
a convolutional neural network (CNN)
with 14 layers, they could classify galaxies
as either non-spirals, Z-spirals, or
S-spirals. [10]
This application presents several important
takeaways for computational
astrophysics. The first is the augmentation
of data in the training set. Since
the number of non-spiral galaxies was
significantly greater than the number of
spiral galaxies, the researchers needed
more training set images for Z-spiral and
S-spiral galaxies. In order to achieve this
result without actively acquiring new
images from scratch, they flipped, rotated,
and rescaled the existing images with
Z-spiral and S-spiral galaxies, generating
a training set with roughly similar numbers
for all types of galaxies.
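The augmentation step described above, producing extra labeled samples by flipping and rotating existing images, can be sketched in NumPy (a generic illustration, not the NAOJ group's pipeline; rescaling is omitted):

```python
import numpy as np

def augment(img):
    """Create extra training samples from one image. Rotations
    preserve a spiral's winding direction, while a mirror flip
    reverses it (an S-spiral becomes a Z-spiral), so flipped
    copies would be relabeled accordingly."""
    rotated = [np.rot90(img, k) for k in (1, 2, 3)]
    flipped = [np.fliplr(img), np.flipud(img)]
    return rotated + flipped

# Five additional samples from a single image.
extra = augment(np.arange(9).reshape(3, 3))
```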
Second, it is also important to note
that the accuracy levels of AI models may
reduce when working with celestial bodies
or phenomena that are rare, due to a
reduction in the size of the training set.
The galaxy classification CNN originally
achieved an accuracy of 97.5%, identifying
spirals in over 76,000 galaxies in
a testing dataset. However, this value
decreased to only 90% when the model
was trained on a set with fewer than 100
images per galaxy type, demonstrating
the potential for concerns if more rare
galaxy types were to be used.
A final important takeaway regards the impact of misclassification and of differences between the training dataset and the testing dataset. When applying
the model to the testing set of galaxy images
to classify, the model found roughly
equal numbers of S-spirals and Z-spirals.
This contrasted with the training set, in
which S-spiral galaxies were more common.
Although this may appear concerning,
as one would expect the distribution
of galaxy types to remain consistent, the
training set may have not been representative,
likely due to human selection
and visual inspection bias. In addition,
the authors point out that the criterion
of what constitutes a clear spiral is ambiguous,
and that the training set images
were classified by human eye. As a result, while the training set only included images with unambiguous spirals, the validation set may have included more ambiguous cases, causing the model to classify them incorrectly.
Several strategies can be used to combat
such issues in scientific machine
learning research. In terms of datasets,
possible options include creating a new,
larger training sample or employing numerical
simulations to create mock images.
On the other hand, a completely
different machine learning approach -
unsupervised learning - could be used.
Unsupervised learning would not require
humans to visually classify the
training dataset, as the learning model
would identify patterns and create classes
on its own. [10]
In fact, researchers at the Computational
Astrophysics Research Group at
the University of Santa Cruz have taken
a very similar approach to the task of
galaxy classification, focusing on galaxy
morphologies, such as amorphous elliptical
or spheroidal. Their deep learning
framework, named Morpheus, takes in
image data by astronomers and uniquely
does pixel level classification for various
features of the image, allowing it to
discern unique objects within the same
image rather than merely classifying the
image as a whole (like the models used
by the NAOJ researchers). A notable benefit
of this approach is that Morpheus
can discover galaxies by itself and would
not require as much visual inspection or
human involvement, which can be fairly
high for traditional deep learning approaches
- the NAOJ researchers worked
with a dataset that required nearly
100,000 volunteers. [13] This is crucial, given that Morpheus could be used to analyze very large surveys, such as the Legacy Survey of Space and Time, which would capture over 800 panoramic images per night. [13]
Examples of a Hubble Space Telescope
Image and its classification results
using Morpheus [13].
Supercomputing for Analyzing Hypernovae and Neutron Star Mergers
Given the data-intensive nature of
this endeavor as well as the need for intensive
pixel-level classification, it is natural
to wonder how scientists are able to
run such algorithms and programs in
the first place. The answer often lies in
supercomputing, or high performance
computing (HPC). Supercomputers often involve interconnected nodes
that can communicate, use a technique
called parallel processing to solve multiple
computational problems via multiple
CPUs or GPUs, and can rapidly input and
output data. [14] This makes them prime
candidates for mathematical modeling
of complex systems, data mining and
analysis, and performing operations on
matrices and vectors, which are ubiquitous
when using computing to solve
problems in physics and astronomy. [15]
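The "parallel processing" idea can be sketched in a few lines of Python. Here a thread pool stands in for a supercomputer's worker nodes (a deliberate simplification; real HPC codes use MPI across many machines): a large matrix-vector product is split into row blocks, each block is computed by a separate worker, and the partial results are stitched back together.

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

# Hypothetical sizes, chosen only for illustration.
rng = np.random.default_rng(0)
A = rng.standard_normal((1000, 200))
x = rng.standard_normal(200)

def block_product(rows):
    start, stop = rows
    return A[start:stop] @ x  # each worker handles one row block

# Split the 1000 rows into four blocks and fan them out to workers.
blocks = [(i, i + 250) for i in range(0, 1000, 250)]
with ThreadPoolExecutor(max_workers=4) as pool:
    partials = list(pool.map(block_product, blocks))

y_parallel = np.concatenate(partials)
# The blockwise result matches the single-shot product exactly.
assert np.allclose(y_parallel, A @ x)
```

The same divide-and-conquer pattern, scaled up to thousands of nodes and much larger arrays, is what lets supercomputers chew through simulation and survey data.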
The robust nature of supercomputing
was recently seen, as researchers
from the Academia Sinica’s Institute of
Astronomy and Astrophysics used the
supercomputer at the NAOJ to simulate
a hypernova, which is potentially
100 times more energetic than a supernova,
resulting from the collapse of a
highly massive star. The program simulated
timescales nearly an order of magnitude
higher than earlier simulations,
requiring significantly higher amounts
of computational power while allowing
researchers to analyze the exploding star
300 days after the start of the explosion.
[16] However, the cost was worthwhile, as the longer timescale enabled assessment of the decay of nickel-56. This
element is created in large amounts by
pair-instability supernovae (in which no
neutron star or black hole is left behind)
and is responsible for the visible light
that enables us to observe supernovae.
Moreover, we cannot overstate the importance of simulations, as astronomers cannot rely on observations alone given the rarity of hypernovae in the real world. [17]
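The role of nickel-56 can be made concrete with a short worked example. Radioactive decay follows N(t) = N0 · 2^(-t/t_half); the half-lives below are standard, rounded nuclear data, and the 300-day mark is the simulation timescale mentioned above.

```python
# Half-lives (standard nuclear data, rounded):
T_NI56 = 6.1   # days, nickel-56 -> cobalt-56
T_CO56 = 77.2  # days, cobalt-56 -> iron-56

def remaining_fraction(t_days, t_half):
    """Fraction of a radioactive isotope left after t_days."""
    return 2.0 ** (-t_days / t_half)

# 300 days after the explosion, essentially no nickel-56 remains...
ni_left = remaining_fraction(300, T_NI56)
# ...while a few percent of the cobalt-56 it decayed into survives,
# and that ongoing decay is what keeps the ejecta glowing.
co_left = remaining_fraction(300, T_CO56)
print(f"Ni-56 remaining: {ni_left:.1e}, Co-56 remaining: {co_left:.3f}")
```

This is why reaching 300 simulated days matters: by then the light curve is powered almost entirely by the decay chain's second step.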
Supercomputers have also been used
for simulating collisions between two
neutron stars of significantly different
masses, revealing that electromagnetic
radiation can result in addition to gravitational
waves. [18] Once again, we can
see the usefulness of computational simulations
when real observations do not
suffice. In 2019, LIGO researchers detected
a neutron star merger with two unequal
masses but were unable to detect
any signal of electromagnetic radiation.
Now, with the simulated signature, astronomers
may be capable of detecting
paired signals that indicate unequal neutron
star mergers. In order to conduct
the simulations using the Bridges and
Comet platforms, researchers used nearly
500 computing cores and 100 times
as much memory as typical astrophysics
simulations due to the number of physical
quantities involved. [19] Given the tremendous demands on speed, flexibility, and memory, supercomputers prove to be an essential tool in modeling the intricacies of our multifaceted universe.
A 3-D visualization of a pair-instability
supernova, in which nickel-56 decays in
the orange area [17].
ATERUI II, the 1005-node Cray XC50
system for supercomputing at the Center
for Computational Astrophysics at
the NAOJ [16].
Conclusion
Undoubtedly, scientific discovery is at
the essence of humankind, as our curiosity
drives us to better understand and
adapt to the natural and physical world
we live in. In order to pursue scientific
discovery, we must have the necessary
tools, especially as the questions we ask
are becoming more complex and data is
becoming more ubiquitous. Outer space still holds countless unanswered questions, with profound implications for humankind. The overarching,
large-scale nature of the physical processes
that govern celestial bodies begs
for further research and analysis to learn
more about unknown parts of the universe.
Yet, we are now better equipped
than ever to tackle these questions. We can find trends in the seemingly unpredictable using logic, algorithms, and data through computer programs,
creating a toolbox of methods that can
revolutionize astronomy and astrophysics
research. Ultimately, as we strive to
construct a world view of how the universe
functions, we will be able to make
the most of large portions of data from
a variety of research institutions while
fostering collaboration and connected
efforts by citizens, scientists, and governments
worldwide.
Citations
[1] Zhang, Y., & Zhao, Y. (2015). Astronomy
in the Big Data Era. Data Science
Journal, 14(0), 11. doi:10.5334/dsj-2015-011
[2] Sumner, T. (2019, June 26). The first
AI universe sim is fast and accurate-and
its creators don’t know how it works.
Retrieved November 25, 2020, from
https://phys.org/news/2019-06-ai-universe-sim-fast-accurateand.html
[3] Armstrong, D. J., Gamper, J., & Damoulas,
T. (2020). Exoplanet Validation
with Machine Learning: 50 new validated
Kepler planets. Monthly Notices
of the Royal Astronomical Society.
doi:10.1093/mnras/staa2498
[4] S. T. Bryson, M. Abdul-Masih, N.
Batalha, C. Burke, D. Caldwell, K. Colon,
J. Coughlin, G. Esquerdo, M. Haas,
C. Henze, D. Huber, D. Latham, T. Morton,
G. Romine, J. Rowe, S. Thompson,
A. Wolfgang, 2015, The Kepler Certified False Positive Table, KSCI-19093-003
[5] Staff, S. (2020, August 25). 50
new planets confirmed in machine
learning first. Retrieved November
25, 2020, from https://phys.org/
news/2020-08-planets-machine.html
[6] Morton, T. D. (2012). An Efficient Automated Validation Procedure for Exoplanet Transit Candidates. The Astrophysical Journal, 761(1), 6. https://doi.org/10.1088/0004-637x/761/1/6
[7] Shallue, C. J., & Vanderburg, A.
(2018). Identifying Exoplanets with
Deep Learning: A Five-planet Resonant
Chain around Kepler-80 and
an Eighth Planet around Kepler-90.
The Astronomical Journal, 155(2), 94.
https://doi.org/10.3847/1538-3881/
aa9e09
[8] 1.7. Gaussian Processes — scikit-learn 0.23.2 documentation. (2020). Scikit-Learn.Org. https://scikit-learn.org/stable/modules/gaussian_process.html
[9] Yeung, J., & Center/NASA, D.
(2020, August 26). Artificial intelligence
identifies 50 new planets
from old NASA data. Retrieved November
25, 2020, from https://news.lee.net/news/science/artificial-intelligence-identifies-50-new-planets-from-old-nasa-data/article_556fdd68-e7ad-11ea-85cb-87deec2aa462.html
[10] Tadaki, K.-, Iye, M., Fukumoto,
H., Hayashi, M., Rusu, C. E., Shimakawa,
R., & Tosaki, T. (2020). Spin
parity of spiral galaxies II: a catalogue
of 80 k spiral galaxies using big data
from the Subaru Hyper Suprime-Cam
survey and deep learning. Monthly
Notices of the Royal Astronomical
Society, 496(4), 4276–4286. https://doi.org/10.1093/mnras/staa1880
[11] Overview of Subaru Telescope:
About the Subaru Telescope: Subaru
Telescope. (n.d.). Retrieved November
25, 2020, from https://subarutelescope.org/en/about/
[12] Kojima, T., Ouchi, M., Rauch,
M., Ono, Y., Nakajima, K., Isobe, Y.,
Fujimoto, S., Harikane, Y., Hashimoto,
T., Hayashi, M., Komiyama, Y., Kusakabe,
H., Kim, J. H., Lee, C.-H., Mukae,
S., Nagao, T., Onodera, M., Shibuya,
T., Sugahara, Y., … Yabe, K. (2020). Extremely
Metal-poor Representatives Explored
by the Subaru Survey (EMPRESS).
I. A Successful Machine-learning Selection
of Metal-poor Galaxies and the Discovery
of a Galaxy with M* < 106 M ⊙
and 0.016 Z ⊙. The Astrophysical Journal,
898(2), 142. https://doi.org/10.3847/1538-
4357/aba047
[13] Stephens, T. (2020). Powerful new
AI technique detects and classifies galaxies
in astronomy image data. Retrieved
November 25, 2020, from https://news.ucsc.edu/2020/05/morpheus.html
[14] Hosch, W. L. (2019, November 28).
Supercomputer. Retrieved November 25,
2020, from http://www.britannica.com/
technology/supercomputer
[15] HPC Basics Series: What is Supercomputing?
(2019, March 11). Retrieved
November 25, 2020, from http://www.nimbix.net/what-is-supercomputing
[16] Peckham, O. (2020, July 24). Supercomputer
Simulations Delve Into
Ultra-Powerful Hypernovae. Retrieved
November 25, 2020, from http://www.hpcwire.com/2020/07/23/supercomputer-simulations-delve-into-ultra-powerful-hypernovae/
[17] Gough, E. (2020, July 21). Supercomputer
Simulation Shows a Supernova
300 Days After it Explodes. Retrieved
November 25, 2020, from http://www.universetoday.com/147096/supercomputer-simulation-shows-a-supernova-300-days-after-it-explodes.
[18] C., H. (2020, September 25). Scientists
May Have Developed New
Way to Detect ‘Invisible’ Black Holes.
Retrieved November 25, 2020, from
http://www.sciencetimes.com/articles/26727/20200803/scientists-developed-new-way-detect-invsible-blackholes.htm
[19] Penn State. (2020, August 3).
Unequal neutron-star mergers create
unique ‘bang’ in simulations. ScienceDaily. Retrieved November 24, 2020
from www.sciencedaily.com/releases/2020/08/200803184201.htm
Advancements in Aerospace
Rutvik Marathe
Rocket science: just about one of the easiest subjects in the
world. While we see launches becoming commonplace today,
this wasn’t at all the case just about 100 years ago. That’s right
- the venture of spaceflight is a very new one, requiring the
most precise and powerful technologies we have ever made. It
is far from easy; there are dozens of hurdles to overcome and
situations to account for. The most notable of these challenges
is due to Earth’s gravitational field, as lifting a rocket off the
ground and sustaining its flight requires a lot of fuel. So much so that roughly 80-90% of a rocket's mass is just fuel, preventing us from actually carrying much into space. If that wasn't enough, the more fuel you carry, the more additional fuel you have to bring to lift that original fuel. However, even with challenging problems
like these, society has made a lot of recent progress in launching
things into space. From companies like SpaceX, Boeing,
and Lockheed Martin, to government organizations like the
Indian and Chinese space agencies, we have been overcoming
the massive challenges that spaceflight presents by launching
every few weeks. So how did we get to this point?
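The "fuel to carry your fuel" problem described above is captured by the Tsiolkovsky rocket equation, Δv = v_e · ln(m0/mf). The sketch below uses representative, rounded numbers for a single stage (not any particular vehicle) to show why such a large share of a rocket's mass must be propellant.

```python
import math

# Tsiolkovsky rocket equation: delta_v = v_e * ln(m0 / mf),
# so the required propellant fraction is 1 - exp(-delta_v / v_e).
# Representative, rounded values -- not a specific rocket:
delta_v = 9400.0  # m/s, typical total delta-v to reach low Earth orbit
v_e = 3500.0      # m/s, effective exhaust velocity of a kerosene engine

propellant_fraction = 1.0 - math.exp(-delta_v / v_e)
print(f"Propellant fraction: {propellant_fraction:.0%}")
```

The exponential in the equation is the whole story: because exhaust velocity is fixed by chemistry, every extra bit of Δv demands exponentially more propellant, which is why staging and lighter materials matter so much.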
Modern rocketry began to develop in the late 1800s
and early 1900s. The development of aviation led to the first
attempts to launch things off the ground, and fuel-based propulsion came soon after. At this time, all flight was limited
only to the Earth’s atmosphere. But these initial steps helped
establish how rockets function (expelling something downwards
to go up), and the process of enhancing them could soon begin. From that point, early attempts to launch things
into space were made. The first successful space launch came
during the Cold War, when the Soviet Union launched the
satellite Sputnik 1. This was huge news! A country had finally
succeeded in putting something in orbit around the Earth.
And although the R-7 rocket that launched it was not very powerful by modern standards, it was a monumental
development in rocketry.
The previously mentioned challenges, like the high fuel requirements for rockets, meant that it wasn't as easy as
“just making a bigger rocket” to launch heavier things. That
approach would mean that we would need gigantic rockets -
much bigger than the ones we use today - to launch people and
supplies into space. It was clear that we had to make advancements
in the way the rocket was constructed and launched,
rather than simply keep the current system, but make it larger.
Significant advancements came during the time of the space
shuttle, in the 1980s. New materials were being tested for structures
like the fuel tank, most notably an aluminum lithium alloy.
This new material reduced the rocket’s weight by about
20%, making launching and escaping Earth’s gravitational well
easier, and allowing for greater payloads for missions. Another
big advancement in this era was made in reusability. At first,
rockets were designed to be one-time-use only, as recovery was
too complicated a process to attempt early on. NASA’s Space
Shuttle was the first breakthrough in this field, as the orbiter was recovered and reused across many missions. Additionally, SpaceX
has been a pioneer in creating reusable rocket boosters, which
fly back to Earth and land on a platform in the middle of the
ocean. Such technologies make space missions much cheaper
and allow them to run more quickly, as you don’t need to invest time
and energy into remaking these parts of the rocket.
Even with the current growth rate of technology, conventional
propulsion (burning fuel like we do today) doesn’t seem
like a long-term option if we want to expand past the Earth.
Such movement beyond our own planet would need a high frequency
of space missions, and therefore a lot of fuel. Not only
is this too costly for any group to carry out, it is also not a renewable energy source, so there is only a certain amount of it available for use [1]. The resource-heavy launch process of today could be improved through alternative propulsion methods. These methods, if made efficient, could be the next big advancement, allowing us to travel across our solar system and beyond.

NASA’s Evolutionary Xenon Thruster Project’s 7-kilowatt ion thruster. Source: NASA.gov.

Illustration of spacecraft powered by nuclear thermal propulsion. Source: NASA/Marshall.
One of these is electric propulsion - using electrical
energy to shoot ions out of the rocket and making the
rocket go forward via Newton’s third law. While the tiny
mass of ions means that the thrusters produce very low
acceleration, electricity would not be very hard to gather
and mass produce. In fact, solar panels on a rocket could
even “collect” fuel as the mission progresses! This type of
clean energy for propulsion is being heavily researched
and tested as a major source of fuel.
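The thrust of an ion engine follows directly from Newton's third law: thrust equals mass flow times exhaust velocity. The figures below are rounded ballpark values in the range of NEXT-class gridded ion thrusters, used purely for illustration.

```python
# Ion thruster back-of-the-envelope: thrust = mass_flow * exhaust_velocity.
# Ballpark, rounded values for illustration only:
thrust = 0.24        # newtons -- about the weight of a small strawberry
v_exhaust = 40000.0  # m/s, roughly ten times a chemical rocket's exhaust

mass_flow = thrust / v_exhaust  # kg of xenon expelled per second
# Tiny thrust, but an equally tiny propellant consumption rate:
print(f"Propellant use: {mass_flow * 1e6:.0f} mg per second")
```

The trade is clear from the arithmetic: for the same momentum delivered, a ten-times-faster exhaust needs a tenth of the propellant, which is why ion engines excel on long missions where patience is cheap and fuel is not.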
Another approach being explored is nuclear propulsion, where a nuclear reactor heats hydrogen propellant and expels it out of the end of the rocket, propelling it forward [2].
This technology is also being developed right now as a
possible “use” of all the nuclear bombs sitting idle in underground
bunkers. Many agree that they would be put
to better use in a rocket engine than being inactive (and
potential apocalypse devices!).
While these technologies are promising for the future,
they don’t seem to be as powerful as the traditional methods
of burning fuel. It is very likely that in the future,
rockets will use hybrid varieties of propulsion for different
scenarios. For example, they could use traditional fuels
for initial speed to leave the Earth’s surface, but electric
or nuclear propulsion to navigate through space.
There are many advancements that have been made in
rocketry, and many more to come. Rocketry is catching
speed as many private companies have joined the arena,
competing to make the cheapest and most efficient rockets
possible. For the next few decades, advancements in
this field will only continue to grow, fueled by the uniting
goal of expanding humanity past Earth!
Citations
[1] Bruno, T. (n.d.). The Properties of RP-1 and RP-2
MIPR F1SBAA8022G001. https://kinetics.nist.gov/Real-
Fuels/macccr/macccr2008/Bruno2.pdf
[2] 6 Things You Should Know About Nuclear Thermal
Propulsion. (2020b, January 21). Energy.Gov. https://
www.energy.gov/ne/articles/6-things-you-should-knowabout-nuclear-thermal-propulsion
Access to Space
Carter Moyer
It was almost ten years ago when the Space Shuttle Atlantis
touched down for the last time, bringing a close to the thirty-year-long
program and the United States’ ability to send
humans into space. Ever since, NASA and ESA have relied on
Roscosmos’ tried-and-true fourth-generation Soyuz rocket, one that was far cheaper, older, and safer than the American-made
shuttle. The past two Presidential administrations
have pushed for that to change, initiating the commercial crew
program and calling for the United States to once again partake
in human spaceflight, and this time under the veil of neoliberal
capitalism.
The program has largely supported space-industry-veteran
Boeing’s Starliner and astronautical-startup SpaceX’s Crew
Dragon programs, but delay after delay[1], cost overrun after
overrun[2], and an inflight safety test failure[3] have cast a shadow
on the program. And yet, in the middle of a global pandemic
and mass mobilizations against systemic racism and
anti-Blackness, something miraculous happened: SpaceX, not
Boeing, safely ferried two astronauts to and from the International
Space Station. There might have been a little booster
landing thrown in there as well.
This year has been full of space-related surprises and accomplishments,
actually. NASA’s Mars 2020 rover, now named Perseverance[4], paired with its copter buddy, Ingenuity[5], is on its way to the red planet. This rover is uniquely
poised to continue the goal of searching for life and surveying
Mars—the lightweight aerial drone will be able to go to areas
that Perseverance cannot, either due to geography or the sheer
time it would take to get to them. The rover will also be leaving
behind cached regolith samples[6] in the hopes that future
missions will be able to gather and study them.
And the United States is far from the only active participant
in space this year. China is hoping to launch its Chang’e-5
mission[7] later this year, carrying a lander that will collect
and then return lunar regolith samples to Earth, the first of
its kind since the Soviet Union’s Luna 24 mission. The United
Arab Emirates even had a Mars mission of its own, Hope[8],
launched aboard a Japanese Mitsubishi rocket.
All of these missions are, whether wholly or significantly,
supported by governments, with many also focusing on scientific
discovery. There is also, undoubtedly, a large degree of national
pride intertwined with these missions, but a motive that
is relatively absent is profit.
The 2010s have been the launchpad for the 2020s’ space
boom as nation states and multinationals alike pour money
into fleeing a planet literally and metaphorically on fire. Governments
will continue to launch scientific, exploratory, and,
yes, vanity missions, but what we are increasingly seeing is the
private sector taking up the monetization of space. It’s not a
new concept nor is it an under-discussed one, but it is starting
to come to fruition, specifically with SpaceX’s Starlink program[9].
There are currently over five hundred SpaceX satellites
in low Earth orbit, primed to offer internet connectivity
to the public.
With thousands more satellites planned and the telecommunications
industry handling trillions of dollars every year,
SpaceX is primed to make a lot of money[10] once these satellite
constellations are operational. And with SpaceX also operating
one of the only ways to get these satellites into LEO, the only private competitor that could plausibly rival them would be a pact between Amazon and Blue Origin. Enter Jeff
Bezos’ Kuiper constellation[11].
This seems to be a much more immediate, much more profitable
way to monetize space compared to the oft-lauded space
tourism industry which can only cater to a small number of
high-net-worth individuals and has no room for error[12].
Such an industry shift is also poised to redefine, or at the very
least close the gap between, the roles of governments and corporations
in space. For the longest time, only governments
could fund and take on the risk of space exploration. It’s why
the only consistent customers for sending people to the ISS
are governments. Yet, many people are aware of Elon Musk’s
plan to send people to Mars—it is the main mission of SpaceX,
its prime directive. To do so would be inordinately risky and
costly. Much like how Amazon Web Services is able to subsidize
the rest of Amazon, however, Starlink may very well be
the key ingredient[13] in paving the way for Elon Musk’s billionaire
space fantasies to become reality. The same applies to
Jeff Bezos, Blue Origin, and Amazon.
It’s far from the democratization of space once promised,
but this decade will determine whether the keys to space remain
exclusively in the hands of governments or are shared
with the megarich.
Citations
[1] Wattles, J. (2019, November 16). Boeing and SpaceX face
‘significant’ challenges in delayed NASA program. https://
www.cnn.com/2019/11/16/tech/spacex-boeing-nasa-oig-scn/
index.html.
[2] Smith-Schoenwalder, C. (2019, June 20). GAO: NASA
Programs Rack Up Delays, Cost Overruns. U.S. News & World
Report. https://www.usnews.com/news/national-news/articles/2019-06-20/gao-nasa-programs-rack-up-delays-costoverruns.
[3] Sheetz, M. (2020, March 6). NASA investigation finds
61 corrective actions for Boeing after failed Starliner spacecraft
mission. CNBC. https://www.cnbc.com/2020/03/06/nasa-finds-61-corrective-actions-for-boeings-starliner-spacecraft.html.
[4] Potter, S. (2020, July 30). NASA, ULA Launch Mars
2020 Perseverance Rover Mission to Red Planet. NASA.
https://www.nasa.gov/press-release/nasa-ula-launchmars-2020-perseverance-rover-mission-to-red-planet.
[5] Northon, K. (2018, May 11). Mars Helicopter to Fly
on NASA’s Next Red Planet Rover Mission. NASA. https://
www.nasa.gov/press-release/mars-helicopter-to-fly-onnasa-s-next-red-planet-rover-mission.
[6] Johnson, A., & Hautaluoma, G. (Eds.). (2020, June
17). The Extraordinary Sample-Gathering System of NA-
SA’s Perseverance Mars Rover – NASA’s Mars Exploration
Program. NASA. https://mars.nasa.gov/news/8682/the-extraordinary-sample-gathering-system-of-nasas-perseverance-mars-rover/.
[7] Jones, A. (2020, August 6). On its way to Mars, Chinese
spacecraft spots Earth and moon, aces steering maneuver.
Space.com. https://www.space.com/china-marsmission-spots-earth-and-moon.html.
[8] Bartels, M. (2020, July 19). United Arab Emirates
launches ‘Hope’ mission to Mars on Japanese rocket. Space.
com. https://www.space.com/hope-mars-mission-uaelaunch.html.
[9] Etherington, D. (2020, July 15). Leak reveals details
of SpaceX’s Starlink internet service beta program. Tech-
Crunch. https://techcrunch.com/2020/07/15/leak-revealsdetails-of-spacexs-starlink-internet-service-beta-program/.
[10] Sheetz, M. (2020, July 20). Morgan Stanley: SpaceX
could be a $175 billion company if Elon Musk’s Starlink
internet plan works. CNBC. https://www.cnbc.com/2020/07/20/morgan-stanley-spacex-could-be-175-billion-company-if-elon-musks-starlink-works.html.
[11] Grush, L. (2020, July 30). FCC approves Amazon’s internet-from-space
Kuiper constellation of 3,236 satellites. The
Verge. https://www.theverge.com/2020/7/30/21348768/
fcc-amazon-kuiper-satellite-constellation-approval.
[12] Australian Associated Press. (2018, December 17).
Richard Branson’s Virgin Galactic space flights criticised
as ‘dangerous, dead-end tech’. The Guardian. https://
www.theguardian.com/science/2018/dec/18/richard-bransons-virgin-galactic-space-flights-criticised-as-dangerousdead-end-tech.
[13] Sheetz, M. (2019, May 16). Elon Musk says SpaceX Starlink
internet satellites are key to funding his Mars vision.
CNBC. https://www.cnbc.com/2019/05/15/musk-on-starlink-internet-satellites-spacex-has-sufficient-capital.html.
Falcon 9 lifting off from the historic Launch Complex
39A, sending Crew Dragon to orbit on May 30, 2020. Source:
SpaceX.
Figure 1: The Timeline of the Universe (NASA, 2006)
Understanding the Chronology of
the Universe, from the Big Bang
to the End of Time
Andrew Tran
Overview
Understanding the past and future of our universe is an idea
that cosmologists have worked on for several decades, tying
into several big-picture, philosophical questions, such as “Why
are we here?” or “What is the destiny of humanity in this vast
universe?” Using theoretical models, calculations, and observations,
physicists have been able to determine the stages and
conditions that the universe has experienced from the Big Bang
to today. From what astronomers have measured, it has also
been possible to predict how the universe will look hundreds
of billions of years into the future. After analyzing the densities
of baryonic matter and dark energy, it has become known that
the universe is expanding at an accelerated rate, and using this
information allows for calculated inferences about the behavior
of the universe throughout its chronology, going as far back
as 13.7 billion years ago.
The Moment of Creation
The first stage in the timeline goes back about 13.7 billion years, to when it all began with the Big Bang. This moment
is often referred to as the ‘Planck epoch’ or the ‘Grand unification
epoch’, and marks a period of time that wasn’t even a
microsecond long [4]. At this stage, all four fundamental forces of nature, the three in the Standard Model (strong nuclear, weak nuclear, and electromagnetic) plus gravity, were unified. The universe was extremely hot, at around 10^30 kelvin.
A common misconception with the Big Bang is that it was
an explosion that allowed the universe to exist à la Genesis,
when really it was more like all of space expanding violently at
once, increasing the distance between all of the structures in
the universe that would eventually become galaxies and stars.
The Big Bang truly marks the transition that the universe took
from being barely a few millimeters across, to the cosmic size
that we can see today [2]. It is often denoted as the ‘birth’ of our
universe because it’s where the fundamental ideas and laws of
physics that we know today, such as general relativity and quantum
mechanics, begin to apply. This is when the four fundamental forces of physics (gravitational, strong nuclear, weak nuclear, and electromagnetic) began to separate from one another. We have been able to validate and justify the Big
Bang Theory, as it provides an explanation for many observations
we’ve made, such as the Cosmic Microwave Background
(CMB) and Hubble’s Law which indicates the expansion of the
universe.
The Infant Universe
Next is the period of time when the universe was only a few
hundred thousand years old, just an infant compared to its age
today. At this point, the scale of the cosmos had already begun
to inflate. The tiny subatomic fluctuations within the fabric of
the universe at this stage are speculated to have been the seeds
for what would someday become galaxies.
Figure 2: Galaxies like
NGC 4414 formed
thanks to tiny quantum
fluctuations. (NASA/
Hubble, 1999)
During infancy, the universe began to form several kinds
of subatomic particles, which would someday be classified as
quarks, leptons, and gauge bosons [6]. From these subatomic
particles, a large amount of matter and antimatter were
formed, which annihilated one another whenever they interacted.
However, the amount of matter just slightly exceeded the amount of antimatter, which is why the universe today contains mostly matter (though, if antimatter had been more abundant, we would have just ended up calling that matter anyway).
About 1 second after the Big Bang, protons and neutrons
(the essential building blocks of atoms) formed, and at around
2 minutes, collided, creating heavier elements such as deuterium
[7]. For its first several hundred thousand years, the universe was too hot for light to travel freely, so it was just a cloudy, blurry plasma permeating everywhere. After roughly 50,000 years, the universe began to cool down and came to be dominated by matter instead of radiation, eventually forming the first molecules [8].
Over 300,000 years later, with temperatures much lower, neutral atoms could be produced. This is the epoch known as “recombination.” Atoms of hydrogen and helium were formed, which are still the most abundant elements in the universe today. As the universe reached the end of its infancy, it started to become transparent: the ionized atoms attracted electrons, neutralizing them. These atoms no longer scattered light, so photons could travel freely, illuminating the stage of the cosmos. Photons released by the newly formed atoms can still be detected today as the cosmic microwave background radiation, the furthest we can peer back in time into the cosmos: glimpses of the leftover radiation emitted during this era, now stretched to microwave wavelengths.

Figure 3: The Cosmic Microwave Background Radiation (NASA/WMAP, 2010)

The Dark Ages

Unlike the Dark Ages following the fall of the Roman Empire, the cosmic Dark Ages refer to a time, lasting nearly a billion years, when the first stars and galaxies in the universe had yet to shine. The cosmos was making the transition out of the “soup” of subatomic particles.

What made the universe so “dark” at this time was that the light that could now travel freely was affected by the expansion of the universe, stretching out, or red-shifting, into wavelengths outside the visible spectrum. This darkness would last hundreds of millions of years. During the Dark Ages, the matter that occupied the universe consisted mostly of dark matter, along with neutral, uncharged hydrogen and helium [3].

Figure 4: An artistic representation of dark matter (Shutterstock)

Eventually, the most ancient stars and galaxies began to form as baryonic (ordinary) matter and dark matter accumulated into disk-like structures. This point is commonly referred to as the “Epoch of Reionization” [3]. Galaxy clusters would begin to form, slowly transitioning the universe out of the cosmic dark ages.

The Present Day (Galaxy Era)
After the dark ages, we’re brought to the present day, often
referred to as the ‘galaxy era’ of the universe. Sometime into
this stage, the Milky Way, then our solar system, and then the
Earth entered the universe. And then just under 13.7 billion
years following the Big Bang, the human race walked the Earth
for the first time. If you were to scale down the entire history
of the universe from the Big Bang until today into one calendar
year, humans would have appeared just before midnight
on New Year’s Eve.
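The calendar analogy is a simple proportion; a quick sketch with round numbers (13.7 billion years for the universe, roughly 300,000 years for Homo sapiens) recovers the "just before midnight" claim.

```python
# The "cosmic calendar": compress the universe's entire history into one
# 365-day year and ask when modern humans show up.
AGE_UNIVERSE_YR = 13.7e9  # years since the Big Bang
HUMANS_AGO_YR = 3.0e5     # rough age of Homo sapiens

fraction_of_history = HUMANS_AGO_YR / AGE_UNIVERSE_YR
minutes_before_midnight = fraction_of_history * 365 * 24 * 60
print(f"Humans appear about {minutes_before_midnight:.0f} minutes "
      "before midnight on December 31")
```

On that scale, all of recorded human history fits into the final seconds of the year.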
Figure 5: The
golden age of
our universe?
(NASA/Hubble,
2003)
There is an estimated maximum of two trillion galaxies in
the observable universe. Given our observations of the incoming
light from galaxies, we have been able to conclude that the
universe is expanding at an accelerating rate. The more mass an object has, the more gravitationally attractive it will be, so one would expect that the combined masses
of all the galaxies and groups of galaxies in the universe would
result in everything collapsing in on one another. Since this
isn’t the case, it means there is a mysterious force, which we
still don’t know much about, pushing everything apart: dark
energy. We have been able to conclude that dark energy makes up 68% of everything in the universe, dark matter 27%, and normal, baryonic matter barely 5% [5]. This makes sense, since the expansion can only accelerate if the density of matter is less than the density of dark energy.
If the universe is expanding at an accelerating rate, that
would mean that the galaxies are getting further apart from
one another. Eventually, humanity will see fewer and fewer
stars in the night sky. Our descendants, in the very distant future, may not get to enjoy astronomy and stargazing as we do today.
Eventually, the last stars in the universe will dim, perhaps explode in supernovae, and then shut off for eternity.
This brings us to the last stage in our timeline of the universe.
What will be in store for existence as we know it?
The Future and Fate of Our Universe
The last stage of the universe timeline brings us to a point
where the stars and galaxies begin to stop forming. The universe
continues to expand at an accelerating rate, due to the
effects of dark energy. Given current models and data that we
have in cosmology, the most likely scenario that the universe
will experience is the “Big Freeze,” in which the universe will
keep expanding as its temperature approaches absolute zero.
Some other theories, such as the “Big Rip” and the “Big Crunch”
involve the universe going out in a spectacular and flashy way.
But the one that seems to be our destiny is cold and silent.
Eventually, once all of the stars have lived out their lives, all that will be left in the universe are black holes, feeding on anything that strays near them, and perhaps a few white dwarfs [1]. It has been theorized that by this point protons will have decayed as well. The beautiful cosmos we once knew will become a bizarre place occupied mostly by stellar corpses twisting and turning spacetime. During this period, black holes may merge and release gravitational waves.
However, all things in the universe must come to an end, and this includes black holes. Through the phenomenon known as Hawking radiation, black holes slowly evaporate as a result of quantum effects near the event horizon, the boundary at the edge of a black hole and the “point of no return” beyond which nothing can escape [1]. When the last black hole evaporates, the universe will see one final glimmer of light as the last stellar remnant disappears. Then everything will go dark. Life will no longer be able to thrive in this universe, and the concept of time will become irrelevant.
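The evaporation described here happens on almost incomprehensible timescales. The standard order-of-magnitude estimate is t ≈ 5120πG²M³/(ħc⁴); the short Python sketch below evaluates it for an illustrative solar-mass black hole (the constants and function name are my own, not from the article’s sources):

```python
import math

# Physical constants (SI units)
G = 6.674e-11       # gravitational constant, m^3 kg^-1 s^-2
hbar = 1.055e-34    # reduced Planck constant, J s
c = 2.998e8         # speed of light, m/s
M_SUN = 1.989e30    # solar mass, kg

def hawking_evaporation_years(mass_kg: float) -> float:
    """Approximate evaporation time of a black hole via Hawking radiation."""
    t_seconds = 5120 * math.pi * G**2 * mass_kg**3 / (hbar * c**4)
    return t_seconds / 3.156e7  # convert seconds to years

# A solar-mass black hole takes on the order of 10^67 years to evaporate,
# vastly longer than the ~1.4e10-year current age of the universe.
print(f"{hawking_evaporation_years(M_SUN):.1e} years")
```

Since evaporation time scales with the cube of the mass, supermassive black holes take vastly longer still, which is why this final flicker of light lies unimaginably far in the future.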
Perhaps it won’t be all bad, though. The last survivors, which may include humans, could find a way to escape this universe for an entirely different one. Physicists have long postulated the idea of a multiverse, and if it is real, then life, and humanity with it, could live on to see another day.
Citations
[1] Adams, Fred C.; Laughlin, Gregory (1997). “A dying universe: the long-term fate and evolution of astrophysical objects”. Reviews of Modern Physics. 69 (2): 337–372. arXiv:astro-ph/9701131. doi:10.1103/RevModPhys.69.337.
[2] Bridge, Mark (Director) (30 July 2014). First Second of the Big Bang. How The Universe Works. Silver Spring, MD: Science Channel.
[3] Byrd, D. (2017, July 16). Peering toward the Cosmic Dark Ages. EarthSky. https://earthsky.org/space/cosmic-dark-ages-lyman-alpha-galaxies-lager
[4] Chow, Tai L. (2008). Gravity, Black Holes, and the Very Early Universe: An Introduction to General Relativity and Cosmology. New York: Springer. ISBN 978-0-387-73629-7.
[5] Dark Energy, Dark Matter. (n.d.). Retrieved November 26, 2020, from https://science.nasa.gov/astrophysics/focus-areas/what-is-dark-energy
[6] First Light & Reionization - Webb/NASA. (n.d.). Retrieved November 26, 2020, from https://jwst.nasa.gov/content/science/firstLight.html
[7] Kolb, Edward; Turner, Michael, eds. (1988). The Early Universe. Frontiers in Physics 70. Redwood City, CA: Addison-Wesley. ISBN 978-0-201-11604-5.
[8] Ryden, Barbara Sue (2003). Introduction to Cosmology. San Francisco: Addison-Wesley. ISBN 978-0-8053-8912-8.
[9] WMAP Big Bang Theory. (n.d.). Retrieved November 26, 2020, from https://map.gsfc.nasa.gov/universe/bb_theory.html
Figure 6: The universe will be dominated by these
stellar predators. (NASA/JPL, 2013)
SkyShot Autumn 2020
2002 KM6 (99795)
Naunet Leonhardes-Barboza
single points of excited light
sparkle the darkness
a chill breeze in the latest hours of the night
—blink, there she is flying
magnitudes brighter than her neighboring stars
she still stands out
a white dot among a sea of white dots
method of gauss
Naunet Leonhardes-Barboza
switch into gaussian time
we stare at the whiteboard, exhausted
perhaps, similar to carl once did
he looked at his desk
contemplating questions for the universe
there are no limits, and he wants to find ceres
ask your fellow intellect, like he did kepler
laugh and cry when it’s over
we now know where that small rock is going
tessellated constellations
Alex Dong
gazing upon the expanses
of the great night sky
our endless wonder
at the mosaic sea of stars
is inexplicably bridled
by an intangible feeling that
the abyss separating
our world and distant realms
forms a chasm of black emptiness
meant to isolate different beings
are we imprisoned on this earth
destined to stay indefinitely
watching other worlds pass by
time and time again until
time can no longer describe
our passivity to their passing
through a window of obscurity
permitting only imagination to
penetrate its dark tinted display
of anonymous secrecy
and us the victims of our own nature
able to imagine but not to reach out
and satisfy our insatiable curiosity
we begin to realize the paradoxical
plight of human existence
as our seemingly powerful earthly
dwelling is humbled by our
temporary and insignificant
presence in the vastness of
the galaxy and the cosmos
so we are fated to be
mighty yet powerless
an existence woven in
a tessellated fabric of
captivating constellations and
starry specks mirroring
the very state of our planet
our world our reality
brimming with wonder and
glimmering with hope
starry dreams
Alex Dong
stars
dance around in twilight
a few points on Nyx’s blanket
incomprehensible yet
sought to be understood
but neither wonder nor courage
can capture the essence of
their immortal shine
stars
their luminescence a perpetual comfort
their presence an eternal gaze yet
their longevity a reminder that
our time spent curiously probing is
just a second in their eon
their time is not ours and
still we look on
stars
on the edge of the galaxy
have already lived their lives
and faded into oblivion
our witness of their glory
an elegy to the past
time itself a warped cycle
of dreamy slumber and starry imagination
images of the past
Naunet Leonhardes-Barboza
she can feed her own intellectual curiosity
from the fruit of the delectable tree of life
a red apple will fall upon her own head
so she, too, can discover something
she looks up at the night sky in awe
unaffected by her immediate environment
she, much like these glimmering dots
is not yet disillusioned
by the harsh realities of humanity
but there was a time she almost gave in,
she believed the stars were static
they would never attempt to deceive her
unlike the unforeseen friend or foe
the illusion of simplistic dots
the tempting twinkling as if saying “look at me closer”
yet, they were clouded by natural gases in Earth’s air
the stars can’t show her everything at once
unless with aid for her human eyes
she learned the truth about the lives of the stars
she took images of the past millions of years ago
she observed rainbows of color in the black void
and the intensity of rocks flying through space
she wondered if the stars would look back at her
and dismiss her life as a dot, dim with little hope
perhaps the stars and her feel the same way?
wishing to project the light of the present
and show each other their grand, dynamic journeys
unseen skies
Alex Dong
they say
shoot for the moon
even if you miss
you’ll hit the stars
but some shots fall short
and the stars they never
witness our dreams
answer our prayers
our shots ring unheard
through the infinite darkness
the unknown chasm
oblivion
yet we shoot and we work
almost mindless in repetition
almost mechanical in movement
almost purposeless in routine
we become an echo
perpetuating the mechanism
that bounds us to its cycle
stifled in our own gasps for breath
our spirits dim and flicker
as gatsby’s green light winks out
our dreams can only remain
faraway fantasies forever
but through pensive nights we gaze up
to wisps of black and rays of dark
in search of the place the purpose
of our endless tunnel of toil
these dark skies yield no answer
yet somewhere within us we know
that honest truth lies far beyond
horizons we may perceive
so when darkness falls at the thirteenth hour
we shoot again hoping
one day
someday
we’ll reach the unseen skies
Astrophotography Details
Ryan Caputo - Tulip Nebula (Sh2-101)
Dates: July 4, 2020, July 5, 2020, July 8, 2020
Imaging and Guiding:
Guan Sheng Optical Classical Cassegrain 6” F/12
Mount: iOptron CEM60
Imaging camera: ZWO ASI1600MM
Guiding camera: ZWO ASI290MM Mini
Editing Programs and Techniques:
Software: N.I.N.A., PixInsight 1.8 Ripley
Filters: Radian Triad-Ultra
Ryan Caputo - Hercules Galaxy Cluster (Abell 2151)
Dates: April 28, 2020, May 18, 2020, May 29, 2020, May 30, 2020
Imaging and Guiding:
Guan Sheng Optical Classical Cassegrain 6” F/12
Mount: iOptron CEM60
Imaging camera: ZWO ASI1600MM
Guiding camera: ZWO ASI290MM Mini
Editing Programs and Techniques:
Software: N.I.N.A., PixInsight 1.8 Ripley
Wilson Zheng - Horsehead Nebula (Barnard 33)
Date: March 27, 2020
Location: Dixon Observatory, Berkshire School, MA, USA
Equipment: Meade LX200 GPS 14” f/10
Camera Details: ZWO ASI1600MC
Acquisition Details: 27 @ 60 seconds
Editing Programs and Techniques:
Processed with PIXINSIGHT
Wilson Zheng - Messier 3 (near Bootes)
Date: April 16, 2020
Location: Dixon Observatory, Berkshire School, MA, USA
Equipment: Meade LX200 GPS 14” f/10
Camera Details: ZWO ASI1600MC
Acquisition Details: 240 @ 30 seconds
Editing Programs and Techniques:
Processed with PIXINSIGHT
Wilson Zheng - Sunflower Galaxy (Messier 63)
Date: March 27, 2020
Location: Dixon Observatory, Berkshire School, MA, USA
Equipment: Meade LX200 GPS 14” f/10
Camera Details: ZWO ASI1600MC
Acquisition Details: 360 @ 30 seconds
Editing Programs and Techniques:
Processed with PIXINSIGHT
Anavi Uppal - Star Trails over Pierson College
Date: August 26, 2020
Location: Pierson College at Yale University
Camera: Nikon D500
Lens: Rokinon 10mm F2.8 Ultra Wide Angle Lens
Sky: 84 images @ f/2.8, 10mm, ISO 1600, 30 sec.
Foreground: 1 image @ f/2.8, 10mm, ISO 1600, 1.6 sec
Anavi Uppal - Comet NEOWISE (Wide)
Date: July 19, 2020
Location: Orlando, Florida
Camera: Nikon D500
Lens: Nikon AF-P DX NIKKOR 18-55mm f/3.5-5.6G VR
Acquisition Details: 12 images @ f/5.6, 55mm,
ISO 1600, 6 sec
Anavi Uppal - Comet NEOWISE (Zoomed)
Date: July 19, 2020
Location: Orlando, Florida
Camera: Nikon D500
Lens: Nikon AF-P DX NIKKOR 18-55mm f/3.5-5.6G VR
Acquisition Details: 12 images @ f/5.6, 22mm,
ISO 1600, 8 sec
Owen Mitchell - Comet NEOWISE
Date: mid-July 2020
Location: Bozeman, Montana
Equipment: William Optics Redcat 51
Camera Details: Canon SL2 at 250mm on an
iOptron SkyGuider Pro
Acquisition Details: Stacked exposures totaling 2 minutes
Owen Mitchell - Milky Way Galaxy
Date: Summer 2019
Location: Etscorn Campus Observatory at
New Mexico Tech
Camera Details: Canon SL2 with a Rokinon 14mm lens on
an iOptron SkyGuider Pro
Acquisition Details: 25 seconds
Nathan Sunbury - The Ring Nebula (Messier 57)
Date: Summer 2016
Location: Sommers-Bausch Observatory,
University of Colorado Boulder
Nathan Sunbury - The Moon
Date: Summer 2016
Location: Sommers-Bausch Observatory,
University of Colorado Boulder
Cameron Woo - Pleiades (Messier 45)
Date: January 20, 2020
Location: New Jersey
Camera Details: AF-S Nikkor 55-300mm f/4.5-5.6G
ED DX lens at 300mm, f/5.6, and 1 second
Acquisition Details: 160 light frames at ISO 6400,
18 darks, 22 bias, and 18 flats
Software: stacking performed in Siril on macOS,
editing performed in Affinity Photo
Cameron Woo - The Summer Triangle Asterism
Date: August 11, 2020
Location: New Jersey
Camera and Acquisition Details:
Rokinon 16mm f/2.0 lens for crop sensor
Nikon cameras at f/4.0, 25 seconds, ISO 3200
Additional Details: 130 light frames, 20 darks,
20 bias, and 18 flats
Software: stacking performed in Deep Sky Stacker on
Windows 10, editing performed in Affinity Photo
Cameron Woo - Milky Way Galaxy
Date: August 30, 2016
Location: Kaanapali, Maui, under Bortle 3-4 skies
Camera Details: Nikon AF-S Nikkor 35mm f/1.8G DX lens
Acquisition Details: f/1.8, at ISO 800 or 1600
for 5-8 seconds
Software: frames stacked in Deep Sky Stacker on
Windows 10, edited in Affinity Photo
Jonah Rolfness - Sadr Region
Mount: iOptron iEQ30 Pro
Camera: ASI1600mm
Filters: Astronomik 6nm Ha, OIII
Optics: Rokinon 85mm F1.4 lens
Acquisition Details: 600x120sec Ha shot at F1.4
234x120sec OIII shot at F1.4
Preprocessing:
WBPP Script used to calibrate all lights with
darks, flats, and flat darks
SubframeSelector used to apply various weights to each frame
StarAlignment used to register each channel
ImageIntegration used to create masters for each channel,
ESD Rejection Algorithm
Postprocessing:
DynamicCrop and alignment of each master channel
DBE to remove gradients
Linear Fit channels
MLT Noise Reduction on each channel
Ha very aggressively stretched by STF to increase contrast
OIII given normal STF stretch
Pixelmath used to combine to a HOO palette
Various curve transformations on saturation and
RGB brightness
LHE and MLT Bias applied for sharpening
Aggressive star reduction applied using StarNet and
morphological transformation
CloneStamp tool used to salvage the Crescent and Tulip
Nebulae after StarNet ate them
DarkStructureEnhance Script
Cameron Woo - The Orion Nebula (Messier 42)
Date: late December 2019
Location: Bergen County suburbs 30 minutes from
Manhattan, under Bortle 8 skies
Camera Details: AF-S Nikkor 55-300mm f/4.5-5.6G
ED DX lens at 300mm, f/5.6, 1 or 1.3 seconds
Acquisition Details: 183 frames at ISO 3200, 70 frames at
ISO 6400, 100 at ISO 12800
Additional Details: 14 flat frames, 36 dark frames,
42 bias frames
Software: stacking performed in Siril on macOS,
editing performed in Affinity Photo
Jonah Rolfness - The Orion Nebula (Messier 42) and
Running Man Nebula (Sh2-279)
Mount: iOptron iEQ30 Pro
Camera: ASI1600MMC
Filters: ZWO LRGB, Astronomik Ha, SII, OIII
Telescope: GSO 6in F/5 Newtonian Reflector
Autoguider: QHY5L-II-M paired with the
Orion 60mm guide scope
Acquisition Details:
Panel 1:
-40x5, 41x60, 38x300 Ha
-52x120 R
-50x120 G
-48x120 B
Panel 2:
-38x5, 41x60, 42x300 Ha
-51x120 R
-50x120 G
-49x120 B
Software: PIXINSIGHT
Kappa Sigma Stacking in DSS to deal with geo sats
HDR combine on Ha stacks for HDR Luminance
MMT Noise Reduction
Histogram stretch
-HDR Transformation to bring the core to proper levels
-Color Combination
-Curve Transformations to bring out red colors,
lower background RGB, and saturation
-Luminance addition
-ACNDR in post-linear state to further reduce noise
-SCNR to reduce green hue
-Star Reduction
Jonah Rolfness - The Pinwheel Galaxy (Messier 101)
Mount: iOptron iEQ30 Pro
Camera: ASI1600MMC
Filters: Ha/SII/OIII Astronomik 6nm and ZWO LRGB
Telescope: GSO 6in F/5 Newtonian Reflector
Autoguider: QHY5L-II-M paired with the
Orion 60mm guide scope
Acquisition Details:
Roughly 800x120sec Luminance
200x120sec RGB
450x120sec Ha
Preprocessing:
Stacked and calibrated using DSS Kappa Sigma Clipping
Postprocessing:
DynamicCrop and alignment of each master channel
DBE to remove gradients
Deconvolution script applied to bring out detail in galaxy core
MLT Noise Reduction on each channel
LRGB Combination to create final master with STF stretch
Separate STF for Ha channel
Various curve transformations on RGB brightness
and saturation
Added in Ha layer using pixelmath to
brighten the red nebula in the galaxy
Jonah Rolfness - The Heart Nebula (IC 1805) - Fish Head
Nebula (IC 1795) and Melotte 15 Mosaic
Mount: iOptron iEQ30 Pro
Camera: ASI1600MMC
Filters: Ha/SII/OIII Astronomik 6nm
Telescope: GSO 6in F/5 Newtonian Reflector
Autoguider: QHY5L-II-M paired with the
Orion 60mm guide scope
Acquisition Details:
Panel 1: 300x12, 1800x3 Ha; 300x11 OIII; 300x10 SII
Panel 2: 300x21, 1800x4 Ha; 300x24 OIII; 300x20 SII
Panel 3: 300x22, 1800x4 Ha; 300x22 OIII; 300x16 SII
Panel 4: 300x23, 1800x4 Ha; 300x24 OIII; 300x25 SII
Processing- Stacking:
Used DSS to create master lights for short Ha, long Ha, OIII,
and SII
Stacked both long and short Ha to create a master Ha light.
Appropriate darks and flats were used
Pixinsight:
DynamicCrop and DBE on all 12 stacks (Ha, OIII, SII)
StarAlignment used to create a rough mosaic for each filter
GradientMergeMosaic and DNALinearFit used to create a
final mosaic for each filter.
Noise reduction using MultiscaleLinearTransform applied for
each master frame.
ChannelCombination used to create a master SHO image.
SCNR green applied to both the regular and inverted image
(magenta star reduction)
Many different curve transformations, boosting saturation,
shifting hue, and reducing RGB background levels.
TGVDenoise and ACNDR applied to further eliminate noise
Educational Opportunities
Scholarships
The Science Ambassador Scholarship: A full–tuition scholarship
for a woman in science, technology, engineering, or math. Funded
by Cards Against Humanity. Open to high school and undergraduate
students. Applications close December 14th, 2020 at 11:59PM CST.
Richard Holland Memorial Scholarship: Open to high school and
undergraduate students. Applications will be accepted online starting
January 1, 2021 through March 15, 2021.
The Gladys Carol Scholarship Program, Inc: Open to high school
seniors, high school graduates, and current undergraduate level students
who are United States citizens or permanent residents. Application
process opens on January 1, 2021.
SBB Research Group STEM Scholarship: Available to currently
enrolled full-time students pursuing a STEM degree. Awarded on a
quarterly basis in 2021; the next application deadline is February 28,
2021.
CC Bank’s Young Scholars Scholarship: Each year CC Bank’s
Young Scholars Scholarship offers up to five $2,000 scholarships to
students attending universities, colleges, and other academic institutions
across the U.S. Applicants must apply by Thursday, December
31, 2020, to get the scholarship during the 2021-2022 academic year.
Lockheed Martin STEM Scholarship: For high school seniors and
current college freshmen or sophomores.
Women in Aerospace Scholarship
Women in Technology Scholarship (WITS) Program
Google Scholarships for Computer Science
Annual Collabera STEM Scholarship
Regeneron Science Talent Search
The Gates Scholarship
Programs and Internships
Summer Science Program in Astrophysics: Open to rising juniors
and seniors in high school. Applications open on December 15, 2020.
California State Summer School for Mathematics and Science
(COSMOS): Open to students in grades 8-12. Applications due in early
2021.
NASA SEES: Open to high school students. Applications due in
early 2021.
Yale Summer Program in Astrophysics (YSPA): Open to rising high
school seniors. On temporary hiatus for 2021, will reopen in 2022.
Jet Propulsion Laboratory Summer Internship Program: Open
to undergraduate and graduate students pursuing degrees in STEM
fields. Applications due on March 31, 2021.
Boston University Research in Science & Engineering Program
(BU RISE): Open to rising high school seniors. Applications open on
December 15, 2020.
Research Science Institute by the Center for Excellence in Education
(RSI): Open to rising high school seniors. Applications due on
January 16, 2021.
Science Internship Program at UC Santa Cruz
Research Mentorship Program at UC Santa Barbara
Kopernik Observatory & Science Center High School Internship
Program
Alfred University Astronomy High School Institute
Space Telescope Science Institute Space Astronomy Summer Program
APL Johns Hopkins STEM Academy
International Opportunities
Work Experience at the Australian National University: The observatory offers a limited number of work experience places to
year 10, 11 and 12 students each year. These placements are typically 1 week in duration and the students work on an astronomical
project under the supervision of professional astronomers.
International Astronomy Summer Internship Program: The summer internship is designed for pupils in the final years of high
school, or those who have just finished high school. During their three-week stay, participants work on a variety of astrophysical
observations and experiments.
Contributor Biographies
Priti Rangnekar is a freshman at Stanford University,
majoring in computer science and engineering physics.
She has researched asteroid orbit determination at
the 2019 Summer Science Program, conducted a data-driven
astrophysics Senior Project at BASIS Independent
Silicon Valley, and analyzes exoplanet transits through
a collaborative project with AAVSO observers. As a
7-time national qualifier and Tournament of Champions
quarterfinalist in speech and debate, as well as a scientific
writer for student journals, she champions the value of
logical thinking and effective communication in a variety
of fields. She has received recognition in international
competitions for computing and business, and she enjoys
conducting STEM outreach. She seeks to connect fellow
students around the world while fostering knowledge
and hands-on exploration of the universe in an inclusive,
engaging community.
Rutvik Marathe is a freshman at the University
of Michigan, majoring in aerospace engineering. An
ardent space enthusiast, Rutvik has conducted asteroid
orbit determination research at the Summer Science
Program in 2019, pursues astrophotography, and has
independently studied topics in orbital mechanics and
chaos theory. In addition to space-related endeavors, he
has earned recognition at the state level in varsity debate
and competitive mathematics, as well as at the national
level in programming. With expertise in team leadership
and teaching STEM, he strives to promote curiosity and
interest for the universe and space exploration through
SkyShot.
Naunet Leonhardes-Barboza is a young Costa Rican-
American woman planning to major in astrophysics and
computer science at Wellesley College. She has experience
volunteering for The Rockland Astronomy Club and for
Girls’ World Expo as a Bright Ideas/Science Coordinator.
An alum of the 2019 Summer Science Program in
Astrophysics, she has learned and researched in both the
Stull Observatory and Sommers-Bausch Observatory. She
loves spending her free time writing poetry about space
and further exploring her interest in astronomy through
all mediums.
Carter Moyer is a freshman at Harvey Mudd College,
majoring in engineering and political science. He is also an
alum of the 2019 Summer Science Program in Astrophysics.
Vighnesh Nagpal is a freshman at UC Berkeley hoping
to pursue a double major in Physics and Astrophysics.
He is fascinated by everything ‘space’, having gained
research experience through the Summer Science Program
in Astrophysics and through working with scientists at
Caltech on exoplanet research. He hopes to
continue exploring and learning about the exciting state
of astronomy today.
Ezgi Zeren is a freshman at Tufts University, majoring in
mechanical engineering. She hails from Istanbul, Turkey.
Although it is difficult to see the stars and other celestial
objects from Istanbul’s crowded streets, she considers herself
lucky to have participated in SSP at CU Boulder in 2019.
“At the Sommers-Bausch Observatory, I looked through a
telescope for the first time in my life. Since then, I try to
share what I explore looking at the sky with others who
want to see and know more.”
Anavi Uppal is a freshman at Yale University who plans
on double majoring in astrophysics and computer science.
She is an alum of the 2019 Yale Summer Program in
Astrophysics, where she researched the newly discovered
supernova SN 2019hyk. She was an intern at the 2018
NASA STEM Enhancement in Earth Science (SEES)
Program, where she helped design a lunar habitat and lunar
mission. Anavi greatly enjoys participating in astronomy
outreach, and hopes to inspire others to see the beauty in
our universe. She often volunteers at astronomy nights
and occasionally gives talks to the public. Anavi has been
doing astrophotography for five years, and specializes in
landscape astrophotography. Her work can be viewed on
Instagram at @anaviuppal.
Victoria Lu is a freshman at Yale University. She is
prospectively double majoring in art and evolutionary
biology. She is an alum of the 2019 Summer Science
Program in Astrophysics. Victoria seeks to connect global
communities on contemporary issues, such as climate
change and conservation, through scientific research and
education.
Alexandra Masegian is a second-year student at
the University of Chicago, where she is majoring in
astrophysics and minoring in data science and creative
writing. Her primary interests lie in stellar and extragalactic
astrophysics, and she has been an astrophysics research
intern at the University of California, Santa Cruz, the
American Museum of Natural History, and Fermi National
Accelerator Laboratory. She is also an alum of the 2018
NASA STEM Enhancement in Earth Science (SEES)
Program, where she was a member of the Explore the Moon
team. Alex is passionate about science communication
and outreach, and hopes to spend her career broadening
humanity’s knowledge of the vast and beautiful universe
we live in. Her work can be found in SkyShot itself, as
well as in The Spectrum, a UChicago-based e-publication
about science and society.
Andrew Tran is a second-year student at the University
of Georgia majoring in astrophysics and minoring in
mathematics. He has been involved in many facets of the
astrophysics community, as a former NASA intern, as an
undergraduate researcher in the Department of Physics
and Astronomy at his school, and as the creator and founder
of Astrophysics Feed, a science media page on Instagram
(@astrophysicsfeed). In his spare time, Andrew likes
astrophotography, reading books about space or physics,
and learning anything about the world and universe.
Sofia Fausone is from Northern California. She is
currently a first year at Yale and hopes to double major in
physics and math. She’s especially interested in theoretical
physics and is excited to explore different areas of each
field.
Timothy Hein is a Computer Engineering Major at
Purdue University. Though his passions include technical
design, coffee, and classical literature, he plans on pursuing
a career in early stage venture capital.
Abby Kinney is a freshman at Williams College interested
in studying physics. Originally from Washington, she is an
alum of the 2019 Summer Science Program in Astrophysics.
In her free time, she enjoys observing the night sky.
Owen Mitchell is a college freshman at Johns Hopkins
University. Originally from Montana, he is an alum of the
2019 Summer Science Program in Astrophysics at New
Mexico Tech.
Jonah Rolfness is a college freshman at the California
Institute of Technology. Originally from Arizona, he is an
alum of the 2019 Summer Science Program in Astrophysics
at New Mexico Tech.
Feli is a high school senior from Ohio interested in
astronomy and astrophysics. They are an alum of the
2020 Summer Science Program in Astrophysics, where
they tracked the path of a near-Earth asteroid. In their free
time, they enjoy reading, stargazing, and learning more
about space!
Ryan Caputo is a freshman at the University of Colorado
Boulder. Originally from Texas, he is an alum of the 2019
Summer Science Program in Astrophysics.
Alex Dong is a first-year student at Yale from Canada,
having recently graduated with a bilingual high school
diploma and the AP International Diploma accolade. He
is also an alum of the 2019 Summer Science Program in
Biochemistry. At Yale, he is planning to major in Molecular
Biophysics and Biochemistry, although his passion for
poetry—especially themed around astronomy—will
continue to occupy his spare time.
Nathan Sunbury is a senior at Harvey Mudd College
hailing from California. He is an alum of the 2016 Summer
Science Program at the University of Colorado Boulder.
Sydney Wang is from Dalian, China and went to
the Webb Schools. He currently studies physics at The
University of Pennsylvania.
Cameron Woo is a freshman at the University of
Pennsylvania in the School of Engineering and Applied
Science. Originally from New Jersey, he is an alum of the
2019 Summer Science Program in Astrophysics at the
University of Colorado Boulder.
Wilson Zheng hails from Shanghai, China, and is a high
school senior at Berkshire School. He is also an alum of the
2020 Summer Science Program in Astrophysics.
Autumn 2020
skyshotonline.com