Yale Scientific
Established in 1894
THE NATION’S OLDEST COLLEGE SCIENCE PUBLICATION
October 2012
Vol. 85 No. 4
COMPETING
VISIONS
Will science sway your vote?
PAGES 20-22
The link between protein
expression and alcoholism
propensity in Mus musculus
Sizing Up Autism
The correlation between
autism spectrum disorders
and abnormal head growth
Pluripotent Politics
How the challenges of
federally-sponsored stem cell
research impact scientists
PAGE 6 PAGES 16-17 PAGES 20-22
Everyday Q&A
What is the Higgs Boson?
The science behind the “God particle.”
BY ANDREW DEVEAU

On July 4, 2012, scientists at the European Organization for Nuclear Research — the world's largest particle physics laboratory, also known by its French acronym CERN — announced the discovery of a particle likely to be the long-sought-after Higgs boson. Putting aside the public hype of the Higgs as a "God particle," what exactly does this discovery mean?

The Higgs is the last expected piece of a theory called the Standard Model, which essentially describes how the universe works on a subatomic scale. It is by the Standard Model, for example, that we can manipulate electrons precisely enough to see individual atoms through microscopes. In spite of all it can explain, however, the Standard Model has a missing piece.

The model predicts that electromagnetism and the weak force, which causes nuclei to decay, are really two sides of the same coin, and that these forces are carried by counterpart particles: photons, and W and Z bosons. Because of the analogous nature of the two forces, scientists expected the W and Z bosons to be massless like the photon, but these particles were surprisingly found to have significant mass, leaving researchers puzzled by the inconsistency.

The addition of the Higgs to the Standard Model accounts for the seemingly inexplicable mass of the W and Z bosons: Higgs particles create a field or "fluid" (known as the Higgs condensate) through which all particles in existence continually travel. This field acts as a barrier to movement, and varying degrees of resistance, depending on the properties of the particle in question, contribute to perceived mass. Some particles, like W and Z bosons, slow down in the Higgs field, while other particles, like photons, zip through it unaffected, massless and minute.

Although this model serves as a strong theory, the existence of the Higgs had been difficult to prove — until this summer's discovery. While CERN researchers have yet to confirm the discovered particle's identity definitively, it is very likely the missing piece in our physical understanding of the universe.

An artist's approximation of a collision of two protons that produces a Higgs. Courtesy of CERN/National Geographic.

What Does Space Smell Like?
A whiff of the final frontier.
BY AHMED ANSARI

Have you ever thought about what space could smell like? Describing the stench of space as "sulfurous" and "metallic," astronauts have reported scents similar to seared steak, hot metal, and gunpowder. There is no firm consensus, but these accounts agree on one thing: space stinks.

According to astronauts, the smell is not apparent when wearing a spacesuit: an odor emerges only after they remove their suits in the safety of the space shuttle. Scientists think that this alien aroma may arise because particles from outer space continue to cling to spacesuits inside the space shuttle. These particles, likely polycyclic aromatic hydrocarbons that form in space during combustion after a star's death, mix with air in the space shuttle to create a distinctive smell.

The odor has become so notorious that NASA is trying to recreate the smell of space back on Earth for astronaut-training purposes. With the help of a scent chemist who recreated the scent of the Mir space station for an art exhibition, NASA hopes the scent of space itself will be recreated just as successfully. Until then, the smell remains just another elusive mystery of outer space.

Particles in space may mix with air in the space shuttle to create a distinct "space" smell. Courtesy of Acclaim Images.
2 Yale Scientific Magazine | October 2012 www.yalescientific.org
NEWS
5 Letter from the Editor
6 New Grant For Cancer Research
6 Alcoholism in Mice
7 Professor Joan Steitz Honored as Role Model for Women in Science
7 Professor Horwich Receives Shaw Prize
8 Yale Professor Fact Checks Nature
9 Natural Birth Induces Protein Linked to Brain Development
10 On The Road to Attenuating Neuropathic Pain
11 Stress and Ecosystems
FEATURES
26 Bioethics: Pluripotent Politics
28 Microbiology: Exploring the Microbiome
30 Politics: The Politics of FDA Approval
Contents
October 2012 / Vol. 85 / Issue No. 4
12 The Latest Superstar Couple: Closing in on Kepler-36
Meet the galaxy's hottest new superstar couple, Kepler-36b and Kepler-36c. Professor Sarbani Basu of Yale's Astronomy Department is part of a team that just discovered Kepler-36, the first planetary group composed of planets of strikingly different densities and compositions in such close proximity to each other.

20 ON THE COVER
Competing Visions: Will Science Swing Your Vote?
With the election one week away, which issues do you care about? This article covers the underlying science, current facts, and each candidate's stance toward the two science issues Yale undergraduates found most important: alternative energy and climate change.
31 Technology: 3D Organ Printing
32 Health: MythBusters: Taste Mapping
33 Astronomy: NASA Rover Curiosity Cruises Onto Mars
34 Education: The Story of Science at Yale, Part III: Science at Yale on the Horizon
14 Curing the "Bubble Boy"
Yale researchers on a multi-institutional team have demonstrated that gene therapy is a viable option for treatment of adenosine deaminase severe combined immunodeficiency (ADA-SCID). Professor Meffre of the Department of Immunobiology explains this disease and the options for its treatment.

18 This is History: Microprobe analyses decipher the identity of metallurgical artifacts
35 Book Review: The Vision Revolution
36 Zoology: Laughter Across the Animal Kingdom, From Rats to Humans
37 Alumni Profile: Jennifer Staple-Clark '03
39 Cartoon: A Different Political Campaign
16 Sizing Up Autism
Several studies over the past decade have linked head overgrowth to autism, and a recent study by Dr. Katarzyna Chawarska at the Yale Child Study Center shows a correlation between autism and overgrowth in general. This research aims to further our understanding of the biological underpinnings of autism and ultimately to advance the development of a treatment or cure for the disorder.
23 Carfilzomib: The Latest Triumph of Targeted Therapies Development
THEME
— Barack Obama:

"Only by ensuring that scientific data is never distorted or concealed to serve a political agenda, making scientific decisions based on facts, not ideology, and including the public in our decision making process will we harness the power of science to achieve our goals — to preserve our environment and protect our national security; to create the jobs of the future, and live longer, healthier lives."

"Climate change is one of the biggest issues of this generation, and we have to meet this challenge by driving smart policies that lead to greater growth in clean energy generation and result in a range of economic and social benefits."

— Mitt Romney:

"The federal government has a vital role to play in conducting sound science and making the resulting data available. I will ensure that the best available scientific and technical information guides decision-making in my Administration, and avoid the manipulation of science for political gain."

"A crucial component of my plan for a stronger middle class is to dramatically increase domestic energy production and partner closely with Canada and Mexico to achieve North American energy independence by 2020."
Though they declined a science debate, both Barack Obama and Mitt Romney answered 14 questions from Scientific American and ScienceDebate.org "regarding some of the biggest scientific and technological challenges facing the nation." Above are some quotes from their respective answers. To read their full responses and more about each candidate's stance on science issues, visit www.sciencedebate.org.
October 2012 Volume 85 No. 4
Editor-in-Chief
Publisher
Managing Editors
Articles Editors
News Editor
Features Editor
Copy Editors
Production Manager
Layout Editors
Arts Editor
Online Editor
Multimedia Editor
Advertising Manager
Distribution Manager
Subscriptions Manager
Outreach Chair
Special Events Coordinator
Staff
Daniel Arias
Andrew Deveau
Andrew Goldstein
Walter Hsiang
Bridget Kiely
Katie Leiby
Kaitlin McLean
Contributors
Shaunak Bakshi
Grace Cao
Katie Colford
Kirsten Dowling
Arash Fereydooni
Selin Isguvin
Sophie Janaskie
Yale Scientific Magazine
Established 1894
William Zhang
Elizabeth Asai
Jonathan Hwang
Robyn Shaffer
Nancy Huynh
Shirlee Wohl
Mansur Ghani
Renee Wu
Ike Lee
Jessica Hahne
Li Boynton
Jessica Schmerler
John Urwin
Jeremy Puthumana
Jonathan Liang
Chukwuma Onyebeke
Stella Cao
Naaman Mehta
Karthikeyan Ardhanareeswaran
Lara Boyle
Mary Labowsky
Theresa Oei
Terin Patel-Wilson
Rebecca Su
Nicole Tsai
Elisa Visher
Dennis Wang
Jason Young
Jennifer Ky
Jared Milford
Meredith Redick
Josephine Smit
Rebecca Su
Ike Swetlitz
Joyce Xi
Advisory Board
Sean Barrett, Chair
Physics
Priyamvada Natarajan
Astronomy
Kurt Zilm
Chemistry
Fred Volkmar
Child Study Center
Stanley Eisenstat
Computer Science
James Duncan
Diagnostic Radiology
Melinda Smith
Ecology & Evolutionary Biology
Peter Kindlmann
Electrical Engineering
Werner Wolf
Emeritus
John Wettlaufer
Geology & Geophysics
William Summers
History of Science & History of Medicine
Jeremiah Quinlan
Undergraduate Admissions
Carl Seefried
Yale Science & Engineering Association
FROM THE EDITOR
SCIENCE & POLITICS
Scientists and politicians have always had their disagreements. From the post-World War II atomic
weapon disputes to the current storm over stem cells, the views of researchers and policy makers have
clashed throughout the course of modern history. But perhaps the situation now is more urgent than ever.
As politics continues to polarize, citizens have only become further entrenched in their values, often
losing sight of the hard facts supported by quantitative data and evidence. Surgeons General have fallen victim to a trend of diminishing jurisdiction over public health measures, while politicians cave to expectations of keeping science from "interfering" with their values — a stark contrast to the past, when presidents such as John F. Kennedy had to assure the public that religion and other biases would not cloud their roles as leaders.
And this is not solely an issue of political preference: both Democrats and Republicans are guilty of
these acts, “massaging” statistics and vacillating until scientific facts are distorted, twisted, and politicized.
While it may be easy to peg a certain party as pro- or anti-science, many politicians will highlight evidence
that favors their point and downplay dissenting data depending on the issue. Whether it is a blatant dismissal
or a lack of understanding, science clearly warrants serious attention in political decisions. Some
hold steadfast to the notion that science should solely inform and not create policy, but it is clear that
such a belief has introduced gaping issues with interpretation — and scientists are the experts that can
bridge the gap between misguided politics and the scientific truth.
Today, many of our nation's major challenges hinge on science: global warming, abortion, stem cell research, resource usage, and energy independence, among many others. And as scientific advancements
naturally make science increasingly pervasive and relevant to everyday life, voters need to keep politicians
accountable for sound science policy. In light of the upcoming election, the Yale Scientific explores the
intersection of science and politics. While a considerable amount of current research is directed toward
developing medication and innovating therapeutics, the novel scientific applications cannot reach patients
without the necessary approval by the FDA. Dr. Joseph Ross, Assistant Professor of General Internal
Medicine at the Yale School of Medicine, is an expert on federal medical policy and has studied the efficiency of the FDA compared with its international counterparts. A testament to the rigor of this regulatory process is the reported success and acclaim of the recently approved, groundbreaking
targeted therapy for multiple myeloma developed in the laboratory of Craig Crews, the Lewis B. Cullman
Professor of Molecular, Cellular, and Developmental Biology at Yale. In addition, this issue features
science policy that Yale undergraduates ranked most relevant to them and covers the underlying science,
current facts, and each presidential candidate’s stance on these issues.
As election day fast approaches, it may be interesting to recognize that democratic elections essentially
boil down to a scientific experiment: we come together to decide which variables to tweak, based on past observations, to produce desired results. In fact, as several scholars have noted, the founding fathers
of our country often referred to our nascent nation as an experiment as they crafted the untested system
with a healthy splash of equality, liberty, and order. And although few believed that our nation would
persist, the results — like those of science experiments — were unexpected, as democracy emerged as
arguably the most stable and sustainable form of government to date. By casting our votes with science
in mind, we are not only emphasizing the importance of science in politics but perhaps also ultimately
honoring the very principles on which our nation was founded.
William Zhang
Editor-in-Chief
The Yale Scientific Magazine (YSM) is published four times a year by Yale Scientific Publications, Inc. Third class postage paid in New Haven, CT 06520. Non-profit postage permit number 01106 paid for May 19, 1927 under the act of August 1912. ISSN: 0091-287.
We reserve the right to edit any submissions, solicited or unsolicited, for publication. This magazine is published by Yale College students, and Yale University is not responsible for its contents. Perspectives expressed by authors do not necessarily reflect the opinions of YSM. We retain the right to reprint contributions, both text and graphics, in future issues, as well as a non-exclusive right to reproduce these in electronic form. The YSM welcomes comments and feedback. Letters to the editor should be under 200 words and should include the author's name and contact information. We reserve the right to edit letters before publication. Please send questions and comments to ysm@yale.edu.

About the Art
The cover, designed by guest artist David Yu, puts the 2012 presidential candidates into the shoes of scientists as they plan their campaigns. Votes, like experimental data, are the results of many complex processes and factors. It does not take a research scientist to realize the importance of scientific topics, not only in this campaign but for the future of our country.
In the April 2012 issue of the Yale Scientific, we inadvertently ran an image of the
ESET Android without giving proper attribution. ESET is a global provider of antivirus
software with 100 million users and offices around the world. While our arts team took
some creative liberties with the android image, we would like to give ESET full credit for
its artwork. ESET is an ardent supporter of education and academic research, and we
would like to express our appreciation to ESET for allowing us to correct the record.
CHEMISTRY
New Grant to Investigate Novel Approaches in Cancer Research
BY JAYANTH KRISHNAN

The National Cancer Institute (NCI) recently awarded two Yale Professors of Chemistry, Alanna Schepartz and Andy Phillips, a $2.5 million research grant to investigate unique ways to inhibit protein targets that are currently considered "undruggable" and are responsible for the malignancy of certain cancers. Such protein targets could include transcription factors, protein-protein interactions, and kinases. This highly competitive grant was won through the NCI's Provocative Questions program, which challenges scientists to work on crucial problems that are often neglected.

Schepartz and Phillips, along with chemist Dylan Taatjes of the University of Colorado, have selected a project that attempts to correct dysfunctional regulation of the crucial tumor suppressor p53. Commonly called the guardian of the genome, p53 is the transcription factor that prevents cells from becoming oncogenic. Schepartz and Phillips want to apply new techniques in chemical biology to develop novel molecules that can gain entry to the cell and properly regulate the expression of genes affiliated with p53 to prevent cells from becoming malignant, which has been and continues to be a major goal of this field.

Many scientists regard modern cancer research as highly empirical and dependent on trial and error. Scientists often combine various drugs and chemotherapeutic agents at several different molar concentrations to test whether their concoction induces apoptosis, or programmed cell death, in malignant cells. The research Schepartz and Phillips plan to carry out is a hypothesis-driven approach based on the belief that the best cure for cancer is prevention.

Andy Phillips (left) and Alanna Schepartz (right) have been awarded $2.5 million from the National Cancer Institute. Courtesy of Dr. Phillips and Dr. Schepartz.

NEUROSCIENCE
Smirnoff "Mice": Protein Expression Linked to Alcoholism Propensity in Mus musculus
BY JASON YOUNG

Yale Professor of Psychiatry Jane Taylor and her graduate student Jacqueline Barker have identified some of the first innate biochemical differences underlying alcoholism-related behavior in the brains of mice.

Because researchers have found it difficult to distinguish between the innate biological or neurological factors for alcohol abuse propensity and the changes in the brain caused by heavy alcohol consumption, the underlying neurological causes of alcoholism-related behavior remain largely unknown. However, recent studies have linked addictive behavior and alcohol consumption to changes in neural plasticity, the adaptability of the brain in response to neuronal activity.

Barker performed a series of tests to first identify and isolate mice that demonstrated vulnerability to addictive behaviors before exposure to alcohol. She and Professor Taylor hypothesized that differences in the prevalence of molecules relating to neural plasticity in a part of the brain known as the ventromedial prefrontal cortex (vmPFC) would cause differences in vulnerability to alcohol-addiction-related behavior. The study found that the modified neural protein PSA-NCAM, which is involved in learning and memory, is causally linked to the extinction of alcohol-seeking behavior. This protein was expressed at lower levels in the vmPFCs of mice at risk for alcoholism-related behavior than in mice that were not.

Further studies examining sex differences and targets regulating gene expression are currently underway. Taylor hopes that in the future, the findings can be applied to humans to help identify alcohol addiction and related disorders. "This work will allow us to find genes and molecules that might put people at risk," Taylor says.

This research was supported by the NIAAA.

Differences in the brains of mice have been linked to alcoholism vulnerability. Courtesy of Wikimedia Commons.
WOMEN IN SCIENCE
Professor Joan Steitz Honored As Role Model For Women Scientists
BY ANNY DOW

Professor Joan Steitz, Sterling Professor of Molecular Biophysics and Biochemistry, recently received two major awards for her achievements as a woman scientist.

Steitz received the Pearl Meister Greengard Prize, an award given by Rockefeller University that recognizes outstanding achievements in science and aims to combat discrimination against women in the field. She also received the 2012 Vanderbilt Prize in Biomedical Science for her "stellar record of research accomplishments" and pioneering work that has helped develop the current understanding of RNA and its role in health and disease. As part of the award, Steitz will visit the Vanderbilt University School of Medicine to deliver a talk as part of the Flexner Discovery Lecture Series, which features guest lectures from "the world's most eminent scientists."

Despite these recent accolades, among many others, Steitz was not always certain about her career path. When she was a college student in the 1960s, there were very few women in science. Inspired by the work she did under cell biologist Joseph Gall, she pursued studies at Harvard University and worked with influential scientists, such as Nobel Prize winner James D. Watson, against the odds. During her career, she has conducted research on how RNA operates in both bacteria and vertebrate cells and is now considered one of the leading scientists in her field.

"The most rewarding part of [being a scientist] is working in the lab," Steitz said. "You discover and figure out things that nobody ever knew because of the experiments that you've done. I think that's the biggest thrill for all scientists."

Steitz is passionate about her work and is currently researching different types of noncoding RNA-protein complexes, providing critical insights into viruses and disease states.

Professor Joan Steitz's lecture. Courtesy of Professor Steitz.

BIOLOGY
Professor Horwich Receives Prestigious Shaw Prize
BY SMITA SHUKLA

Dr. Arthur L. Horwich, Professor of Genetics and Pediatrics at the Yale School of Medicine, was recently awarded the Shaw Prize for elucidating the role of chaperonins in protein folding. The prize was shared with collaborator Ulrich Hartl of the Max Planck Institute in Martinsried, Germany.

The one-million-dollar Shaw Prize, awarded in Hong Kong, distinguishes scientific breakthroughs throughout the world that enhance societal progress. Three prizes are given annually, one each in the categories of Astronomy, Life Science and Medicine, and Mathematical Sciences. The Life Science and Medicine award was presented to Horwich in September for his research revealing important mechanisms of protein folding. His research found that the protective environment created for proteins by chaperonins allows for efficient and undisturbed formation.

After first publishing this research in 1989, Horwich became a Howard Hughes Medical Institute Investigator. Over the past twenty years, Horwich has worked with biochemical reconstitution, X-ray crystallography, and genetic manipulation techniques to build upon his initial findings. This early discovery holds great importance for research today on complications associated with protein malformation.

Currently, Horwich's lab is studying neurodegenerative diseases that are associated with protein malformation. Indeed, Horwich posed a question that demonstrates the new goal of his laboratory: "If molecular chaperones [are] so good at helping our proteins fold, then why do we have neurodegenerative diseases?" Horwich hopes that studying protein folding in neurons will help develop the understanding of medical issues such as amyotrophic lateral sclerosis, a disease involving protein malformation in the neurons that control voluntary muscle movement.

Horwich receiving the Shaw Prize from Hong Kong's Chief Executive C.Y. Leung. Courtesy of Dr. Horwich.
DATA ANALYSIS
Yale Professor Clears Olympic Swimming Controversy
BY BRENDAN SHI

Almost immediately after Nature released its controversial piece on Chinese Olympic swimmer Ye Shiwen, titled "Why great Olympic feats raise suspicions," the response letters began to flood in, including one written by Weimin Zhong, Yale Associate Professor of Molecular, Cellular, and Developmental Biology.

In the Nature article, Ewen Callaway contributed to the ongoing debate about the integrity of Ye's world-record-breaking performance in the 2012 London Olympics by arguing that the 16-year-old's "anomalous" race, while not proof of foul play, certainly raised concerns. He claimed that not only did Ye beat her personal best in the 400 IM by more than seven seconds, but she also out-swam American gold medalist Ryan Lochte in the last 50 meters of their respective 400 IM races. Within three days of publication, Callaway's article had drawn what Nature editors called an "extraordinary level of outraged response."

Courtesy of Nature.

In China, scientists and other readers of Nature, angered by the piece, contacted the Chinese Biological Investigators Society (CBIS), a group of scientists who collaborate on scientific research, projects, and other miscellaneous issues.

Zhong, Vice-President of the CBIS, was aware of the controversy surrounding Ye's swim but had not given it much attention until the group was contacted by Chinese scientists. "This is the first case in which we got involved in something that's not scientific," Zhong said, "but analyzing data is scientific, and the problem is that they selectively took out data. That's a no-no in science."

Ye Shiwen celebrates after winning the women's 400 IM at the London 2012 Olympics. Courtesy of The Guardian.

Zhong worked with his colleagues Hao Wu and Linheng Li, professors from Harvard and the Stowers Institute for Medical Research, respectively, to expose this scientific fallacy. In their response letter, which Nature would later publish as "Some facts about Ye Shiwen's swim," they demonstrated that Ye's previous personal best was actually only five seconds slower than her world record time, compared to the seven-second difference that Nature reported.

These researchers then compared Ye's improvement with the improvements of other young swimmers, finding that, historically, many other prominent teenage swimmers had improved their times by five or more seconds, including Ian Thorpe, Adrian Moorhouse, and Elizabeth Beisel.

Finally, they challenged Callaway's comparison of Ye's and Lochte's times in the last 50 meters of their respective races, questioning Callaway's decision to omit the fact that Lochte finished his 400 IM an entire 23 seconds faster than Ye Shiwen, or that Lochte was not even close to being the fastest man in the last 50 meters. Zhong recognized that it would be easy to label Nature's article as biased. "Oftentimes, it's not very easy to tell the difference between incompetence and bias. I don't think there's any ill-will — it looks to me like the Nature reporter was just lazy," he said.

It was that laziness and Nature's failure to adhere to higher standards of factual reporting that drew him into the controversy in the first place. "If we aren't sure about things, we tend not to write anything about it," he said, "but in this case, the facts are very easy to find, and once you look at the facts there's absolutely no basis for the accusation."
NEUROBIOLOGY
Natural Birth Induces Protein Linked to Brain Development
BY ALYSSA PICARD

A team of researchers led by Tamas Horvath, Professor of Comparative Medicine, Neurobiology, and Obstetrics and Gynecology at the Yale School of Medicine, has found that natural birth, not delivery by Caesarean section (C-section), stimulates production of a protein important for brain development.

The discovery was a chance finding for the researchers, who were initially investigating the role that mitochondrial uncoupling protein 2 — or UCP2 — plays in the brain. It is believed that the main role of the protein is to enable cells to metabolize fat, a key process for the adaptability of cells. The researchers were studying this protein in postnatal mice and looked for differences in expression between in utero and ex utero development as a control. "What we noticed was that if the animal was naturally born, the expression of the UCP2 was significantly higher than if the pup was taken from C-section," Horvath explains, "and that's when the whole thing started to be interesting for us."

Method of birth may play a role in how one's brain develops into adulthood. Courtesy of CNN.

Once the team established the connection between higher levels of UCP2 and natural birth, they turned their attention to the role that UCP2 plays in brain development. By interfering with UCP2 expression using chemical methods and genetic suppression, the team discovered that lower UCP2 expression resulted in less connectivity between the neurons in the hippocampus — the part of the brain responsible for developing memory and aiding the learning process. Additionally, mice with lower levels of UCP2 performed differently in simple tasks associated with the hippocampus region: they exhibited less spatial awareness, moved more slowly, and showed greater anxiety about moving into open areas than did the mice with normal levels of UCP2.

Lower levels of UCP2 lowered connectivity between hippocampal neurons (mouse hippocampal neurons seen here). Courtesy of the Institute for Neurological Discoveries, Kansas University.

The discovery comes at a time when many women, especially those in the United States, are electing to have C-sections out of convenience rather than medical necessity. Though the World Health Organization recommends a C-section rate below 10 to 15 percent, the rate in the U.S. has climbed from 4.5 percent in 1965 to 32 percent in 2007. While Horvath does not think this discovery will halt the trend, he is hopeful that it will lead to a greater understanding of the procedure. "If there is a biological process that makes a difference between C-sections and natural birth, it will at least be good to know and see how [one] can take that into consideration," he says. Horvath also points to the possibility that future studies could reveal the threshold of medical need for women to have a C-section.

C-section rates have been climbing in the US in recent years. Courtesy of the CDC.

In future studies, Horvath plans to investigate the triggers of UCP2 induction to determine why it occurs with natural birth, which could lead to the development of methods to mimic it through surgical means. He also plans to expand on the original finding with additional studies on surgically and naturally born mice and eventually dogs. This may include following their development and behavior throughout their lives, but the long-term goal remains to conduct studies in the human population.
NEUROSCIENCE
On The Road to Attenuating Neuropathic Pain
Neuropathic pain is a chronic
condition, often resulting from
prior conditions such as diabetes,
shingles, and traumatic injury due
to hyper excitability of certain
voltage-gated sodium channels. The
condition affects approximately
18 percent of the population and imposes a considerable cost burden on the United States economy. The symptoms can
be harrowing, from a shooting
and burning pain to tingling and
numbness. While treatments do
exist, they are accompanied by side
effects. For example, non-selective
sodium channel blockers are used
to treat chronic pain, but they can
affect several voltage-gated sodium
channels, thus causing severe unintended side effects. Omar Samad, Associate Research Scientist,
and Stephen Waxman, Professor of Neurology, Neurobiology, and
Pharmacology and Director of the Center for Neuroscience &
Regeneration Research at Yale University, have managed to attenuate
neuropathic pain caused by traumatic nerve injury in rats by
targeting the Nav1.3 sodium channel. This novel gene knockdown technique shows promise as an effective and specific treatment.
Samad and colleagues chose to focus on the Nav1.3 sodium channel
because their earlier studies indicated that its upregulation in
nerve-injured rats is coupled with neuropathic pain. To treat pain at
the source, the team utilized a peripheral Nav1.3 gene knockdown
technique to gauge the therapeutic efficacy of targeting this channel.
Using computational algorithms, Samad searched for a molecule that
BY MAHBUBA TUSTY
Images of the dorsal root ganglion in the peripheral nervous system. (Left) Green marks the neurons treated with the Nav1.3 gene knockdown technique. (Middle) Red marks all of the neurons in the image. (Right) In the merged image, red marks the neurons that were not affected by the gene knockdown technique, while yellow/orange marks the treated neurons. Courtesy of Dr. Omar Samad.
The graph demonstrates that Nav1.3 gene therapy alleviates pain in nerve-injured rats compared with untreated injured rats. Courtesy of Dr. Omar Samad.
would block Nav1.3 expression, and then experimentally identified
the two most promising “hits”. In order to simulate the conditions
that often cause neuropathic pain, the experimenters used rats that
had injured nerves. The two most promising molecules were then
delivered directly into the dorsal root ganglion of these experimental
rats, and then the rats were assessed for pain behavior.
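The article does not describe the team's actual screening algorithm, so the sketch below is purely illustrative of one simple in-silico approach: ranking candidate knockdown sequences by how well their reverse complement pairs with a stretch of target mRNA. The sequences and the scoring metric are invented for illustration.

```python
# Purely illustrative in-silico screen: score hypothetical knockdown
# candidates by how well their reverse complement matches any window of
# a target mRNA stretch. Sequences and metric are invented; the article
# does not specify the algorithm Samad's team used.

COMPLEMENT = {"A": "T", "T": "A", "G": "C", "C": "G"}

def reverse_complement(seq):
    """Reverse-complement a DNA sequence."""
    return "".join(COMPLEMENT[b] for b in reversed(seq))

def best_match_score(candidate, target_mrna):
    """Best count of matching bases of the candidate's reverse
    complement against any window of the target (toy metric)."""
    probe = reverse_complement(candidate)
    n = len(probe)
    return max(
        sum(p == t for p, t in zip(probe, target_mrna[i:i + n]))
        for i in range(len(target_mrna) - n + 1)
    )

target = "ATGGCTAGCTTGACCGATCGTAGCTAGG"        # hypothetical mRNA stretch
candidates = ["CGGTCAAGCTAG", "TTTTTTTTTTTT"]   # hypothetical candidates
ranked = sorted(candidates, key=lambda c: -best_match_score(c, target))
print(ranked[0])  # the better-complementing candidate
```

In a real screen the scoring would also account for secondary structure, off-target hits, and delivery chemistry; the point here is only that candidates are ranked computationally before the most promising "hits" go to the bench.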
This experimental approach yielded very promising results: the lab's data showed that these specific molecules successfully attenuated the neuropathic pain caused by nerve injury in rats. This
finding confirmed the Nav1.3 sodium channel's
role in neuropathic pain. Importantly, the method was found to
be specific to the Nav1.3 sodium channel, leaving other channels
unaffected and thus avoiding potential side effects.
Samad and colleagues have therefore shown that knocking down
Nav1.3 has therapeutic benefits in their animal
model. With the results of this research, pharmaceutical
companies and research groups
can now direct their attention to working with
the Nav1.3 sodium channel in humans. With
this approach, potential gene therapies can be
developed for treating chronic pain. By focusing
on this one sodium channel, medications
for neuropathic pain will have the potential to
be more effective and specific, with fewer side effects and better patient compliance.
This research was conducted in Dr. Stephen
Waxman’s laboratory. It was supported by the
Department of Veteran Affairs and the Nancy
Taylor Foundation for Chronic Diseases.
Other authors involved in the study were
Andrew M. Tan, Xiaoyang Cheng, Edmund
Foster, and Sulayman D. Dib-Hajj.
BY KRISTEN DOWLING
ECOLOGY
Stress and Ecosystems:
Role of Predation Reconsidered in the Hunt for Stable Ecology
Ecosystems are losing predators faster than organisms in any
other trophic level, and this may be of even more concern for the
stability of nutrient cycles than previously thought. Recent research
at the Yale School of Forestry and Environmental Studies from
former postdoctoral associate Dr. Dror Hawlena and his colleagues
demonstrates how predators considerably influence belowground
community functions by instilling fear in their prey.
Stressed grasshoppers reared in the presence of spider predators
show elevated metabolisms and consequently have diets high in
energy-rich carbohydrates. In addition, stress hormones increase the
conversion of proteins to glucose, a process that excretes nitrogen.
These two reactions to stress increase the carbon-to-nitrogen ratio
in grasshoppers’ bodies.
Hawlena’s experiments tested the effects of this body chemistry
variation on the decomposition of plant litter. He reared grasshoppers
in the field with and without the risk of spider predation. “You
can manipulate the risk of predation without actually killing the
prey by gluing the mouthparts of spiders,” Hawlena explained. The
grasshopper carcasses were then allowed to decompose in laboratory
microcosms of their grassland habitat.
Plant litter was added to each of the microcosms, and the
researchers measured rates of carbon mineralization, the release
of carbon dioxide during microbial decomposition. Notably, the
mineralization of grass litter in soil where the stressed grasshoppers decomposed was 62 percent lower than in soil where non-stressed grasshoppers decomposed. These results suggest a causative link
between the changes in prey body chemistry and altered functioning
Graphs A & B above show the cumulative carbon mineralization of the decomposition
of the grasshoppers and plant litter, respectively. Steps 1 and 2 show the
decomposition of grasshoppers and plant litter and the corresponding respiration
of carbon dioxide. Courtesy of Dr. Dror Hawlena.
The grasshopper herbivore Melanoplus femurrubrum. Courtesy
of Dr. Dror Hawlena.
of the soil microbial community. Hawlena hypothesizes that this
is due to nitrogen’s role in priming microbes to break down plant
litter. In the presence of lower levels of nitrogen, plant decomposition
decreases.
Conducted over a three-year period, variations on the experiment
included measuring decomposition rates in the field rather than in
the laboratory microcosm, and using artificial grasshopper carcasses
whose C:N ratios spanned greater variation than those observed
in nature. Collectively, the experiments’ results
suggest that predation exerts top-down, cascading
effects on plant decomposition in both
laboratory and field settings through fear-induced changes in prey body chemistry.
Hawlena stresses the importance of these
findings. “We show that if you add a tiny
amount of animal you can change the way the
microbial community is processing resources.
Therefore even a small amount of animal detritus
can be extremely influential in regulating the
recycling rates of different nutrients.”
This research has major repercussions for
how scientists think about ecosystems. The
interconnections of every trophic level have
been underestimated, especially in terms of
top-down control and nutrient cycles. The
biological, chemical, and physical stability of
ecosystems is truly reliant on each member’s
contributions. The predators, whose role in
the ecosystem is more important than previously
expected, are disappearing at alarming
rates. Hawlena warns, “These results may be
an important hint that we need to take seriously
what we do to the predators.”
The Latest Superstar Couple:
Closing in on Kepler-36
BY LI BOYNTON
Colored pixels represent relative flux increasing from
blue to green, with each row symbolizing an individual
transit light curve. The Kepler-36 planets' curved bands
convey fluctuating periods, a product of their gravitational
pulls on each other. Courtesy of Science.
Imagine looking up at the night sky and
seeing a gas giant three times the size of our own moon. This is how
Kepler-36c, a newly discovered gaseous planet,
appears from the surface of Kepler-36b, its
rocky neighbor in the Kepler-36 planetary
system. The Kepler-36 system is the first planetary group composed of planets with such strikingly contrasting densities and compositions.
“Are we alone in the universe?” This question
first captured the imaginations of scientists,
theologians, and philosophers hundreds
of years ago, and fuels the current exploration
of planets outside our solar system. As
of now, a total of 838 confirmed extra-solar
planets have been discovered. But Kepler-36
mystified astrophysicists who wondered how
two vastly different worlds ended up in such
close orbit. As Dr. Sarbani Basu of the Department
of Astronomy at Yale explained, “The
pivotal moment in our project was when we
figured out that these two very
different planets are orbiting
around the same star. That’s
when we started to look at the
system as a whole, rather than
as just two more extra-solar
planets.” Extra-solar planets
are nothing new to astronomers
like Basu, but strikingly
different planets in close proximity
presented scientists with
a yet-undiscovered type of
planetary system.
“One of These Things is Not Like
the Other”
Modern astronomy hypothesizes
that large gas-giant
planets cannot form close to
their host stars because stellar
wind would blow away most of the surrounding
gas in the disc, causing the planet to lose
mass quickly. Therefore, this discovery of a
massive Jupiter-like extra-solar planet, or exoplanet,
near a main-sequence star has opened
new horizons to studying planetary formation.
When researchers studying this system first
landed on the data relayed back from NASA’s
Kepler spacecraft, they were no longer surprised
to see the gas giant Kepler-36c close to
its parent star. They were surprised, however,
to see planets as different as Earth-like Kepler-
36b and gas-giant-like Kepler-36c coexisting
within such close orbital planes. The discovery
of diverse planets separated by a mere 0.013
AU revolutionized astronomical thinking on
how planetary systems form and evolve.
In our solar system, there is a clear differentiation
between rocky and gaseous planets;
the former are confined to the inner part of
our system, and the latter to the outer parts. This trend is in accord with condensation
theory, which hypothesizes that interstellar
dust is an essential ingredient in the formation
of planets. Areas closer to the sun are at higher
temperatures and are home to the hotter, rocky
planets. Farther away from the sun, colder temperatures cause gases to move more slowly, making them more susceptible to gravitational accretion. As a result,
interstellar dust grains come together to form
the foundation of gas giants.
The detection of giant planets close to other
ASTRONOMY
stars proves that this pattern is not universal;
planetary orbits can indeed change significantly
after their formation. Basu’s international team
of approximately 40 scientists from five different
countries set out to uncover the intricacies
of this anomaly: the Kepler-36 system and its
two starkly contrasting planetary members.
Kepler-36b, nicknamed a “Super-Earth,” is
rocky like our home planet but is a staggering
4.5 times more massive with a radius 1.5 times
greater than that of Earth. Kepler-36c is a gaseous
planet 8.1 times more massive than Earth
with a radius 3.7 times greater. Kepler-36b and
Kepler-36c are 20 times more closely spaced
and have a larger density difference than any
adjacent pair of planets in our solar system.
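The mass and radius figures above imply starkly different bulk densities. A back-of-the-envelope check, offered here as an illustration rather than the published analysis:

```python
# Bulk density scales as mass / radius^3, so in Earth-relative units
# rho = rho_earth * (M / M_earth) / (R / R_earth)**3. Numbers below are
# the mass and radius multiples quoted in the article.

EARTH_DENSITY = 5.51  # g/cm^3, Earth's mean density

def bulk_density(mass_earths, radius_earths):
    """Bulk density in g/cm^3 for a planet given in Earth units."""
    return EARTH_DENSITY * mass_earths / radius_earths ** 3

rho_b = bulk_density(4.5, 1.5)  # Kepler-36b, the rocky super-Earth
rho_c = bulk_density(8.1, 3.7)  # Kepler-36c, the gaseous neighbor

print(f"Kepler-36b: {rho_b:.1f} g/cm^3")  # ~7.3, rock-and-iron territory
print(f"Kepler-36c: {rho_c:.2f} g/cm^3")  # ~0.88, less dense than water
```

A factor of roughly eight in density between neighboring planets is what makes the pair so anomalous; in our solar system, adjacent planets never differ by nearly that much.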
Joining Forces: From Planetary Closeness to Scientific
Collaboration
The planetary subgroup of the team,
led by Josh Carter, a Hubble fellow at the
Harvard Smithsonian Center for Astrophysics,
discovered the larger planet and its host
star during a first quick look at data from
the Kepler spacecraft, a space observatory
launched by NASA to discover Earth-like
planets orbiting other stars. They noticed the larger planet Kepler-36c as it transited
in front of the host star, blocking some of
its light to give a characteristic dip or transit
signal. However, while Kepler planetary data
can tell you how big a planet is relative to its star, it cannot determine the planet's absolute size until the size of the star is
determined. This is where the stellar physics
subgroup that Basu is involved with came in.
Using asteroseismology, the study of stars by
observing their natural solar-like oscillations
that occur as a result of sound wave excitation
by turbulence in the star, they determined
the properties of the system’s parent star. By
measuring various oscillations, the team was
able to calculate the size, mass, and age of
the host star to exquisite precision. Once the
group gave the stellar data to the planetary
team, Carter and his colleagues were able to
further analyze Kepler-36c to discover its size
and composition.
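The division of labor described above can be sketched numerically: a transit light curve yields only the planet-to-star radius ratio (the fractional dip in starlight is roughly that ratio squared), so the asteroseismic stellar radius is what converts the ratio into a physical planet size. A minimal sketch with made-up numbers, not the published Kepler-36 values:

```python
import math

# Transit depth ~ (R_planet / R_star)^2, so the light curve alone fixes
# only the radius ratio; the stellar radius sets the absolute scale.
# All numbers below are illustrative, not the measured Kepler-36 values.

R_SUN_IN_R_EARTH = 109.2  # approximate solar radius in Earth radii

def planet_radius(transit_depth, stellar_radius_rsun):
    """Planet radius in Earth radii, from a fractional transit depth
    and a stellar radius in solar radii."""
    return stellar_radius_rsun * R_SUN_IN_R_EARTH * math.sqrt(transit_depth)

# The same 0.1% dip implies very different planets around different stars:
print(planet_radius(1e-3, 1.0))  # ~3.45 Earth radii around a Sun-sized star
print(planet_radius(1e-3, 1.6))  # ~5.5 Earth radii around a larger star
```

This is why the asteroseismology subgroup's precise stellar radius had to come first: without it, the planetary team could not turn Kepler's radius ratios into sizes and compositions.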
The Power of the Human Eye
At first, the Kepler-36 team did not realize
Kepler-36c had company, let alone close
company. Kepler-36b is so small that it leaves little signature in the amount of light it blocks from Kepler-36a. Consequently, it was rejected by the automatic
code of the
Kepler data analysis
center, which makes
the usual assumption
of periodic orbits
and cannot detect
imprecise periods
of relatively smaller
planets such as
Kepler-36b’s. However,
Kepler-36b is so remarkably close to its massive neighbor that the larger planet perturbs the gravitational field felt by the smaller one, changing the strictly periodic nature of its transits.
This is known as
transit timing variation. Because this period
was not well determined, Carter’s team had
to collect data by hand instead of using the
usual programs. They used an algorithm
known as “quasi-periodic pulse detection” to
methodically check planetary systems already
in the Kepler data, and in this way stumbled
upon Kepler-36b. Basu emphasized that this
discovery would never have happened without
persistent manual follow-up to the team’s
practiced intuitions. “The major implication in
my mind is to not believe in automated pipelines.
Nothing can substitute for the human
eye. If Josh hadn’t looked at this system by
eye, we wouldn’t have known that there was
this second rocky planet sitting there.”
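The failure mode Basu describes can be seen in a toy flux series: dips whose spacing drifts defeat a search that assumes one fixed period. This cartoon only illustrates the idea behind a quasi-periodic search; it is not the team's actual "quasi-periodic pulse detection" algorithm.

```python
# Toy transit series: four equally deep 1% dips whose spacing drifts,
# as transit timing variations would cause. A pipeline that folds the
# data at a single fixed period smears such dips out, which is how a
# small TTV-affected planet can slip past automated detection.

def find_dips(flux, threshold):
    """Return indices where flux drops below threshold (toy detector)."""
    return [i for i, f in enumerate(flux) if f < threshold]

def spacings(indices):
    """Gaps between consecutive dips; constant only if strictly periodic."""
    return [b - a for a, b in zip(indices, indices[1:])]

flux = [1.0] * 100
for center in (10, 31, 50, 72):  # drifting, not strictly periodic
    flux[center] = 0.99          # shallow 1% dips

dips = find_dips(flux, 0.995)
print(dips)            # [10, 31, 50, 72]
print(spacings(dips))  # [21, 19, 22] -- the period varies transit to transit
```

A threshold search that treats each dip individually, as above, still finds the events; it is the assumption of a single rigid period, not the dip depth alone, that hid Kepler-36b from the pipeline.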
The curves above represent theoretical models for planets of
a given composition, with dotted curves modeling terrestrial
bodies and dashed curves modeling Earth-like solid cores surrounded
by hydrogen or helium envelopes. Courtesy of Science.
The Future of Kepler-36
The discovery of the close two-planet
system has shed light on extreme violations
of traditional orbit-composition patterns.
The international team studying this system
galvanized the astronomy community with
greater interest in understanding how planets
with such different compositions can fall into
such astonishingly close orbits. Kepler-36b and
Kepler-36c have managed to achieve orbital
stability at fascinatingly close range. The team
has announced its determination to continue
analyzing more Kepler data to locate similar
planetary systems in the hopes of unearthing
similar close encounters.
About the Author
Li Boynton is a junior in Calhoun College double majoring in Molecular, Cellular
and Developmental Biology and East Asian Studies. She is the Production Manager
for the Yale Scientific, and works in Dr. Anjelica Gonzalez’s lab using bioengineered
transmigration models to study immunological responses to inflammatory signals.
Acknowledgements
The author would like to thank Professor Sarbani Basu for sharing her knowledge
on astronomy and the Kepler-36 system.
Further Reading
• Carter, Joshua A., et al. “Kepler-36: A Pair of Planets with Neighboring Orbits
and Dissimilar Densities.” Science 337 (2012): 556-59. AAAS. Web. 25 Sept.
Curing the “Bubble
Boy”
BY ELISA VISHER
Severe combined immunodeficiency
(SCID), also known as the “bubble boy
disease,” is a rare genetic disease that
affects people around the world. Characterized
by gross deficiencies in the immune system,
the disease earned its colloquial name because a child with SCID in the 1970s and 80s famously lived in a plastic bubble to protect himself from opportunistic infections. He was the longest-living child with SCID at the time, when the
plastic bubble was the only treatment option
available. While SCID used to be a fatal disease
with no real treatments, it can now be managed
by enzyme replacement and bone marrow
transplants. Promising research has recently
shown that gene therapy may soon replace
these traditional therapies.
The Immune System
David Vetter, the “Bubble Boy.” Courtesy
of Boston GRIO.
SCID generally affects the patient’s B and T
cells. B and T cells are the main components
of the adaptive immune system that works in
conjunction with the innate immune system to
protect the body from foreign pathogens. The
innate immune system recognizes and attacks invading cells; it may destroy some pathogens but often cannot fight off the infection alone. The adaptive system then
helps the innate by recognizing pathogens and
responding specifically to each one. In addition,
the adaptive immune system remembers each
pathogen it has encountered and can launch a
faster, more effective response against the same
pathogen in the future.
What is ADA-SCID?
One of the more common forms of SCID
is ADA-SCID, so named because the mutation
affects the formation of the enzyme adenosine
deaminase (ADA). Adenosine deaminase
breaks down byproducts of DNA formation
in the cell. When the enzyme is not working, as
in ADA-SCID patients, these byproducts accumulate
in the cell. This accumulation creates a
toxic environment that kills emerging immune
system cells before they can mature. With
almost no B or T cells, people with ADA-SCID
essentially have no adaptive immune system.
Treating ADA-SCID: Traditional Methods
Traditionally, ADA-SCID has been managed
with enzyme replacement therapy. Doctors
inject the ADA-SCID patients with the missing
adenosine deaminase, but this treatment can be
problematic. As Associate Professor of Immunobiology
Eric Meffre says, “You can detoxify
the cell serum, but it is not as efficient as healthy
systems because the enzyme that you give intravenously
does not get into the cells themselves.
Although it improves the patients’ conditions, it
still does not reconstitute the immune system.”
This treatment is not considered a cure, and
other options are usually explored as long-term
options for ADA-SCID patients.
One accepted cure for ADA-SCID is a bone
marrow transplant. If the transplant is accepted
by the patient’s immune system, no other treatments
are needed. However, donors must be biological matches with recipients, and such matches are usually limited to family members, so marrow transplants are often unavailable.
Treating ADA-SCID: New Options
Recent research has shown that gene therapy
may be a viable alternative to the traditional
transplant. Currently, it is available at a select
number of institutions including the National
Institutes of Health (NIH). So far, 20-30
individuals have been treated and all have
survived. A multi-institutional research team
led by Meffre is working to better understand
the implications of this treatment. To perform
gene therapy for ADA-SCID, doctors
first isolate stem cells from the patients. They
then transform these cells into functional
bone marrow cells by infecting the stem cells
GENETICS
with a retrovirus whose genome contains a
functional version of ADA. These cells then
have normal metabolic function and can be
re-injected into the patient, who will then
express normal ADA function and no longer
be immunodeficient. So far, ADA gene therapy
has been the only successful gene therapy trial.
Some patients have been followed for more
than ten years, and no significantly problematic
side effects have been observed. As Meffre
says, “The patients are alive and have normal
lives. The kids go to school.” Though not
life-threatening, autoimmune reactions have
sometimes been observed as a side effect from
ADA gene therapy.
Understanding the Side Effects
Because the autoimmune side effects are not
fully understood, Meffre and his team examined
this phenomenon in greater detail. Meffre
says, “What we think is happening is that you
get a system that is just partially fixed. When
you do gene therapy, not all the stem cells in
gene therapy may be corrected. This means
that, at the end, you may still have a patient
that is a mosaic.” Meffre thinks that the toxic
accumulation of metabolites signals a receptor,
which then blocks receptors on autoreactive B
cells. When doctors artificially give adenosine
deaminase, B and T cells develop and proliferate,
but nearly all the B cells that exit the bone
marrow are autoreactive. This problem affects
patients treated with enzyme injections as well
as gene therapy patients.
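The polyreactivity measurement behind these findings is quantified in the accompanying figure by ELISA: a B-cell clone counts as polyreactive when its antibody reacts above a cutoff with more than one unrelated antigen. A toy version of that call, with an invented cutoff and invented OD readings:

```python
# Toy polyreactivity call: a clone is flagged polyreactive if its ELISA
# signal (OD405) exceeds the positivity cutoff against two or more
# unrelated antigens. The cutoff and readings here are invented for
# illustration, not taken from the study.

CUTOFF_OD405 = 0.5  # hypothetical positivity threshold

def is_polyreactive(od_by_antigen, cutoff=CUTOFF_OD405):
    """True if the clone reacts above cutoff with 2+ antigens."""
    hits = sum(od > cutoff for od in od_by_antigen.values())
    return hits >= 2

clone_a = {"dsDNA": 1.2, "insulin": 0.9, "LPS": 0.1}  # reacts with two
clone_b = {"dsDNA": 0.2, "insulin": 0.1, "LPS": 0.7}  # reacts with one

print(is_polyreactive(clone_a))  # True
print(is_polyreactive(clone_b))  # False
```

Counting the fraction of such clones among new emigrant B cells, before and after gene therapy, is what lets the team quantify how completely central B-cell tolerance has been restored.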
Finding Children with SCID
Autoimmune functions of B cells in three ADA-SCID patients after gene therapy versus that of a control (HD). Courtesy of Eric Meffre, Yale Department of Immunobiology.
Due to these potential side effects, gene
therapy is only considered for life-threatening
diseases that affect children early in life. Most
SCID patients are diagnosed at birth or shortly
after because of noticeable immune problems.
Some states mandate screening for SCID diseases
at birth, including Connecticut, where
Yale-New Haven Hospital serves as the center
for SCID screening. Early diagnosis also gives
doctors a longer window to look for a bone
marrow donor; there are no side effects from
bone marrow transplant if it occurs before the
child’s first birthday. Unfortunately, high costs
have prevented SCID blood screenings from
achieving universal adoption.
ADA gene therapy corrects central B-cell tolerance in ADA-SCID patients. (A) Antibodies expressed by new emigrant/transitional
(CD19 + CD10 + IgM hi CD27 - ) B cells from ADA-SCID patients were tested by ELISA for polyreactivity (tested against dsDNA,
insulin, and LPS). Dotted lines show ED38-positive control. Horizontal lines show cutoff OD405 for positive reactivity. For each
individual, the frequency of polyreactive and nonpolyreactive clones is summarized in pie charts, with the number of antibodies
tested indicated in the centers. (B) Decreased frequencies of polyreactive new emigrant/transitional B cells in ADA-SCID
patients after gene therapy although they remain significantly elevated compared to those in healthy donors; *p=0.05-0.005. (C)
ADA gene therapy completely restores the central removal of ANA-expressing developing B cells. The frequencies of antinuclear
new emigrant/transitional B cells in ADA-deficient patients after gene therapy are significantly decreased compared to
preGT and are comparable to those in healthy controls; *p=0.05-0.005; n.s.= not significant. In panels (B) and (C) each diamond
represents an individual, and the average is shown with a bar. The dashed horizontal lines correspond to the average frequency in HD of polyreactive and ANA-expressing B cells, respectively. (D) Antibodies expressed from ADA-deficient new emigrant/transitional B cells show various patterns of cytoplasmic staining and are devoid of nuclear reactivity. Two examples of anti-cytoplasmic new emigrant B cells are shown for each patient.
The Future of Gene Therapy
While ADA gene therapy has been initially
successful, Meffre says that expanding the
treatment to other diseases is not viable
until gene therapy technology improves. In
another gene therapy trial, doctors treated
four patients, and two of these patients died
from lymphoma. In this trial, overexpression
of the gene caused unrestrained proliferation
of T-cells and lymphoma. Overexpression
resulted because when the retrovirus inserted the corrected gene into the genome, it landed in a more active area of the genome. Scientists cannot yet target the location
of this retroviral insertion and instead
use a completely random process. Technology
to target the retrovirus to infect specific
areas of the genome, preferably at the exact
place where the gene is normally located
in the human genome, must be developed
before gene therapy can be applied to other
diseases.
Despite these difficulties, gene therapy will
always be an attractive treatment possibility
for gene-related diseases because it is truly à
la carte medicine. Once technology is developed,
the treatment has the potential to cure
any genetic disease, even ones caused by rare
or de novo mutations. Of course, some diseases
will be easier to treat than others, such
as those in which the body tissue of interest
(in this case, bone marrow) is easier to isolate
and target. Clinical trials for ADA-SCID have
provided the first successful, safe use of this
exciting technology in the process of saving
lives. For those still struggling with other
genetic diseases, treatment is not yet possible,
but relief may soon come.
About the Author
Elisa Visher is a junior in Trumbull College double majoring in Anthropology and
Biology. She works in the Yale Molecular Anthropology Lab and Paul Turner Lab.
Acknowledgements
The author would like to thank Professor Eric Meffre for his time, patience, and
comprehensive explanation of ADA-SCID and its treatment.
Further Reading
• Sauer, AV, H Morbach, I Brigida, A Aiuti, and E Meffre. “Defective B cell tolerance
in adenosine deaminase deficiency is corrected by gene therapy.” Journal of
Clinical Medicine 2012; 122 (6): 2141–2152.
Sizing Up Autism
The Correlation Between Autism and Abnormal Growth
BY JESSICA SCHMERLER
The most exciting day in most parents’
lives is when their child says “mommy”
or “daddy” for the very first time. But
what if one day, their seemingly happy and
healthy toddler suddenly lost the ability to communicate?
Or worse, if their child never began
to speak at all? This is the fate of the parents
of the thousands of children diagnosed with
autism each year. Currently one in a hundred
children born are diagnosed with disorders
in the autism spectrum (ASD), four boys to
every girl. The diagnosis encompasses a broad
range of symptoms, which present themselves
as any form of difficulty in pretend play, social
interaction, or verbal/non-verbal communication.
Evidence available today suggests that
the increasingly broad definition applied to
“autism” may be complicating the opportunities
for effective treatment, ultimately creating more
issues than it resolves.
As defined by the National Institutes of Health (NIH), autism is “a developmental
disorder that appears in the first 3 years of life,
and affects the brain’s normal development
of social and communication skills.” Children
can be diagnosed as having ASD if they fail to meet any of the typical markers of normal development, including babbling by 12 months, gesturing by 12 months, saying single words by 16 months, and saying two-word spontaneous phrases by 24 months, or if they lose any language or social skills at any age.
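The milestone criteria above are essentially a rule set, which can be made concrete in a toy encoding. The field names and thresholds below simply restate the NIH markers listed in the text; real screening is a clinical judgment, not a lookup table.

```python
# Toy encoding of the developmental markers above (ages in months).
# Field names are invented for illustration; this is not a diagnostic
# tool, only a restatement of the listed milestones as rules.

MILESTONES = {
    "babbling": 12,
    "gesturing": 12,
    "single_words": 16,
    "two_word_phrases": 24,
}

def flags(achieved_ages, lost_skills=False):
    """Return the markers missed by their expected ages.
    achieved_ages maps milestone -> age reached (None if not yet)."""
    missed = [
        m for m, deadline in MILESTONES.items()
        if achieved_ages.get(m) is None or achieved_ages[m] > deadline
    ]
    if lost_skills:  # loss of skills at any age is itself a flag
        missed.append("regression")
    return missed

child = {"babbling": 10, "gesturing": 11,
         "single_words": 20, "two_word_phrases": None}
print(flags(child))  # ['single_words', 'two_word_phrases']
```

The brittleness of such a checklist is precisely the article's point: two children flagged for entirely different rules receive the same ASD label, even though the underlying biology may differ.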
The Etiology and Phenotypes of Autism
The classification ASD can be applied to
an enormous array of developmental delays,
ASD is associated with an increased
TCV. Courtesy of the Simons Foundation
Autism Research Initiative.
which in turn generates obstacles to studying the
disorder. The past few decades have seen an
increased incidence of autism, often linked to
the broadening of the scope of the diagnosis.
The NIH notes that a child diagnosed with
high-functioning autism today would most
likely have been considered merely strange
30 years ago. The problem that arises with the
present characterization of autism is that one
child who loses his ability to speak and another
child who makes repetitive hand motions
would both be diagnosed with ASD. The
biological process leading to loss of speech is most likely different from the one leading to repetitive gestures, and yet both would be considered manifestations of autism. In other words,
the major barrier to discovering the etiology of autism is the myriad subsets of the designation “autism” and, in all likelihood, the multiple genetic or environmental origins of the disorder.
Elucidating the distinctions between subsets
of autism and correspondingly developing
a more specific etiology of each would
allow for far more individualized treatment
of children diagnosed with ASD. Many studies
have attempted to investigate the variety
of ASD phenotypes, including the UC Davis
MIND Institute that piloted the Autism
Phenome Project (APP) to study the distinct
phenotypes associated with autism. The
APP is a longitudinal study that characterizes differences in medical evaluations, environmental exposure/epidemiology, behavior/neuropsychology, genomics, brain structure, and immune function.
One identified phenotype associated with
autism is abnormally large Total Cerebral
Volume (TCV) and, correspondingly, Head
Circumference (HC), collectively called
macrocephaly. Researchers at Yale University’s
Child Study Center have undertaken studies
of the connection between growth and neural development to assess risk and predict the developmental phenotype of young boys through growth measurements. A group of 184 boys
aged birth to 24 months, composed of 55
typically developing controls, 64 with ASD,
34 with Pervasive Developmental Disorder
Not Otherwise Specified (PDD-NOS), 13
with global developmental delays, and 18 with
other developmental problems, was analyzed
for head circumference, height, weight, and
social, verbal and cognitive functioning.
Boys with autism were significantly taller by
4.8 months, had a larger HC by 9.5 months,
weighed more by 11.4 months, were in the
top ten percent in size in infancy (correlated
with lower adaptive functioning and social
NEUROLOGY
Neurological data is collected using brain MRIs. Courtesy of popsci.com.
deficits), and showed accelerated HC growth
in the first year of life.
Making Sense of the Research
The results of this study engender several
important questions regarding the correlation
between overgrowth and autism. First,
whether abnormal growth is a symptom
or cause of autism in young boys remains
unknown. Symptoms are usually detected
between the ages of 12 and 24 months, but
neurological changes have been detected as early as four months of age. Since
the growth rate diverges at four months,
whether divergence of neurological function
or overgrowth occurs first is unclear. The
genetic underpinnings of divergent growth
and symptoms of ASD are also unknown,
so establishing a correlation is difficult. If the
etiologies were better understood, researchers
could study the source of both divergent head
growth and onset of autism symptoms and
any associated clinical implications of these
genetic mutations.
Dr. Katarzyna Chawarska of the Yale Child
Study Center explains, “When overgrowth
was first described 10 years ago, the idea was
that it could be used to predict autism. That’s not true; the phenomenon also happens,
albeit much more rarely, in typical children,
and there are some children with autism who
don’t express head overgrowth.” Based on
the results of her study, Chawarska proposes
three hypotheses regarding the correlation of
overgrowth and autism: overgrowth coincides
with, precedes, or follows development of
symptoms. Understanding the correlation between physical abnormalities and autism would serve not only to identify warning signs for the development of the disorder but also to allow for treatments that could prevent the progression of autism or improve the quality of life for those affected.
Two significant characteristics set the Yale
study apart from others that have addressed
the connection of autism to TCV/HC. First,
past studies have investigated only divergent
head growth; the Yale study associates
general overgrowth with autism. Chawarska
says, “This phenomenon of overgrowth is
not restricted to neural tissue. This implication
might shift the emphasis [of research]
from seeking causes of overgrowth in purely
neural mechanisms to other growth factors
(hormones) that affect growth across multiple
types of tissues.” Other tissues affected
by overgrowth include skeletal and muscular
tissues. The second important difference is
that Yale’s study did not find any significant
difference between boys with regressive and
non-regressive forms of autism. Many other
studies have implicated regression as a major
distinguishing factor in the development
of macrocephaly. According to Chawarska,
regression is a highly variable, subjective qualification, since the way it is measured is not standardized. Most other studies use
retrospective parental reports of regression,
rather than observing the “regression” itself.
Chawarska makes the distinction, “what is
called regression we define as a plateau; the
child stops progressing.”
While the genetic basis of this observed
overgrowth still frustrates experts, researchers
at the Yale Child Study Center recognize that
expression of autism is not solely dependent
on genetics. Rather, a significant connection
between neural development and the environment
has been established as well. For children
expressing early symptoms of autism, the
center aims to “enhance their social experience,
hoping that by enriching [their] repertoire of skills [they] can enrich their experience, and
this is an enormous component in neural
development.” Research continues to be
conducted to uncover the molecular etiology
of the autistic phenotypes, but until such
a discovery is made, the development of
individualized behavioral interventions may
potentially lessen the severity of ASD.
About the Author
Jessica Schmerler is a sophomore Molecular, Cellular & Developmental Biology
major in Jonathan Edwards College. She is a Layout Editor for the Yale Scientific
Magazine as well as the Layout Editor and Staff Writer for the Yale Journal of
Medicine and Law.
Acknowledgements
The author would like to thank Dr. Katarzyna Chawarska of the Yale Child Study
Center for her time and in-depth description of her research.
Further Reading
• Chawarska K, Campbell D, Chen L, Shic F, Klin A, Chang J. Early Generalized
Overgrowth in Boys With Autism. Arch Gen Psychiatry. 68:1021-1031(2011).
www.yalescientific.org October 2012 | Yale Scientific Magazine 17
This is History
Microprobe Analyses Decipher Identities of Metallurgical Artifacts
BY THERESA OEI
“Indiana, we are simply passing through history. This, this is history.” Belloq’s line
referring to the Ark of the Covenant
from Raiders of the Lost Ark highlights the role
of artifacts in understanding historical truth.
Technologies such as element mapping and
microanalysis techniques enable archaeologists
to play historical detectives, deciphering
artifacts to contribute to our knowledge of the
past. According to Dr. Robert Gordon, a Yale
professor in Geology and Geophysics, “The
intellectual challenge of examining the artifacts
transmits a historical record and enlightens our
understanding of people’s interaction with
materials and the natural
world.” Artifacts stand as
silent and irrefutable witnesses
to history and the
technology of their eras.
Encountering the Artifact
The work of Gordon and
Colin Thomas, a graduate
student in Yale’s Anthropology
department, exemplifies
the archaeological
approach to discovering
historical truth. Archaeologist
Dr. Richard Hunter,
Founder and President of
Hunter Research Historical
Resource Consultants, was commissioned to
work on an excavation behind the state house
in New Jersey. There, under centuries of debris,
he found an 18th-century metallurgical site,
which he believed was a furnace for converting
iron into steel. During the excavation, the
foundations of a furnace house were revealed,
but the most interesting artifacts were what
appeared to be iron bars. These iron bars were
sent to Thomas and Gordon for identification
and confirmation.
Gordon’s work began in a “typical” way: “I
have something that looks unpromising like bits
of iron or slag, and I ask how old? What does
it tell of the technological ability or scientific
understanding of its time?” The multistep process
of extracting data from the artifact begins
with an examination of its size and shape. Then
the microstructure is examined under a light
microscope and probe. Based on knowledge of
the technology of its era of origin, the artifact
can be identified by comparing it to a reference collection of metal artifacts; the only such collection in the United States is housed at Yale.
Microstructure Analysis
The excavation site in Trenton, New Jersey. Courtesy of Dr. Robert Gordon.
The metal bar artifact was examined by
a microstructure analysis
technique that maps the elements
found in the microstructure
and identifies the
type of iron contained in
the artifact. For example,
sulfur, silicon, and carbon
are expected to appear
in cast iron. Mapping for
oxygen shows the amount
of oxidative corrosion that
has taken place in the iron.
In tandem with scanning
electron microscopy, energy
dispersive X-ray spectrometry
(EDS) creates elemental
maps by focusing a beam
of electrons on the artifact,
which consequently releases an x-ray spectrum.
The spectrum lines can be identified and their
intensity measured to determine the concentration
of the elements in the sample. Here, EDS
analysis determined the composition of the
external oxidation products, and wavelength
dispersive analysis (which measures the concentration
of selected elements) quantified
the metal’s constituents, enabling Gordon and
Thomas to decipher the metal’s microstructure.
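The peak-matching step of EDS can be sketched in a few lines of code. The K-alpha line energies below are standard tabulated values; the "measured" peaks and the matching tolerance are invented for illustration and are not from Gordon and Thomas's analysis.

```python
# Toy illustration of EDS peak identification: match measured X-ray peak
# energies (keV) to the nearest characteristic K-alpha line.
K_ALPHA_KEV = {
    "O": 0.525, "Si": 1.740, "P": 2.013,
    "S": 2.307, "Mn": 5.899, "Fe": 6.404,
}

def identify(peak_kev, tolerance=0.05):
    """Return the element whose K-alpha line is nearest the peak, or None."""
    element, energy = min(K_ALPHA_KEV.items(), key=lambda kv: abs(kv[1] - peak_kev))
    return element if abs(energy - peak_kev) <= tolerance else None

measured_peaks = [6.41, 0.52, 1.75]            # hypothetical spectrum peaks
print([identify(p) for p in measured_peaks])   # ['Fe', 'O', 'Si']
```

In real instruments the peak intensities, not just their positions, are then fit to estimate elemental concentrations.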
After analyzing both the internal metal and
the external oxidation product, the microstructure
of the metal itself proved that it was
not wrought (pure) iron as suspected. Instead,
carbon graphite flakes suggested the artifact
was cast iron. The microstructure had a consistent
iron nitride precipitate along its metal
matrix, a product formed during the cooling
of metal. This indicated that the piece of cast
iron had been subjected to repetitive heating
and cooling, which suggests that the artifact
was part of the furnace itself. Non-metallic
elements were also identified in the sample, the
most abundant being iron oxide. Iron oxide is a
product of an internal oxidation reaction that
occurs over time, replacing the graphite flakes
of cast iron.
The artifact was then compared with a reference
specimen, which came from a pair of
andirons used to hold wood fuel in a fireplace.
X-ray mapping and imaging showed that the
reference and artifact had similar structures.
While both had graphite flakes, those in the
artifact had been replaced mostly with iron
oxide. Iron oxide has a low uniform iron concentration,
and the artifact did not show any
concentration gradients in iron or oxygen at
the border between the metal and the reaction
product. In contrast, iron in the reference specimen
was diluted in the areas surrounding the
graphite flakes by the increased oxygen that was
contributing to the reaction. The comparison
showed that both the artifact and reference
were undergoing the same internal oxidation
reaction. However, this reaction was still
underway in the reference specimen whereas it
had already reached equilibrium in the artifact.
Final Identity
All these analyses identified the artifact as a
grate bar from the firebox in the furnace. The
cast iron of the furnace bars is particularly
susceptible to internal oxidation because of
oxygen solubility in the metal, elements with
oxygen affinity (silicon, manganese, and phosphorus),
and the high diffusion rate of oxygen.
The iron bar artifact received at Gordon’s lab. Courtesy of Dr. Robert Gordon.
The fire of the furnace provides oxygen, and
the high temperature increases the speed of
oxygen diffusion. Internal oxidation degraded
the grate bar, but technology recovered its
original composition and identity.
History of the Cementation Furnace
The knowledge extracted from the artifact
can be placed in a historical context, enriching
our understanding of colonial America.
The excavation site was confirmed as a steel
cementation furnace built in the 1740s used to
convert iron into steel, a complex process that
was being developed in Germany and Great
Britain at the time.
The excavation site belonged to the Trenton
Steel Works, which was built on Petty’s Run, a
tributary of the Delaware River. Between 1745
and 1750, Benjamin Yard built the cementation
furnace — one of four of its kind built
in the U.S. — marking a radical move toward
colonial economic independence. Great Britain
responded by passing the Iron Act in 1750,
which attempted to limit the development of iron manufacturing in the colonies. The colonists wanted cheap domestic steel for agricultural and military production; Great Britain’s efforts to stifle colonial economic progress, including iron manufacturing, were a large factor leading to the Declaration of Independence.
Through their use of scientific techniques,
Gordon and Thomas were able to enhance
our understanding of 18th-century America.
Artifacts are the doors to historical truth, and
scientists like Gordon and Thomas hold the
key to unlocking this potential.
About the Author
Theresa Oei is a sophomore Molecular Biophysics and Biochemistry major in
Pierson college. She is on the board of Synapse, Yale Scientific Magazine’s Outreach
Program, and works in Professor Steitz’s lab studying target genes of the viral miR-
NAs HSUR4 and HSUR5 for their role in tumorigenesis.
Acknowledgements
The author would like to thank Doctor Gordon for his time and inspiring dedication
to archaeological discovery and research.
Further Reading
• Gordon, Robert. “Process Deduced from Ironmaking Wastes and Artifacts.” Journal
of Archaeological Science. Vol 24 (1997): 9-18.
Competing Visions:
Will Science Swing Your Vote?
By John Urwin
One week. In one week we shall wake
up, go to school or work, and carve
out fifteen minutes to participate in
what has perhaps become the ultimate symbol
of modernity: free elections. For months now
we have been inundated with commercials,
speeches, and debates all telling us what to
do on this one day: the day we choose our
next president.
This year’s campaigns have emphasized vital
areas of interest like economic recovery, the
national debt, and healthcare. However, many
other important issues have fallen through the
cracks. Among these is science policy.
The reason why is clear. We value science,
but it just does not seem as pressing as other
issues. In a recent poll conducted by the Yale
Scientific Magazine, a majority of Yale undergraduates deemed science policy important (mean rating of 3.43 on a ten-point scale) in determining their vote for president. Yet, when asked to rank science policy against other topics, they consistently ranked it as less important (mean rank of 6.23) than almost every other election topic. Moreover, a majority
of Yale undergraduates replied that they
believed themselves to be either “not very well
informed” or only “somewhat well informed”
about the current science issues facing
America. This needs to change. Presented
here are the underlying science, the current
facts, and each candidate’s stance on the two
science issues that surveyed Yale students regarded as most important: alternative energy and climate change.
Alternative Energy
At the heart of every power plant is a spinning
wheel, known as a turbine. In slightly
more technical terms, every power plant produces
electric current the same way: by rotating
a loop of wire through a magnetic field.
This technique is ubiquitous for all power
plants, including coal, natural gas, petroleum,
nuclear, hydro, wind, geothermal, and solar
plants. It is only in how they spin this turbine
that they differ.
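The wire-loop-in-a-magnetic-field mechanism is Faraday's law of induction: a coil rotating at angular speed ω in a uniform field produces an EMF proportional to sin(ωt). The field strength, turn count, and loop area below are illustrative assumptions, not values from any particular plant.

```python
import math

# EMF of a wire loop rotating in a uniform magnetic field (Faraday's law):
#   flux(t) = N * B * A * cos(omega * t)
#   emf(t)  = -d(flux)/dt = N * B * A * omega * sin(omega * t)
N = 100          # turns of wire (illustrative)
B = 1.5          # field strength, tesla (illustrative)
A = 0.02         # loop area, m^2 (illustrative)
FREQ = 60.0      # rotation frequency, Hz (US grid frequency)
omega = 2 * math.pi * FREQ

def emf(t):
    """Instantaneous EMF (volts) of the rotating loop at time t (seconds)."""
    return N * B * A * omega * math.sin(omega * t)

peak = N * B * A * omega          # maximum EMF, reached when sin(...) = 1
print(f"peak EMF: {peak:.0f} V")  # ~1131 V for these assumed values
```

Whatever spins the shaft — wind, water, or steam — only changes how ω is produced; the induction step is the same everywhere.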
Windmills let gusts of wind spin their massive
blades, while hydropower plants (dams)
use the force of flowing water to rotate
turbines. All other types of common power
plants — coal, natural gas, petroleum, nuclear,
solar, geothermal — use the same technique
to rotate their turbines: they boil water.
Each power plant boils water either by
generating heat via an exothermal process
(combustion for coal, petroleum, and natural
gas; nuclear fission for nuclear) or collecting
it from the environment (solar, geothermal).
The heat from these processes is used to
convert water into highly pressurized steam.
This superheated steam, which can reach temperatures of 1,000 °F and pressures of 3,500 pounds per square inch, is then fed through pipes to a turbine, spinning it; the rotating turbine drives the generator that produces electric current.
Despite their shared mechanism, not all
power plants affect the environment equally.
Fossil-fuel-burning plants (those that combust
coal, petroleum, and natural gas) are environmentally
harmful, producing vast amounts
of carbon dioxide and significant quantities
of other pollutants. Moreover, fossil fuels
take millions of years to form and exist in limited quantities. Nuclear plants, on the other hand, produce only small quantities of waste (millions of times less by weight than coal combustion), but the radioactive waste they produce is highly toxic and persists almost indefinitely in waste dumps (it can take millions of years for even half of the waste to decay).

Schematic of a nuclear power plant’s steam generator. Courtesy of Free Info Society.
In starker contrast, hydro, wind, solar, and
geothermal power plants have renewable
sources and do not emit significant amounts
of pollutants. These environmental considerations
are important not only to maintain
clean air, ensure sanitary water, and preserve
nature but also to address and ideally resist the
dawning realities of global warming.
A Look At The Numbers
Last year, the U.S. consumed 97.30 quadrillion Btu of energy (about 1.0 × 10^17 kJ). Assuming you consume 2,000 Calories per day, that’s enough energy to ‘power’ you for the next 34,000,000,000 years.
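A back-of-envelope check of that comparison, assuming the consumption figure is the EIA's 97.30 quadrillion Btu for 2011 and that "calories" means dietary kilocalories:

```python
# Back-of-envelope check of the energy comparison above.
# Assumed conversions: 1 Btu = 1.055 kJ; 1 dietary Calorie (kcal) = 4.184 kJ.
US_CONSUMPTION_BTU = 97.30e15        # quadrillion Btu (EIA, 2011)
us_consumption_kj = US_CONSUMPTION_BTU * 1.055

daily_diet_kj = 2000 * 4.184         # 2,000 Calories per day
yearly_diet_kj = daily_diet_kj * 365.25

years = us_consumption_kj / yearly_diet_kj
print(f"{years:.2e} years")          # ~3.4e10, i.e. roughly 34 billion years
```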
The vast majority (82 percent) of this
energy comes from burning fossil fuels. Petroleum
(36 percent), natural gas (26 percent), and
coal (20 percent) make up nearly all of this.
In the U.S., only 16.6% of the total energy
used comes from “clean” energy sources. Of
this, 63.6% comes from nuclear power plants,
24.4% is from hydropower plants, 9.0% is
from windmills, and the rest (3.0%) is a combination
of solar and geothermal.
The U.S. lags internationally in the use of
alternative energy. In 2010, for instance, over
45% of German electricity came from either
nuclear or renewable sources.
Two Visions
Sources and consumption of energy in the United States. Courtesy
of the US EIA Annual Energy Review 2011.
While both Mitt Romney and Barack
Obama center their energy policies on achieving
energy independence
by increasing
domestic production,
they differ in how
they want to achieve
this goal.
Broadly speaking,
the Republican
platform aims to
decrease oil imports
by letting the private
sector develop the
U.S.’s own natural
reserves. There are
undeniably ample
quantities of fossil
fuel within the U.S.
(237 billion tons of coal, 219 billion barrels of oil, and nearly 2,000 trillion cubic feet of natural gas), although these supplies are not unlimited. Overly aggressive exploitation of these reserves could cause environmental damage, and continued dependence on coal and natural gas power plants would exacerbate global warming.

US primary energy production by source, 1949-2011. Courtesy of the US EIA Annual Energy Review 2011.
By comparison, the Democratic platform
is to actively encourage the development of
alternative energies while moving the emphasis
away from fossil fuels. This approach is
costly; nuclear power plants are five to ten times more expensive to build than coal plants, though their long-term operating costs are comparable.
For each party, more detailed plans, available
online (see Further Reading), are both more
nuanced and more convergent than is commonly
acknowledged. The most important
questions we face do not ask “which one,”
but “to what degree.”
Climate Change
By this point, we’ve all heard of climate
change. There is a lot of carbon dioxide in
the atmosphere (390 ppm). There was far less before industrialization (below 300 ppm).
We keep producing it (4.76 metric tons per
person globally). Fossil fuels are to blame (29
additional gigatons of CO2 per year). The
earth gets hotter (0.014°F per year). Oceans
rise, polar bears drown, and Norwegians go
to the beach. These facts are understood, but
the underlying science is far more interesting
and rarely discussed.
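One piece of that underlying science — the claim that an airless Earth would be about 60 °F colder — can be checked with a zero-dimensional energy balance. The solar constant and albedo below are standard assumed values.

```python
# Zero-dimensional energy balance: with no atmosphere, Earth's surface
# settles at the effective radiating temperature T_e, where absorbed
# sunlight balances blackbody emission:
#   S * (1 - albedo) / 4 = sigma * T_e**4
SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
S = 1361.0         # solar constant, W m^-2 (assumed value)
ALBEDO = 0.3       # Earth's average reflectivity (assumed value)

t_eff = (S * (1 - ALBEDO) / (4 * SIGMA)) ** 0.25   # ~255 K
t_actual = 288.0                                   # observed mean surface temp, K
delta_f = (t_actual - t_eff) * 9 / 5               # kelvin gap -> Fahrenheit degrees

print(f"effective temperature: {t_eff:.0f} K")
print(f"greenhouse warming: {delta_f:.0f} deg F")  # ~60 deg F
```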
The greenhouse effect underlies most of the science behind global warming. Gases in the atmosphere absorb the sun’s rays to varying degrees, so the amount of heat retained (and therefore the temperature) varies with gas concentrations.

Imagine two planets: one without an atmosphere (like Mars) and one with an atmosphere (like Earth). On the Mars-like planet, the sun’s rays travel directly to the rocky surface, where they are absorbed and then reflected directly back into space. On the Earth-like planet, the rays cannot travel directly to the surface because they must first pass through the atmosphere. Atmospheric gases absorb radiation and re-emit it in all directions, so some of the radiation is ‘trapped’ within the atmosphere. This effect simultaneously slows the planet’s rate of cooling and increases the amount of heat absorbed; both raise the average temperature. The effect is not subtle: if Earth lacked an atmosphere, it would be around 60°F colder than it is today.

To fully understand global warming, we must also acknowledge that not all gases are equal; some absorb the sun’s rays readily, while others do not. Those that easily absorb solar energy are known as greenhouse gases. The more greenhouse gases there are in the atmosphere, the more heat the earth retains and the warmer the planet will be. This, in short, is global warming.

Each year, the U.S. produces about 5.5 billion tons of carbon dioxide. Nearly all of this comes from burning fossil fuels: 32 percent comes from coal plants alone, 30 percent from transportation, 17 percent from heavy industry, and 7 percent from non-coal combustion power plants. The remaining 14 percent comes from various sources.

Carbon emissions by country. Courtesy of Stanford Kay.

Compared to other modernized countries, the U.S. does poorly. We are the second largest producer of carbon dioxide, trailing only China. And while U.S. emissions are decreasing, we remain behind the curve.

Two Visions

Let’s clarify one point right away: both candidates acknowledge that global warming is real and influenced by man. They differ, however, in the degree to which they believe man has caused global warming and in how they wish to address it.

The same principles that applied to alternative energy also apply to global warming. Republicans tend to prefer a more economical, less environmental approach, while Democrats prefer a more environmental, less economical one. While Barack Obama favors an active approach to reducing carbon emissions, Mitt Romney views such a stance as a hindrance to the economy and instead prefers loosening environmental regulations.

One Week, One Click

On Tuesday, November 6, 2012, we will elect our next president. With alternative energy policy and global warming just small pieces of the increasingly complicated and scientifically involved challenges facing us, Thomas Jefferson’s words have never been more true: “Democracy demands an educated and informed electorate.” Today more than ever, it is vital to be aware: to know the issues, the facts, and the science behind the choices facing our nation, and to voice your opinion accordingly. The information is out there. In fact, it is more accessible now than ever before, waiting just a click away.
About the Author
John Urwin is a junior Molecular Biophysics and Biochemistry major in Jonathan
Edwards College. He is a Layout Editor for Yale Scientific Magazine and has
worked in Professor Colón-Ramos’ lab studying nervous system development in C.
elegans.
Further Reading
• “Energy, Climate Change, and Our Environment,” The White House--President
Barack Obama, March 2012. http://www.whitehouse.gov/energy
On July 20, 2012, the U.S. Food and Drug Administration granted accelerated approval of the drug carfilzomib, developed by Professor Craig Crews of Yale’s Department of Molecular, Cellular, and Developmental Biology. Carfilzomib,
which goes by the name of Kyprolis,
is approved for the treatment of patients with
multiple myeloma who have received at least
two prior therapies and have demonstrated
disease progression on or within 60 days of
the completion of their last therapy. The
approval of carfilzomib represents 14 years
of dedicated research and clinical trials, as
well as another triumph of targeted therapies
in the treatment of cancer.
Multiple Myeloma
Multiple myeloma is a cancer of plasma cells, a type of white blood cell that produces antibodies. Their precursor cells originate within the bone marrow and mature into B lymphocytes (also known as B cells); upon activation, B cells make their terminal differentiation into plasma cells. Plasma cells are an integral part of the immune system: they contribute to the immune response by producing antibodies that target
foreign pathogens for destruction. Multiple
myeloma accounts for 1% of all cancers, and
10% of all cancers of the bone marrow. The
American Cancer Society estimates that over
20,000 new cases of multiple myeloma will
be diagnosed in the United States each year.
The exact cause of multiple myeloma
is unknown, but it is believed to be the
combined result of genetic defects and
environmental toxins. Symptoms of multiple
myeloma include abnormal bleeding, bone or
back pain, anemia, and numbness or weakness
in the limbs. Multiple myeloma usually
affects people between 65 and 70 years of age
and the prognosis is generally very poor: multiple
myeloma is an incurable malignancy. The
median survival time post-diagnosis is 3-4
years, although this number depends on the
stage of cancer at diagnosis. According to the International Staging System for multiple myeloma, which is based on serum β-2 microglobulin and albumin levels, patients at stage I have a median
survival of 62 months (~5 years), patients at
stage II have a median survival of 44 months
(~3.5 years), and patients at stage III have a
median survival of 29 months (~2.5 years).
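The staging figures above can be restated as a small lookup table; the months are the article's values, and the years column is simple division.

```python
# Median survival by International Staging System stage, as quoted above
# (months), converted to approximate years.
MEDIAN_SURVIVAL_MONTHS = {"I": 62, "II": 44, "III": 29}

survival_years = {stage: round(months / 12, 1)
                  for stage, months in MEDIAN_SURVIVAL_MONTHS.items()}
print(survival_years)   # {'I': 5.2, 'II': 3.7, 'III': 2.4}
```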
Proteasome inhibitors as Multiple Myeloma drugs
Cancer is a disease causing uncontrolled
growth and replication of cells. Thus, cancer
treatments seek to slow the growth of or kill
these mutant cells. The problem with many
popular cancer therapy methods like chemotherapy
and radiation treatments is that they
are unable to distinguish between cancerous
and non-cancerous cells and ultimately kill
enough normal cells to make a patient very
sick. These methods essentially try to cure the patient with a poison.

Craig Crews, the Lewis B. Cullman Professor of Molecular, Cellular and Developmental Biology and executive director of the Yale Small Molecule Discovery Center. Courtesy of Craig Crews.

The idea of treating patients with drugs that have fewer side effects is what makes newer “targeted therapies” so appealing to researchers and
physicians. The idea of targeted therapies
is that the drug can be prescribed in a way
more personalized to the patient’s specific
disease, and should be able to selectively kill
only cancer cells.
The nature of the plasma cells involved
in multiple myeloma offers a unique way to
target and kill these cells. One of the main
functions of plasma cells is to produce large
quantities of antibodies for the immune
response. To produce the antibodies, the cell
must synthesize the proteins the antibodies
are composed of, ensure that they are folded
properly, and then export them from the cell.
The bulk of the synthesis and folding of antibodies
occurs in the endoplasmic reticulum
(ER), one of the cell’s main organelles. When
proteins don’t fold properly and build up, the
stress level of the ER increases. If the ER
stress becomes too great, the ER will signal
the cell to undergo apoptosis (programmed
cell death). The proteasome can lower this
stress. In simple terms, the proteasome is akin
to a cellular garbage truck. It can break down
proteins, and thus is the cell’s main tool in getting
rid of unwanted or misfolded proteins.
However, if the action of the proteasome is
blocked or inhibited, it can no longer aid in
lowering ER stress, and these cells are more
likely to undergo apoptosis. Because cancerous
plasma cells have a very high protein
payload, they are much more susceptible to
proteasome inhibitors. Thus, at the proper
concentration, a proteasome inhibitor could
selectively kill cancerous cells (by only targeting
the sensitive, cancerous plasma cells).
A history of proteasome inhibitors
Epoxomicin, the linear peptide from
which carfilzomib is derived, was one of
the first identified proteasome inhibitors. It
was isolated as a natural product produced
by a microbe collected by a Japanese research group, Hanada et al., at the Bristol-Myers Squibb Research Institute in Tokyo in 1992.
It was originally identified as an antitumor
agent, but it was not until
seven years later that the
Crews lab determined that the
method of antitumor action
of epoxomicin was through the
inhibition of the proteasome.
The Crews lab also found
that within epoxomicin, it is
the α,β-epoxyketone pharmacophore
that is responsible
for the selective inhibition
of the proteasome. This
selective inhibition occurs
when epoxomicin forms an unusual six-membered morpholino ring with the amino-terminal catalytic Thr-1 of the 20S proteasome. However, for such a compound to become a pharmaceutical, it would have to be even more specific in its binding selectivity and would have to have lower toxicity.

The system of targeting a protein to the proteasome for degradation. Ubiquitin is activated by the E1 and E2 enzymes and added to the protein targeted for destruction by the E3 ligase. The proteasome recognizes a chain of ubiquitin molecules attached to a protein, then proceeds to degrade that protein. Courtesy of Thomas Pollard.

Multiple myeloma cells (in purple), in a bone marrow smear from a patient with multiple myeloma, proliferate more quickly than normal cells. Courtesy of Craig Crews.
One of the drugs most commonly used for the treatment of multiple myeloma is a proteasome inhibitor called bortezomib (marketed as Velcade). However, the
drug has several undesirable side effects. One
serious problem associated with bortezomib
was the side effect of peripheral neuropathy,
a condition in which damage to the peripheral
nerves causes pain, numbness, and muscle
problems. Another problem with bortezomib
is that patients typically acquire resistance.
These flaws led to the need for a second-generation proteasome inhibitor.
Drug development
For Crews, there was a great incentive to
using natural products. Said Crews, “Evolution
has selected for exquisitely potent and
selective inhibitors. These are the ideal probes
already, so the mystery was figuring out how
they work. I’m interested in the mode of
action.” Epoxomicin in particular interested
Crews. The group at Bristol-Myers Squibb
dropped the epoxomicin project because
they did not know the mechanism of its
antitumor activity. Crews, who was very
much interested in mechanism-of-action
studies, decided to pick up the study of this
promising compound. As
Crews explained, “Epoxomicin
had already displayed
potent antitumor
activity. The fact that they
were interested in it, but
just didn’t know how it
worked. Well, the challenge
appealed to me.” He
devised a total synthesis
of the natural product,
then, through biotin tagging
experiments, the lab
was able to determine that
epoxomicin acts by binding
the 20S subunit of the
proteasome, effectively
blocking it.
The lab then proceeded
to alter the compound to
try to get more potent and specific binding and inhibition. According to
Crews, “There are three
catalytic activities in the proteasome that
have different specificities. The natural
product [epoxomicin] could target two of the
three. The goal of our project was to develop
an analog of this natural product that could
be specific for just one.”
One of these derivatives, named YU101,
had potent antitumor activity, and was able
to inhibit the proteasome better and more
specifically than epoxomicin. Indeed, it was
able to inhibit the proteasome better than
bortezomib, which at the time had just been
approved for the treatment of multiple
myeloma. Furthermore, Crews speculated
that the peripheral neuropathy side effect of
bortezomib was due to the boronate group
in the compound (a boron bonded to two
hydroxyl groups and a hydrocarbon), as
boronate is an unusual pharmacophore. YU101
did not have a boronate moiety, and thus was
not likely to cause peripheral neuropathy.
It appeared that Crews might have a viable
drug.
Various depictions of the proteasome. (A) An electron micrograph of the proteasome.
(B-C)Computer generated space filling models of the proteasome..
Courtesy of Thomas Pollard.
change was adding a morpholine ring to the
end of compound opposite the epoxy ketone
to boost its solubility. This compound is now
known as carfilzomib.
With the support of investors, Proteolix
was able to file a New Drug Application to
start clinical trials. Carfilzomib defied the
odds, successfully making its way through
Phase I and II clinical trials. As predicted,
carfilzomib did not produce the side effect
of peripheral neuropathy. This lower toxicity
is crucial. Crews explains that “the problem
with Velcade is that the dose limiting toxicity
prevents physicians from
being able to achieve more
than about 60% of proteasome
inhibition. With
Carfilzomib, because of
the better side effect profile,
physicians can dose
to about 75% proteasome
inhibition.”
After achieving this success
in Phase I and II trials,
in 2009, Onyx Pharmaceuticals
acquired Proteolix
and advanced the compound
through a Phase IIb
trial that led to the drug’s
accelerated approval in
July. Carfilzomib is also
going through a Phase III
trial to explore its efficacy
in solid tumors.
The FDA approved
Carfilzomib after 14 years
of dedicated research and
clinical trials. It is one of the triumphs of
targeted therapies in the treatment of cancer.
More than this, though, carfilzomib is a
triumph for academic labs. Pharmaceutical
companies develop the vast majority of drugs,
yet it was an academic lab that developed this
drug. When the Crews lab picked up epoxomicin
as a project, the goal was to understand its
mechanism of action, rather than to develop
a drug. However, using solid scientific methods
and elegant experiments, they were able
to capitalize on this academic endeavor and
advance the field of cancer therapy.
About the Author
Kaitlin McLean is a senior Molecular, Cellular and Developmental Biology
major from Madison, Wisconsin. She has been working in the Crews Lab since her
freshman year and will be conducting her intensive B.S. research project under the
supervision of Professor Craig Crews.
Clinical trials and FDA approval
To start the long journey towards FDA
approval, Crews co-founded the South San Francisco-based
company Proteolix along with Caltech
professor Raymond J. Deshaies. The company,
despite trying many different variations of
epoxomicin, finally settled on YU101 as the
best option, with one slight change.
www.yalescientific.org
Acknowledgements
I would like to profoundly thank Professor Crews for his time, effort, and passion.
Further Reading
• Pingali, S. R., Haddad, R. Y., & Saad, A. (2012). Current concepts of clinical management of multiple myeloma.
October 2012 | Yale Scientific Magazine 25
FEATURE
BIOETHICS
Pluripotent Politics:
The Uphill Struggle for Federally Funded Stem Cell Research
By Daniel Arias
The field of stem cell research is rife with both opportunity
and controversy. Scientists, politicians, and everyday civilians have
significant stakes in both. Although stem cell research
holds the promise of great advances in therapeutic medicine, it
presents challenging questions about the morality of embryonic
research. This contention surfaces, however, not
in discussions of U.S. stem cell research itself but in discussions
of funding: should the government support grants for stem cell
research?
“We are in a strange situation in the United States [where] our
only public issue has been about funding,” says Professor Stephen
Latham, Director of the Yale Interdisciplinary Center for Bioethics.
“In contrast with other places, stem cell research has always been
legal in the U.S. The only question has been ‘Has tax money been
used to pay for it?’”
A Field is Born
Human embryonic stem cell research can largely trace its roots
to 1998. In November of that year, a professor at the University
of Wisconsin by the name of James Thomson published an article
in the journal Science that announced the creation of the first line
of human embryonic
stem cells (hESCs).
Back in 1995, Thomson
had already perfected
the techniques
to grow stem cells from
a primate in the laboratory,
a process that had
taken four years of trial
and error. Using these
same techniques, he
then tried to create the
first hESC line, using
leftover embryos from
a local in vitro fertilization
clinic. His first try was a success, leading to the creation of the
first human stem cell line ever.
Thomson’s stem cell breakthrough held much promise for medicine
because of the very nature of stem cells themselves. A stem
cell is an undifferentiated cell, capable of becoming, with limited
exception, any fetal or adult cell type in the body (a trait known
as “pluripotency”). A stem cell can become part of the skin, the
nervous system, the lungs, or any other part of the body. Naturally,
this makes stem cell researchers eager to translate the properties of
stem cells to tissue and organ transplantation, where one’s own stem
cells could be used to regrow one’s own damaged tissue.
What makes Thomson’s research all the more interesting is that it
was accomplished with private funds, free from government expense.
But Thomson had no choice but to use private dollars. In 1974,
Congress had banned government funding for embryonic research. In
the aftermath of the 1973 Supreme Court decision in Roe v. Wade,
some groups felt deeply concerned about embryonic research, insisting
that Roe v. Wade would lead to unregulated use of fetal tissue
leftover from abortions. Congress acted swiftly and banned federal
funding for embryonic research altogether, allowing the ban to be
lifted only with the establishment of guidelines to regulate research.
Restriction and Resistance
Shortly after Congress enacted the ban, the scientific community
pressured the Department of Health and Human Services (HHS)
and its primary research organization, the National Institutes of
Health (NIH), to lift the Congressional ban. The ban's wording
gave the HHS Secretary discretion to lift it; the NIH pressed
HHS to do so in 1979 and again in 1986,
but both efforts
failed. In fact, HHS decided
in 1987 to end the debate and
issued an outright ban of all
fetal tissue research, regardless
of the source of funding.
While researchers in the
United States received no
federal financial support for
fetal cell research, researchers
in other parts of the world
were advancing the field in
their stead. As advances in
fetal cell research progressed
internationally (as in the case of Anders Björklund's Parkinson's disease cell
therapy studies in Sweden), the NIH strengthened its advocacy for
embryonic research. Nevertheless, the 1988 NIH recommendation
to fund embryonic and fetal research was shot down by then
HHS Secretary, Louis Sullivan, and the moratorium persisted. The
United States would continue to lag behind the world in the fields
of embryonic and fetal research.
At this point, prompted by
patient advocacy groups, Congress attempted
to overturn the ban on embryonic research.
In 1990, a bill overturning the ban on human
embryonic research passed the House and
Senate but was vetoed by President George H.
W. Bush. Following the 1992 election, President
Clinton issued an executive order commanding
the HHS Secretary to lift the ban; after his order
drew heavy criticism from the pro-life movement,
Clinton reversed his position and allowed
the ban to stay in place. It would not be until the
end of his second term in office that the issue
of the ban would be revived.
The New Millennium and the Stem Cell
Détente
The first significant event towards loosened
funding restrictions occurred in 2000, when the
Clinton administration announced that the NIH
would accept applications for federally funded
embryonic stem cell research provided that the
embryos used for research were obtained from fertility clinics that
would otherwise discard them. Though the order prohibited funding
that would facilitate the destruction of an embryo, researchers could
apply for grants to work on new cell lines if the establishment of
the cell line was conducted using private funds.
Though this represented a significant milestone in the uphill
struggle for federally funded hESC research, the implementation
of this order was cut short by the 2000 presidential election. The
election of President George W. Bush created a de facto freeze on the
issue, for Bush had announced during his campaign that, if elected,
he would reverse Clinton’s policy on hESC funding. As the debates
over hESC research were reignited, Bush directed HHS to conduct a
second study on the issue while the President formulated a decision.
On August 9, 2001, President Bush announced his decision: the
Administration would be the first in U.S. history to provide federal
funds for hESC research. Rather than reverse Clinton’s policy, Bush
tailored it by attaching a caveat: federal funds could only be obtained
for research using pre-existing cell lines (pre-existing being defined
as existing before his address) which were obtained as discarded
material from fertility clinics by the informed consent of the donors.
A total of 21 cell lines would now be available for federal funding
in the United States.
With the election of President Obama (who, during his inaugural
address, pledged to “restore science to its rightful place”) came a
renewed push for expanded hESC funding. On March 9, 2009,
President Obama removed the federal funding restrictions for novel
stem cell lines, provided that the development of these stem cell
lines did not involve public funds.
While this history suggests a long and arduous path to victory for
embryonic stem cell research, it is important to consider that the
U.S. spends a relatively minor amount of funding towards human
embryonic stem cell studies. In 2011, the NIH directed $123 million
to hESC research; non-embryonic non-human stem cell research,
in the same year, received over five times that amount. As a portion
of the total 2011 NIH budget, hESC research received 0.39% of
Human embryonic stem cells. Image courtesy of Diane Krause from ESCRO
(Embryonic Stem Cell Research Oversight at Yale).
NIH funding.
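A quick back-of-the-envelope check of these figures is possible: the $123 million and the 0.39% share are the article's numbers, while the implied total NIH budget is our own division, not an official figure.

```python
# Back-of-the-envelope arithmetic on the 2011 funding figures cited above.
hesc_funding = 123e6       # NIH spending on hESC research (article's figure)
share_of_budget = 0.0039   # hESC share of the total NIH budget (0.39%)

# Implied total NIH budget: $123 million / 0.0039 ≈ $31.5 billion.
implied_nih_budget = hesc_funding / share_of_budget
print(f"Implied 2011 NIH budget: ${implied_nih_budget / 1e9:.1f} billion")

# "Over five times that amount" puts non-embryonic, non-human stem cell
# research at no less than roughly $615 million that year.
non_embryonic_floor = 5 * hesc_funding
print(f"Non-embryonic funding floor: ${non_embryonic_floor / 1e6:.0f} million")
```

The implied total of roughly $31.5 billion is consistent with the scale of the NIH budget in that era, which lends the cited percentage some internal coherence.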
To address this gap, states have established their own funds for
stem cell research. Latham explains that, during the Bush years, there
were researchers at Yale and the University of Connecticut who
took advantage of the Connecticut funding program, most receiving
small seed grants that got their research started. Yale received
about half of the Connecticut funding and continues to do so; the
state of Connecticut has spent $59 million in the last five years on
grants-in-aid for stem cell researchers.
A Course for the Future
As states attempt to supplement federal funding, the research
focus in the field of stem cells has been shifting. “We see people
shifting their interests towards induced pluripotent cells,” says
Latham. “Induced pluripotent cells are specialized cells, such as skin
cells and red blood cells, which undergo forced de-specialization.
You can generate tissue using induced pluripotent cells that come
from the person who will be receiving the tissue, so genetic matching
isn’t an issue,” explains Latham. “Furthermore, there aren’t the
ethical problems of human embryonic stem cells, since you don’t
need an embryo to make these cells.”
The shift to induced pluripotent cell research, however, does
not mean a death knell for embryonic stem cell research. “We can’t
shift completely,” says Latham. “The gold standard for pluripotency
[remains] the embryonic cell... We can’t completely stop, but there
has been a shift.”
As science and politics develop around the shifting focus of
stem cell research, undoubtedly the issue of funding will continue
to constitute a significant factor in the direction of research. Each
presidential administration in the last quarter century has
refocused the course of stem cell research through the power of
the purse. As November looms nearer, researchers and
patients around the world will be holding their breaths to see how
the upcoming election will attempt to answer the perpetual question:
Who pays?
FEATURE
MICROBIOLOGY
Exploring the
Microbiome
BY TESHIKA JAYEWICKREME
On every part of the human body, from our nose to our gut, live
ten to a hundred trillion microbes. They outnumber our own cells
10 to 1 and our genes 100 to 1. Indeed, these tiny living organisms,
many of which are bacteria, line every surface of the human body,
both inside and out. Yet most of these bacteria are far from foreign invaders:
many were passed down from our mothers during birth and
have been with us our entire lives, while we picked up others along the
way. With each diaper change, these bacteria grew and multiplied,
frantically racing to cover us with a protective blanket of good
bacteria before any of their harmful cousins took residence. From
our first cut to our last breath, microbes live alongside our own
cells in harmony. Together, the genes these microorganisms encode make up
what scientists call the human microbiome.
What is the Microbiome?
The term microbiome may seem contradictory at first. When
ecologists talk about biomes, they are usually referring to expansive
regions on Earth like the tropical rainforests of South America or
the great savannahs of Africa. In size, however, the microbiome
is different; its organisms are microscopic. If all the microbes inside
a single human were weighed, they might amount to less than a single
rabbit. Yet what the microbiome lacks in mass, it more than makes
up for in diversity.
“What is great about the gut is that the density of bacteria is
incredibly high…higher than any other ecological location that we
know of,” explains Dr. Andrew Goodman, an associate professor
in microbial pathogenesis at Yale University. A single human gut
houses more species of microbes than species of animals in an
entire forest.
Biomes are more than just the species within them. When ecologists
venture through the frigid Arctic tundra, they are not solely
in search of samples to classify. What scientists care about most
in biomes is how the organisms inside interact and adapt. They
want to know why one species of fox survives better than another.
It is this competition among species that interests scientists most.
Similarly, with the microbiome, scientists are comparing variations
on the micro scale. How do these variations lead to differences
between people? “This variation matters,” says Goodman, “and it
has consequences on health.”
Numerous modern diseases may have roots in the
microbiome. For example, scientists have found that the guts of obese
individuals contain more Firmicutes, a common type of bacteria,
than those of their healthier counterparts. Using zebrafish specially raised
with and without Firmicutes in their guts, researchers found that
these bacteria help increase fat absorption in the gut. What may have
initially been an adaptation to help our ancestors survive famine
may now contribute to the obesity epidemic.
The Microbiome as a Genome
Even amongst the most passionate scientists, few would ever opt
for a safari through the gut over the savannah. Diversity among
bacteria does not entail brightly colored feathers or exotic mating
rituals. All of this diversity exists instead as differences in bacterial
genomes in the A’s, T’s, G’s, and C’s found within their DNA.
Broadly put, the genome contains all the instructions necessary
for a cell to survive and grow. Imagine a genome as a recipe book,
with each recipe producing a new protein. While there are slight
differences in flavor between different human genomes, often dictating
differences in physical features, each bacterial genome has
entire recipes not found in the human genome. The microbiome has
trillions of bacteria and adds volumes of new recipes to the human
genome — 8 million new ones to our existing 22,000 — forming
a library of recipe books scientists call the metagenome. What scientists
see is that the human body does not rely solely on human
genes. These extra bacterial genes help make new proteins to break
down toxins and perform other functions that would otherwise not
be possible within the human body.
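The scale of this genetic contribution follows directly from the counts above; the 8 million and 22,000 figures are the article's, and the ratio is simple division on our part.

```python
# Ratio of microbial to human genes implied by the figures quoted above.
microbial_genes = 8_000_000   # genes the microbiome adds (article's figure)
human_genes = 22_000          # genes in the human genome (article's figure)

ratio = microbial_genes / human_genes
print(f"The microbiome contributes roughly {ratio:.0f}x as many genes "
      f"as the human genome itself")
```

By this arithmetic, the bacterial "recipe books" outnumber our own by a factor of several hundred.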
Understanding the Metagenome Through the Human Microbiome
Project
Goodman describes the metagenome as a “description of the
capacity of a [bacterial] community to carry out a function.” The
metagenome is a collection of all the possible recipes, or genes, that
bacteria add to humans. While many of these genes only perform
functions for the bacteria, some provide vital functions to their
human hosts. Scientists have known about this enormous cache of
genes for quite some time, yet it was not until recently that they were
actually able to determine the sequence of a human metagenome.
In 2008, the National Institutes of Health launched the Human
Microbiome Project as an attempt to sequence the microbiomes
Over 90 percent of the cells on the human body are actually
microbes. Courtesy of Dr. Andrew Goodman.
of 242 healthy adult volunteers. Ultimately, scientists wanted to
understand the variations among different humans. Do humans
host the same types of microbes or are these microbes unique to
each individual?
“It hasn’t been the case that we mostly have the same species, at
least in our guts. It’s almost the opposite. We mostly have our own
in our guts,” says Goodman. Indeed, the project leaders found that
the human microbiome was more dynamic than constant. Each individual
has his or her own set of microbes, but the surprising thing
Agar plates and tubes are used to study bacteria outside of the
gut. Courtesy of Jane Long, Yale Student.
was that despite this variation, parts of the metagenome remained
relatively the same. As one bacterial species died, another arose to
take its place, ensuring that some of the most crucial recipes are
never completely lost.
Next Steps
Sequencing the metagenome is only the first step in understanding
how humans vary. Projects like the Human Microbiome Project
provide a stepping stone for future research into the consequences
of such variation.
“It is becoming very clear that sick people have a different microbiome
than healthy people,” explains Goodman. Despite this, he
finds that one of the major obstacles of microbiome research is
the fact that the microbiome is so susceptible to other factors. “It’s
shaped by what foods you eat, it’s shaped by your own genome,
what your parents or outside forces gave you,” says Goodman.
Often, it proves too difficult to separate out the environment from
the disease.
One solution Goodman proposes is the use of
germ-free mice. “If you can take the microbiome of a person and
transplant it to a germ-free mouse…you’ve controlled for a lot of
these questions about background variation.”
The concept of germ-free mice is relatively new in the research
community and requires specialized facilities. In Goodman’s lab,
mice are raised in a truly alien environment, one completely devoid
of any form of bacterial life. Everything they come in contact with,
from air, water, and food, is purified. All these added precautions
ensure that these mice live completely without a microbiome, and it
shows. “They are by no means healthy,” says Goodman. “While they
survive in the lab, it is unlikely that they would survive in the wild.”
However, Goodman believes that germ-free mice provide a “clean
slate” for researchers. By transferring what amounts to human fecal
matter into the mouse gut, researchers are able to create almost exact
replicas of diseased microbiomes. This allows researchers to study
disease on much larger scales. Instead of observing the microbiome
of a single obese human, researchers can instead observe the same
microbiome cultivated in hundreds of mice. Ultimately, researchers
like Goodman hope that the similarities these mice share with
their human donors will help revolutionize biomedical research.
FEATURE
HEALTH
The Politics of FDA Approval
BY JARED MILFRED
In October of 2010, the United States Food and Drug Administration
made a stunning admission of wrongdoing. Two years prior, the
FDA succumbed to outside pressure and cleared a knee implant that
never should have been approved. The “Menaflex” had already twice
been rejected by the FDA on grounds of causing increased risk of
injury and little to no benefit to patients. But despite vocal opposition
by medical experts, four congressmen from New Jersey convinced the
commissioner of the FDA to overrule his own agency’s scientists and
approve the Menaflex on its third approval attempt. The scientists insisted
the device should at least be tested for safety, but
the congressmen asserted safety trials were not
even necessary. Public campaign finance records
revealed that each of the New Jersey congressmen
received substantial campaign donations from
ReGen Biologics, the maker of the device. All
four claimed that their pressure on the FDA was
not influenced by the money ReGen gave them,
but this connection is difficult to deny.
Perfect objectivity may be an impossible goal;
even the commissioner of the FDA is not immune
to external pressure. But the Menaflex could never
have been wrongfully approved were it not for
one particular FDA program that allowed a single,
influenceable person to overrule the collective scientific
opinion of a federal agency and circumvent
essential safety review. This was allowed because of
the FDA’s perpetual pursuit of efficiency.
The Menaflex was green-lit through an FDA program called 510(k)
or Premarket Notification. Designed to increase the efficiency of the
approval process, 510(k) allows a medical device to forgo rigorous clinical
safety trials if it is deemed similar enough to an already-approved device.
In many cases, the 510(k) program has proven to be a substantial cost
saver. It has allowed thousands of rough equivalents to bypass expensive
safety trials. But to the 210 patients who received the erroneously
approved knee implant, the 510(k) program represents the FDA at its
worst: failing to protect American citizens under the guise of increased
efficiency.
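The decision rule at the heart of 510(k) can be caricatured in a few lines of code. This is a deliberately oversimplified sketch: the `Device` fields and the equality test are our own illustrative stand-ins, not the FDA's actual substantial-equivalence criteria.

```python
# Toy model of the 510(k) pathway described above: a new device may skip
# full clinical safety trials if it is deemed "substantially equivalent"
# to a predicate (already-approved) device.
from dataclasses import dataclass

@dataclass
class Device:
    name: str
    intended_use: str
    technology: str

def substantially_equivalent(new: Device, predicate: Device) -> bool:
    """Illustrative stand-in for the FDA's substantial-equivalence review."""
    return (new.intended_use == predicate.intended_use
            and new.technology == predicate.technology)

def review_pathway(new: Device, predicates: list[Device]) -> str:
    # If any predicate matches, the device is cleared under 510(k);
    # otherwise it must go through full premarket approval (PMA).
    if any(substantially_equivalent(new, p) for p in predicates):
        return "510(k): cleared without new clinical safety trials"
    return "PMA: full premarket approval with clinical trials required"
```

The Menaflex case illustrates what this sketch hides: "similar enough" is a human judgment, and the article's point is that a single influenceable official could make that call against the agency's own scientists.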
Admittedly, for the Menaflex, unethical congressional pressure could
arguably share blame with the FDA’s quest for efficiency. But a 2011
study by Dr. Diana Zuckerman, a former Yale professor and the current
president of the National Research Center for Women & Families,
showed that wrongful approvals stemming from 510(k) may be more the
rule than the exception. Zuckerman found that of all medical devices
ever recalled by the FDA because they could seriously harm patients or
result in death, more than two-thirds had been approved through the
510(k) program.
Though Congress is currently reevaluating the merits of 510(k), the
Menaflex controversy raises important questions about the cost of
efficiency itself. Does a more efficient Food and Drug Administration
actually yield better results for consumers? Many say yes. If products are
approved more quickly, they can be brought to market sooner and begin
helping patients earlier. Biomedical companies often use this argument
to justify their demand for faster approval times. Moreover, in light of
budget realities, some argue that it is prohibitively expensive to rigorously
test every new drug and medical device.
Of all medical devices ever recalled
by the FDA because they could
seriously harm patients or result in
death, more than two-thirds had
been allowed to forgo safety trials in
the name of efficiency. Courtesy of
RRY Publications.
But a loud and clear opposition disagrees. They claim that a more
efficient FDA does not yield better results for the consumer. For an
agency that regulates over 25 percent of all consumer spending in the
United States and is tasked with ensuring the safety of every person who
purchases food or uses medicine, it is reasonable to allocate more than the
eight dollars per American that Congress budgeted for the agency in 2012.
A reasonable compromise might be to charge the companies who
seek FDA approval and use that money to pay for clinical trials. This
is precisely the approach taken by the FDA since 1992 for novel drugs
whose safety has never been tested. However,
the roughly $2 billion per year the FDA currently
receives from drug companies as “user fees” raises
the potential for conflicts of interest between
sound science and financial motives. Perhaps the
only way to prevent conflicts of interest while
ensuring the highest safety standards would be
to rigorously test every drug or medical device,
regardless of similarity to previously approved
products and, furthermore, for the federal government
to be the sole source of FDA funding. On
the other hand, such an approach is the antithesis
of efficiency and cost effectiveness.
Dr. Joseph Ross, Assistant Professor of General
Medicine at the Yale School of Medicine, is an
expert on federal medical policy and its impact
on quality of medical care. He recently published
a study in The New England Journal of Medicine
comparing the average approval time of the FDA to those of its peer
agencies, the European Medicines Agency and Health Canada. Despite
a long-standing industry claim that the FDA does not approve products
quickly enough (for lack of funding or otherwise), Dr. Ross found that
the FDA’s approval time is, on average, roughly a month and a half
faster than its peers. Speaking about the 2012 renewal of the FDA’s
authorizing legislation, Dr. Ross says, “As the new bill in Congress was
being discussed, what was coming from the industry side was ‘FDA is
still not fast enough.’ When I hear that, I think, knowing that faster is
probably sloppier, is speed really such a concern?” He continues, “For
me, the principal responsibility of the FDA is to make sure that only
safe medications are out there on the market. But there are still unsafe
drugs getting approved. If Europe were moving faster, but their safety
records were the same, maybe you could argue that the FDA should go
faster. But if the FDA is moving faster, what the industry is doing is
likely to push the FDA into making sloppy decisions.”
Many aspects of the FDA’s charter are updated only once every five
years, and just last July, the 2012 legislation was passed by Congress and
signed by President Obama. As such, a serious overhaul would likely have
to wait until 2017. But in light of recent findings, Congress may want
to do two things: (1) de-emphasize the pursuit of efficiency by either
tightening or discarding programs like 510(k) that only ostensibly increase
efficiency at the cost of safety and, (2) if ever the FDA is found slower
than its peers, reinforce the FDA overall by increasing the agency’s budget
rather than cutting corners and skipping steps of regulatory review. The
FDA may not be the hottest topic in Washington, but the regulation
of food and drugs, especially when it goes wrong, deserves the
utmost attention of American politics.
Tissue engineering: from stem cells to organs
Biodegradable scaffolds are used to “guide” cells into a proper shape
while they grow. Over time, the scaffold degrades and the cells take
a particular shape, in this case, a vascular graft. Courtesy of Sashka
Dimitrievska and Dr. Laura Niklason’s Lab
TECHNOLOGY
FEATURE
COPY, PASTE, PRINT... KIDNEY?
3D Printing: Organ supply closets?
FEATURE
HEALTH
ON THE ROAD TO SWEETNESS:
A CLEAR-CUT DESTINATION?
BY MARGARETTA MIDURA
The original tongue map depicting which areas of the tongue
sense the four primary tastes. Courtesy of James Beard Foundation.
ASTRONOMY
FEATURE
Curious?
NASA Rover Curiosity Cruises onto Mars
BY KATIE COLFORD
Conceptualized by a 27-year-old graduate student, named by a
12-year-old middle school student, and immortalized in a vibrant
Twitter feed, Curiosity marks the most ambitious mission ever
flown to Mars. Equipped with a gamut of scientific and analytical instruments,
arguably the most complex and innovative to date,
Curiosity has one main mission: to determine if microbial life could
ever have survived on the now dry and dusty planet.
Although Curiosity will be roaming Mars for the next year and
a half collecting and analyzing data, perhaps the most challenging
part of the mission was simply landing the rover safely. To put it
mildly, Curiosity is the largest rover ever sent to Mars: it weighs more
than 2,000 pounds, has the dimensions of a small car, and carries
an entire science laboratory inside of it. After more than half a year
hurtling through space, the massive rover needed more innovative
engineering than the usual airbags to cushion its landing on the red
planet. Although some called it far-fetched, the solution was “analyzed,
peer-reviewed, and tested the hell out of,” as Peter Theisinger,
project manager of Curiosity, described in a press teleconference.
As the spacecraft plummeted into the Martian atmosphere at about 21,000 kilometers
per hour, parachutes significantly slowed the vehicle in
what engineers called the final “seven minutes of terror” before a
descent stage detached from the rover, using steerable engines to
slow the rover even further. Finally, in the last few seconds before
landing, the crown jewel of the innovative landing strategy came
into play: a sky-crane system, connected to the rover by tethers,
carefully lowered Curiosity onto the planet’s surface. Upon a
Artistic representation of Curiosity on Mars. Courtesy of
NASA.
successful touchdown, the descent stage cut away from the rover,
and Curiosity surfed safely and triumphantly toward Gale Crater.
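For a rough sense of scale, the entry figures quoted above can be turned into a crude estimate; this is our own arithmetic on the article's 21,000 km/h and "seven minutes" numbers, and the real deceleration profile was far from uniform.

```python
# Rough arithmetic on the quoted entry figures (an estimate, not NASA data).
entry_speed_kmh = 21_000
entry_speed_ms = entry_speed_kmh * 1000 / 3600   # convert km/h to m/s
descent_duration_s = 7 * 60                      # "seven minutes of terror"

# Crude average deceleration, assuming essentially all of that speed is
# shed over the full descent (peak deceleration was far higher).
avg_decel = entry_speed_ms / descent_duration_s
print(f"Entry speed: {entry_speed_ms:.0f} m/s")
print(f"Average deceleration: {avg_decel:.1f} m/s^2 (~{avg_decel / 9.81:.1f} g)")
```

Even this averaged-out figure, nearly one and a half times Earth's gravity sustained for seven minutes, hints at why the landing demanded so much engineering.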
Beyond its dramatic landing, Curiosity’s
technology is just as impressive, boasting instruments that would
make any laboratory on Earth envious. In order to first collect
particles, Curiosity has the typical array of collection tools that
allow it to pick up rocks and scoop sand, with one exciting, distinctive
feature: a high-powered laser that vaporizes rocks. Called
the ChemCam, the laser can shoot its target from as far away as 23
feet and is equipped not only to let Curiosity collect the thin layers
of vaporized rock but also to identify the individual atoms of the
vapor. Including a gas chromatograph, a mass spectrometer, and
a tunable laser spectrometer, all of these instruments essentially
serve to break down the little bits of rock and sand into single
molecules. These data then let scientists run a wide range of tests to
Artistic representation of Curiosity’s sky-crane landing
system. Courtesy of NASA.
analyze what Martian conditions were once like. Other instruments,
such as an X-ray diffraction and fluorescence instrument, enable
scientists to look at the minerals in rocks and solids and determine
the bulk makeup of these particles. Furthermore, the Mars Hand
Lens Imager can take extreme close-up images of the Martian
surface, capturing details smaller than the width of a human hair.
While the rover carries the most complex and innovative instruments
to date, NASA is also pioneering an innovative
effort back on Earth: using social media to keep the public interested
and informed on Curiosity’s progress. As Curiosity journeyed
through space, its over 1 million Twitter followers and Facebook
fans were kept up-to-date on its progress. Upon landing on Mars,
almost 100,000 people re-tweeted by-the-minute accounts of the
rover’s progress, and an interplanetary broadcast of the song “Reach
for the Stars” by popular artist will.i.am stirred excitement in many
observers. Never before has NASA space technology been so accessible
to the general population.
Perhaps this is partially because NASA has never been so close to
being able to announce news of extra-terrestrial life. “If we found
life on Mars, even if it was very low forms of life…that would
certainly be very significant in terms of how we saw ourselves,”
says Professor Peter Parker, Director of Undergraduate Studies of
Physics at Yale University. “Psychologically, I think that is a very
important thing.”
As the winner of the Rover’s naming contest, 12-year-old Clara
Ma wrote, “Curiosity is an everlasting flame that burns in everyone’s
mind.” Indeed, with continued success, this mission could leave a
legacy that burns forever.
FEATURE
EDUCATION
The Story of Science at Yale, Part III:
Science at Yale on the Horizon
BY DENNIS WANG
In his inaugural address in 1993, newly minted President Richard
Levin pledged his support for the sciences at Yale and highlighted
their importance to the university.
“Today, the scientific capability of American universities is the envy of the
world. We neglect its support at our peril.”
–President Levin, October 1, 1993
Nearly two decades later, Levin boasts a long list of accomplishments
in the sciences and across the university, most notably through
updates in the scientific curriculum and investments in new facilities.
He has increased research opportunities for students through the
Perspectives on Science and Engineering Program for freshmen and
the Science, Technology and Research Scholars (STARS) Program for
minorities and women, and he has made a push to improve both recruitment
and retention of top science students. Levin has also worked to hire
professors preeminent in their fields and firmly establish Yale’s place
in the scientific community.
At the turn of the new millennium, Levin announced a $500 million
investment in science and engineering facilities, and an additional $500
million investment in facilities at the medical school.
“Yale is committed to remain on everyone’s short list of the best universities
in the world. In the 21st century, you must excel in science and engineering to
maintain that position.”
–President Levin, January 19, 2000
Levin planned for five science buildings in his 20-year commitment
to science and technology. Four of these original five buildings have
been built (Class of 1954 Environmental Science Center, Malone
Engineering Center, Class of 1954 Chemistry Research Building, and
Kroon Hall) in addition to two others that were not part of his original
plan (The Anlyan Center and Smilow Cancer Hospital).
Last summer, Kline Biology Tower was renovated and equipped with
the new Center for Science and Social Science Information as well
as a new café, now a popular hangout for science students. This past
summer, lecture halls in Sloane Physics Laboratory were renovated
and the Center for Engineering Innovation and Design (CEID) was
opened on Prospect Street, making science and engineering more
visible than ever with exhibitions of student work. Sterling Chemistry
Laboratory, Gibbs Laboratory, and Osborn Memorial Laboratories
have yet to be renovated.
In 2008, the proposed $500 million Undergraduate Science Center,
a plan for the vertical expansion of Sterling Chemistry Laboratory
to include a dining hall and a gym, was canceled due to the recession.
The project was replaced by a $50 million renovation of Kline
Chemistry Laboratory. The Undergraduate Science Teaching Center
is still on Yale’s wish list.
Levin announced on August 30, 2012 that he would step down at
the end of the academic year. In his email, he acknowledged that there
remains an unfinished agenda.
“Before us lie decisions about when to proceed with such projects as constructing
the Yale Biology Building, facilities for science teaching…”
–President Levin, August 30, 2012
Construction on the $250 million Yale Biology Building, the fifth
building, has been suspended. The “Giving to Yale” page is currently
soliciting a $100 million donation for the project.
Other projects have also met similar struggles. Science Park at
Yale, the old site of the Winchester Repeating Arms Company, was
supposed to become a hotspot for biotech companies and startups,
but after years of stagnation and several changes in ownership over
the last decade, it has become an industrial complex featuring a garage,
apartments, and offices.
In its most recent display of long-term commitment, however, Yale
acquired 136 acres of land in 2007 for its West Campus, an area devoted
exclusively to science. The new property, formerly owned by Bayer
HealthCare Pharmaceuticals, gives science at Yale the room it needs to
grow. West Campus is, however, located 7 miles away from downtown
New Haven, raising concerns about how the space can be integrated
with the rest of campus and be made useful to undergraduates. West
Campus includes more than half a million square feet of ready-made
laboratory space and only cost $100 million. The Yale Biology Building,
in comparison, will cost $250 million for just 286,000 square
feet of laboratory space. Already, many interdisciplinary institutes
including the Yale Center for Molecular Discovery, the Yale Center
for Genomic Analysis, the High Performance Computing Center, and
the West Campus Analytical Core, are located at West Campus. It is
perhaps the best example of Yale’s emphasis on value for its money,
and much of its potential remains untapped.
Yale University is a world leader in science education, but there is
always room for improvement. Thanks to the leadership and dedication
of President Levin and others like him, science at Yale is well on
its way to a brighter future.
“In the twenty-first century, no education will be complete without
a significant infusion of science and quantitative reasoning. The
curricular reforms now unfolding in Yale College were developed
expressly to meet the need for a scientifically literate citizenry,” Levin
recently noted.
In this spirit, a four-part series of 100-level biology courses was
introduced this year to provide science underclassmen with a “deeper
level of understanding,” in the words of Professor Michael Koelle.
Koelle, who teaches BIOL 101: “Biochemistry and Biophysics,”
believes that the courses will better prepare students for higher-level
classes and hopes that they will pave the way for better upper-level
electives. The supply of new offerings may have come just in time
to meet demand. The introductory biology courses were extremely
popular throughout shopping period, and more freshmen are also
declaring their interest in science majors. Koelle also added, “We are
definitely continuing to teach science courses for non-science majors!
Generating new high-quality offerings in this area is a priority at Yale
College.”
As we prepare for the transition in leadership, we can be confident
that if the next president of Yale University is anything like President
Levin, Yale will continue to improve its science programs, and will
always rise to meet the challenges of the future for Yale and for the
world.
BOOK REVIEW
FEATURE
Rating: ★★★★★
The Vision Revolution
BY GRACE CAO
The eyes are one of the most amazing parts of the human body.
Our eyes can perceive the ruby tones of a pomegranate, gaze up
to the peak of Mount Everest, and follow speeding cars quickly
enough to protect us as we cross the road. Though we do not always
fully appreciate these abilities, a
moment of thought reveals how
extraordinary they are. Dr. Mark
Changizi, Assistant Professor of
Cognitive Science at Rensselaer
Polytechnic Institute, goes a step
further than simply thinking our
eyes are remarkable. In his book,
The Vision Revolution: How
the Latest Research Overturns
Everything We Thought We Knew
About Human Vision, he compares
four different aspects of
our vision to four “superpowers,”
shifting the lens through which we
normally think about the human eye.
Changizi first discusses our
ability to understand people’s
emotions through changes in
their facial color, a superpower he
refers to as color telepathy. The
need for this skill is one reason
the author argues we have color
vision at all: to easily observe the
small changes in skin color, such
as blushing, that reflect shifts in
mood. While this claim is intriguing,
Changizi’s evidence is often
heavily anecdotal. For example, he
writes that human skin is uniquely
colorless, but explains this mainly
by citing the lack of a word to
describe the color “skin” in many
languages. A full-color insert of
figures and illustrations compensates
in part for this shortcoming,
however, and keeps the section
entertaining.
In addition to color telepathy, Changizi asserts that humans have a
form of X-ray vision. To see this ability in action, hold up your hand,
spread out your fingers, and try looking past it with one eye closed,
then the other. What you see is, obviously, much less than when both
eyes work together. According to Changizi, this is
what real X-ray vision is. Overall, this chapter is stronger than the
first, continuing to include engaging asides while also referencing
scientific studies to support the argument. Therefore, when the
author claims that the advantage of seeing through clutter was the
main driver of the evolution of our binocular vision, his argument
feels more substantial.
Mark Changizi presents his theories about why our
vision works the way it does. Courtesy of tradebit.com.
The third chapter of the book deals with future-seeing, a fascinating
explanation of why optical illusions work. True to the opening
paragraph, illustrations of optical illusions are scattered throughout
the chapter, maintaining the reader’s
interest and demonstrating an effect
of future-seeing. Changizi explains
that our brains are constantly creating
a perception of the future simply
to keep up with the present moment,
because of the time it takes to process
visual input. To tie this theory back to
the entertaining illusions, he suggests
that they occur because we think the
images are dynamic. As a result, our
brains attempt to construct a future
appearance that clashes with the static
nature of the image. This section is
perhaps the most successful in balancing
entertainment with scientific thinking,
because of the natural pairing of
the optical illusions and future-seeing.
Spirit-reading, our last superpower,
centers on an aspect of vision that is
uniquely human: the ability to read.
Many of us have struggled to perfect
our nearly illegible handwriting, but
perhaps fewer have wondered why
letters look the way they do. Changizi
suggests that letters model shapes
found in nature in a way that makes
them easy for our eyes to recognize.
He found a strong correlation between
the frequency of particular junctions
in the natural world and the frequency
of use for letters with similar-shaped
junctions. For example, corners are
easy to find, and so are Ls; on the other
hand, crosses occur rarely, and so
does the letter X. In this chapter, one of the book’s most
technical, the disconnect between the scientific content and the
slightly artificial superpower terminology can be jarring, but the
strength of the researched ideas makes it worth reading.
Overall, though the superpower theme can occasionally feel
gimmicky, Changizi does an excellent job of writing about vision
in an accessible way. The casual tone, frequent anecdotes, and
informative illustrations make potentially complicated ideas easy
for anyone to handle. Though the more scientific reader may be
frustrated by the basic level from which he approaches his subject,
his intriguing theories nonetheless give us a fresh perspective on
how we see the world.
FEATURE ZOOLOGY
Laughter across the Animal Kingdom,
from Rats to Humans
BY STELLA CAO
Have you ever heard a rat laugh? Jaak Panksepp has, and he finds
nothing unusual about it. Panksepp, Professor Emeritus of Psychology
at Bowling Green State University, tickles rats in his lab to elucidate
the fundamentals of laughter.
Scientists have long known that humans are not the only species
capable of laughing. In fact, most mammals, from chimpanzees to
dogs, can laugh as well. Since the ability to laugh, whether at a good
joke, from tickling, or from some other source, is shared among so
many different species, some believe there must be a reason for it.
Given its prevalence and importance
in social interactions for all of these species, scientists seek to learn
more about the origins and purpose of laughter.
Panksepp is at the forefront of such research, and his work on rat
laughter has led to some interesting and unexpected observations.
First, Panksepp clarifies that rat laughter is slightly different from
that of humans. Rat laughter comes in the form of high-frequency
50-kilohertz ultrasonic calls, or “chirps,” that are distinct from other
vocal emissions in rats. In other words, one cannot hear rat laughter
unaided; the laughs are high-pitched chirps that must be measured
using sensitive, specialized equipment.
Rats laugh when tickled in sensitive areas, such as the nape of
their neck. Courtesy of BBC.
In addition to differences in frequency, rats also laugh in different
situations than most humans do. While rats laugh when tickled in
sensitive areas such as the nape of their neck, young rats also laugh
when they anticipate rewards or enter new environments. Rats also
laugh when they are nervous and when trying to defuse aggressive
situations. These observations have led Panksepp to hypothesize that
by laughing, rats display emotional health and engage in social bonding
with fellow rats. Therefore, rats that laugh more frequently might
have a higher social standing within a group because they attract other
rats, somewhat like the class clown in elementary school.
According to Panksepp, laughter among human children during
boisterous play is actually quite similar to that of young rats
tumbling together.
The main difference in humans, he notes, is that humans activate
“higher order structures” like the frontal cortex when laughing at jokes,
leading to laughter in response to multiple kinds of stimuli.
Panksepp studies laughter by tickling rats. Courtesy of Duke
University.
On the other hand, adult rats do not necessarily have the cognitive mechanisms
to understand verbal jokes and sarcasm. “The use of language-based
jokes is clearly unique to humans,” says Robin Dunbar, a professor
of evolutionary psychology at the University of Oxford. Dunbar
also claims “laughter predates the appearance of language in human
evolution and was used as a mechanism to allow bonding between a
large number of individuals.”
Laughter in humans releases endorphins, which produce a feeling
of well-being in the brain. Releasing endorphins facilitates bonding
among individuals in a group, which is beneficial in the hyper-social
societies humans live in. Shared laughter likely helps people
bond and form closer connections. Beyond this, however, behavioral
neuroscience has yet to clearly explain how these tiny chemical
changes add up to make something seem funny to us — or to rats.
ALUMNI PROFILE FEATURE
Jennifer Staple-Clark, B.A. ’03
BY PAYAL MARATHE
In September 2000, Jennifer Staple was a sophomore in Timothy
Dwight College interested in starting a new club. Twelve years later,
this project has grown into the global non-profit Unite for Sight, an
internationally visible organization bringing eye care to communities
in India, Ghana, and Honduras.
Staple-Clark’s original inspiration for the organization came during
the summer of 2000. A biology and anthropology double major at
Yale, she experienced her first introduction to patient care while
working that summer at an optometrist’s office. She heard stories
from patients who came in with glaucoma, which, if untreated, allows
pressure to build up in the eye, damaging the optic nerve and causing
irreversible blindness.
“These people had health insurance and went to other doctors but
just not eye doctors. They hadn’t noticed visual deterioration until it
was too late,” Staple-Clark says. She also noticed that a lot of New
Haven’s population did not have health insurance.
Volunteers work at local clinics, providing surgeries, preventive
care, and education. Courtesy of Unite for Sight.
And so the first chapter of Unite for Sight was born. It started as a
group of 35 volunteers who made trips to the soup kitchen and New
Haven Public Library to spread knowledge about existing resources.
By the time of her college graduation three years later, Staple-Clark
decided to branch out to other university campuses. There are now
50 chapters of Unite for Sight in North America. The organization
expanded even further in 2004 when it launched its global health
delivery program in Ghana.
“I originally planned for it to be a student organization that would
work to eliminate patient barriers to eye care in New Haven. I did
not anticipate that it would become a worldwide health organization,”
Staple-Clark says.
Staple-Clark saw her work in college as a stepping stone for a
larger role in the realm of global eye care and health education. She
explains that about 80 percent of blindness is preventable or curable
by simple surgeries or care. Cataracts, for example, are a significant
source of blindness in developing nations, but the treatment is only a
15-minute outpatient surgery. Still, some communities cannot access
or afford this care.
“A lot of governments focus on HIV or malaria, known as killer
diseases, and don’t recognize eye care as a critical issue,” Staple-Clark
says. But she argues that the impact of blindness on other measures
of quality of life is often underestimated. In developing
countries, for instance, children often become caretakers for blind
adults, which keeps the children out of school and thus contributes
to the cycle of poverty.
Staple-Clark examines a patient in India. Since 2000, Unite for
Sight has provided eye-care services to 1.3 million people in
India, Ghana, and Honduras. Courtesy of Stanford University.
With this in mind, Staple-Clark extended the global health delivery
program to India and Honduras. An annual Global Health and
Innovation Conference brings professionals from different medical
disciplines together to cross-strategize.
Additionally, Staple-Clark emphasizes the importance of research.
A core of volunteers within Unite for Sight has been studying the
“barriers to care that are impacting communities,” such as poverty
and a mother’s perceptions of eye care for her children. The outreach
program sends local optometrists into villages to dissipate eye care
myths such as “putting urine or breast milk in the eye” as a cure for
conditions.
She finds her work with Unite for Sight incredibly fulfilling. “I work
with remarkable local eye doctors who are collectively providing quality
care each year to more than 200,000 patients living in poverty. They
are incredibly committed and dedicated to improving health outcomes
in their countries, and I can think of nothing more rewarding than
working with them to improve lives,” she says.
Looking back, Staple-Clark reminisces about how she “absolutely adored”
her time at Yale and shares some advice. “It’s so important for students
to follow their passions, not just what they are interested in but what
they could become interested in,” she says, offering her decision to
double major as an example. She only became curious about anthropology
after shopping a class on a whim. “I found the combination to be
terrific because I was able to explore cultural and medical anthropology
alongside scientific aspects of biology,” she says.
Staple-Clark’s work has not gone unnoticed on the national stage.
In 2009, she won the National Jefferson Award, often regarded as the
Nobel Prize of public service. She was the 2011 recipient of the John F.
Kennedy New Frontier Award for her health advocacy and activism.
She is also a member of the President’s Council on International
Activities at Yale.
Staple-Clark plans to continue running Unite for Sight for years
to come. She reflects, “We’ve currently reached about 1.4 million
people, but there’s a constant need in so many different locations to
bring people care. Every two years we try to add a new clinic partner,
and we’re always working to enhance our existing programs. I have
such a passion for the work we do.”
CARTOON
FEATURE
A Different Political Campaign
Michele Dufault, Yale
College Class of 2011
(Saybrook), died in a tragic
accident on April 12, 2011.
Michele was a Physics and
Astronomy major, a strong
supporter of other women in
science, and a leader among
leaders. Michele’s senior thesis
project included the development
of novel detectors for dark
matter particles. Michele was
planning to continue work in
ocean sciences at the University
of Washington following her
graduation from Yale.
Michele was passionate about
science. Her infectious enthusiasm,
curiosity, generosity, and
energy touched all those who
knew her. In honor of Michele’s
tremendous contributions to
Yale’s undergraduate science
community, we have established
the Michele Dufault Summer
Research Fellowship and Conference
Fund.
While it is our ultimate goal to
raise $100,000 to endow this
fund in perpetuity, in the event
we are not able to realize that
goal, we will instead create an
expendable fund that will support
activities in her name until
the funds have been expended
(not less than 10 years).
This fund will support:
• A summer fellowship for a
Yale undergraduate woman
in the physical sciences, especially
Physics, Astronomy,
or Geology & Geophysics.
• Conferences that encourage
young women to pursue the
physical sciences, such as
the Conference on Undergraduate
Women in Physics
(held at Yale three of the
past four years).
If you wish to contribute to the
fund, please write a check payable
to Yale University, note on
the check that it is for the “Michele
Dufault Summer Fellowship
and Conference Fund” and mail
it c/o the Physics Department,
PO Box 208120, New Haven CT
06520-8120. We will transfer your
donation to the Development
Office (stewards of the fund)
promptly.
You may donate online through
the Yale Development Office website
at http://giving.yale.edu. Please
select “new gift,” “other,”
when asked which area at Yale
you would like to support, and
select “Michele Dufault Fund”
when prompted.