YSM Issue 88.4


Yale Scientific

Established in 1894

THE NATION’S OLDEST COLLEGE SCIENCE PUBLICATION

NOVEMBER 2015 VOL. 88 NO. 4

Ancient Ink

MODERN SCRIPTS

Computers master medieval texts




Yale Scientific
Established 1894

CONTENTS
NOVEMBER 2015 VOL. 88 ISSUE NO. 4

NEWS
5 Letter From the Editor
6 Banned Drug Repurposed for Diabetes
6 ArcLight Illuminates Neurons
7 New Organ Preservation Unit
7 Surprising Soft Spot in the Lithosphere
8 An Unexpected Defense Against Cancer
9 New Device for the Visually Impaired
10 Energy Lessons from Hunter-Gatherers
11 The Protein Exterminators
12 Mosquitoes Resistant to Malaria

FEATURES
25 Environment: Climate Change Spikes Volcanic Activity
26 Cell Biology: How Radioactive Elements Enter a Cell

13 Black Hole with a Growth Problem
A supermassive black hole challenges the foundations of astrophysics, forcing astronomers to update the rule book of galaxy formation.
IMAGE COURTESY OF MEG URRY

22 ON THE COVER: Ancient Ink, Modern Scripts
A new algorithm allows computer scientists to unlock the secrets of medieval manuscripts. From pen to pixel, researchers are using science to better understand historical texts.
ART BY CHRISTINA ZHANG

28 Robotics: Robots with Electronic Skin
30 Computer Science: Predicting Psychosis
31 Debunking Science: San Andreas
32 Technology: The Future of Electronics
34 Engineering: Who Lives on a Dry Surface Under the Sea?
35 Science or Science Fiction? Telepathy and Mind Control

16 Tiny Proteins with Big Functions
Contrary to common scientific belief, proteins need not be large to have powerful biological functions.

18 East Meets West in Cancer Treatment
A Yale professor brings an ancient remedy to the forefront, showing that traditional herbs can combat cancer.

20 Nature’s Blueprint
Scientists learn lessons from nature’s greenery, modeling the next generation of solar technology on plant cells.
ART BY CHANTHIA MA

36 Grey Meyer MC ’16
37 Michele Swanson YC ’82
38 Q&As: Do You Eat with Your Ears? / How Do Organisms Glow in the Dark?





SCIENCE IN THE SPOTLIGHT
book reviews

HOW TO CLONE A MAMMOTH Captivates Scientists and Non-scientists Alike

BY ALEC RODRIGUEZ

Science fiction novels, TV shows, and movies have time and time

again toyed with the cloning of ancient animals. But just how close

are we to bringing these species, and our childhood fantasies, back

to life?

While animals were first cloned about 20 years ago, modern

technology has only recently made repopulating some areas of the

world with extinct species seem feasible. In her book, How to Clone a

Mammoth, evolutionary biologist Beth Shapiro attempts to separate

facts from fiction on the future of these creatures. Her research

includes work with ancient DNA, which holds the key to recreating

lost species. The book, a sort of how-to guide to cloning these

animals, takes us step-by-step through the process of de-extinction.

It is written to engage scientific and non-scientific audiences alike,

complete with fascinating stories and clear explanations.

To Shapiro, de-extinction is not only marked by birth of a cloned

or genetically modified animal, but also by the animal’s successful

integration into a suitable habitat. She envisions that researchers

could clone an extinct animal by inserting its genes into the genome

of a related species. Along these lines, Shapiro provides thought-provoking

insights and anecdotes related to the process of genetically

engineering mammoth characteristics into Asian elephants. She

argues that the genetic engineering and reintroduction of hybrid

animals into suitable habitats constitutes effective “clonings” of

extinct species.

While some sections of the book are a bit heavy on anecdotes, most are engaging, amusing, and relevant enough to the overall chapter themes to keep the book going. Shapiro includes personal tales ranging from asking her students which species they would de-extinct to her struggle trying to extract DNA from amber. The discussion of each core topic feels sufficient, with a wealth of examples. Shapiro tosses in some comments on current ecological issues here and there, and for good measure, she busts myths like the idea that species can be cloned from DNA “preserved” in amber. Sorry, Jurassic Park.

The book is a quick, easy read — only about 200 pages — that would be of interest to any biology-inclined individual and accessible even to the biology neophyte. Shapiro summarizes technological processes simply and with graphics for visual learners. Most of all, Shapiro’s book leaves the reader optimistic for the future of Pleistocene Park — a habitat suitable for the reintroduction of mammoths.

Our childhood fantasies, when backed with genetic engineering, could be just around the corner.

IMAGE COURTESY OF PRINCETON UNIV. PRESS

A DISGRACE TO THE PROFESSION Attacks the Man Instead of the Science

BY AMY HO

Mark Steyn’s recent A Disgrace to the Profession attacks Michael E. Mann’s hockey stick graph of global warming — a reconstruction of Earth’s temperatures over the past millennium that depicts a sharp uptick over the past 150 years. It is less a book than a collection of quotes from respected and accredited researchers, all disparaging Mann as a scientist and, often, as a person.

Steyn’s main argument is that Mann did a great disservice to science when he used flawed data to create a graph that “proved” his argument about Earth’s rising temperatures. Steyn does not deny climate change, nor does he deny its anthropogenic causes. His issue, as he puts it, is with the shaft of the hockey stick, not the blade. His outrage lies not only in the use of poor data, but in Mann’s deletion of data, ignoring major historical climate shifts such as the Little Ice Age and the Medieval Warm Period.

IMAGE COURTESY OF AMAZON

To Steyn, the Intergovernmental Panel on Climate Change (IPCC) and all those who supported the hockey stick graph also did a disservice to science by politicizing climate change to the extent that it gives validity to deniers. However, Steyn may be giving these doubters yet more ammo, because he has done nothing to de-politicize the issue. Steyn claims that Mann has drawn his battle lines wrong — but then, so has Steyn, by attacking Mann instead of focusing on the false science.

Steyn’s writing style is broadly appealing, but his humor underestimates his audience. His colloquial tone could be seen as a satirical take on what Steyn refers to as Mann’s “cartoon climatology,” but it eventually subverts his argument by driving the same points over and over while never fully delving into scientific details. Although Steyn champions a nuanced view of climate science, his own nuance goes only so far as to tell his readers that they should be less certain, because meteorology and climate science are uncertain.

“The only constant about climate is change,” Steyn points out, advocating for us to better understand climate and to adapt to changes as they come. It is an important point that deserves more attention than it gets in the book. A Disgrace to the Profession is an entertaining read that sounds like a blogger’s rant. Steyn makes few points that are especially compelling, but then insists on hammering them in.



FROM THE EDITOR

Here at the Yale Scientific Magazine, we write about science because it

inspires us. Some of the biggest responsibilities in science fall to our smallest

molecules. Minuscule proteins called ubiquitin ligases are tasked with identifying

and attacking deviant cancer cells (pg. 11). Such power can be dangerous. The

simplest proteins known to exist are capable of spinning cell growth out of

control to cause tumors (pg. 16) — dangerous, yes, but still impressive.

And the researchers we interview are inspiring, in their creative approaches

to answering questions and in their dedication to making a real-world impact.

Want to know how human metabolism has changed with the modernization of

society? Find people who continue to live as hunter-gatherers for comparison

(pg. 10). Intrigued by the level of detail in medieval manuscripts? In our cover

story, scientists take on the vast medieval corpus with an innovative and efficient

computer algorithm (pg. 22). Others are extending the reach of their research

far beyond laboratory walls. A project for a Yale engineering class turned into a

new device that better preserves human organs for transplant, which became the

company Revai (pg. 7). A collaboration between a mechanical engineer in New

Haven and a theater company in London has culminated in exciting technology

that allows the visually impaired to experience their surroundings (pg. 9).

For this issue of our publication, we also asked: What inspires these scientists?

Their research questions can stem from a single curiosity in the realm of biology

or chemistry or physics. Often, they’re motivated to improve some aspect of the

world, whether it’s human health or the environment. Scientists design solutions

to achieve these improvements. For ideas, they turn to history: An ancient

Chinese herbal remedy has resurfaced as a powerful 21st century drug (pg. 18).

Or, they look to nature: Solar panels might be more effective if they were modeled

after plant cells — after all, the basic operation of both solar cells and plant cells

is to convert sunlight into useable energy (pg. 20). Even everyday electronics can

be inspired by nature — particularly, by the inherent ability of certain materials

to self-assemble (pg. 32).

Between these covers, we’ve written about a diversity of topics in science,

bringing you stories from the lab, from the field, and from the far corners of

the universe. Whether you’re fascinated by the cosmos, natural disasters, or

advanced robots, we hope you’ll see inspiration in this issue of the Yale Scientific.


Payal Marathe
Editor-in-Chief

ABOUT THE ART

The cover of this issue, designed by arts editor

Christina Zhang, features the algorithm that

identifies the number of colors on digitized

medieval manuscripts. The art depicts the

process of categorizing pixels using a binary code.

Developed by Yale computer science and graphics

professor Holly Rushmeier, this technology could

help researchers decrypt medieval texts.
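For readers curious what categorizing pixels can look like in practice, the sketch below is a purely illustrative stand-in, not Professor Rushmeier's actual algorithm: it quantizes the RGB value of each pixel on a digitized page into coarse bins and counts the color groups that cover a meaningful share of the image. The function name, bin settings, and toy pixel data are invented for the example.

```python
# Illustrative sketch only -- not the algorithm developed by Holly Rushmeier.
# It estimates how many distinct ink/parchment colors appear on a digitized
# page by quantizing each pixel's RGB value into coarse bins and counting the
# bins that hold a meaningful share of the pixels.

from collections import Counter

def count_dominant_colors(pixels, levels=4, min_share=0.01):
    """pixels: iterable of (r, g, b) tuples with 0-255 channels (toy input)."""
    step = 256 // levels
    bins = Counter((r // step, g // step, b // step) for r, g, b in pixels)
    total = sum(bins.values())
    return sum(1 for count in bins.values() if count / total >= min_share)

# Toy "page": parchment background, dark ink strokes, and a red initial.
page = [(205, 185, 150)] * 900 + [(40, 30, 25)] * 80 + [(150, 30, 35)] * 20
print(count_dominant_colors(page))  # -> 3 color groups on this toy page
```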

Editor-in-Chief

Managing Editors

News Editor

Features Editor

Articles Editor

Online Editors

Copy Editors

Special Editions Editor


MAGAZINE

Established in 1894

NOVEMBER 2015 VOL. 88 NO. 4

Production Managers

Layout Editors

Arts Editor

Photography Editors

Webmaster

Publisher

Operations Manager

Advertising Manager

Subscriptions Manager

Alumni Outreach Coordinator

Synapse Director

Coordinator of Contest Outreach

Science on Saturdays Coordinator

Volunteer Coordinator

Staff

Aydin Akyol

Alex Allen

Caroline Ayinon

Kevin Biju

Rosario Castañeda

Jonathan Galka

Ellie Handler

Emma Healy

Amy Ho

Newlyn Joseph

Advisory Board

Kurt Zilm, Chair

Priyamvada Natarajan

Fred Volkmar

Stanley Eisenstat

James Duncan

Stephen Stearns

Jakub Szefer

Werner Wolf

John Wettlaufer

William Summers

Scott Strobel

Robert Bazell

Ayaska Fernando

Ivan Galea

Hannah Kazis-Taylor

Danya Levy

Chanthia Ma

Cheryl Mai

Raul Monraz

Ashlyn Oakes

Archie Rajagopalan

Alec Rodriguez

Jessica Schmerler

Zach Smithline

Payal Marathe

Adam Pissaris

Nicole Tsai

Christina de Fontnouvelle

Theresa Steinmeyer

Kevin Wang

Grace Cao

Jacob Marks

Zachary Gardner

Genevieve Sertic

Julia Rothchild

Allison Cheung

Jenna DiRito

Aviva Abusch

Sofia Braunstein

Amanda Mei

Suryabrata Dutta

Christina Zhang

Katherine Lin

Stephen Le Breton

Peter Wang

Jason Young

Lionel Jin

Sonia Wang

Amanda Buckingham

Patrick Demkowicz

Kevin Hwang

Ruiyi Gao

Sarah Ludwin-Peery

Milana Bochkur Dratver

Aaron Tannenbaum

Kendrick Umstattd

Anson Wang

Julia Wei

Isabel Wolfe

Suzanne Xu

Holly Zhou

Chemistry

Astronomy

Child Study Center

Computer Science

Diagnostic Radiology

Ecology & Evolutionary Biology

Electrical Engineering

Emeritus

Geology & Geophysics

History of Science, Medicine & Public Health

Molecular Biophysics & Biochemistry

Molecular, Cellular & Developmental Biology

Undergraduate Admissions

Yale Science & Engineering Association

The Yale Scientific Magazine (YSM) is published four times a year by

Yale Scientific Publications, Inc. Third class postage paid in New Haven,

CT 06520. Non-profit postage permit number 01106 paid for May 19,

1927 under the act of August 1912. ISSN 0091-287. We reserve the right

to edit any submissions, solicited or unsolicited, for publication. This

magazine is published by Yale College students, and Yale University

is not responsible for its contents. Perspectives expressed by authors

do not necessarily reflect the opinions of YSM. We retain the right to

reprint contributions, both text and graphics, in future issues as well

as a non-exclusive right to reproduce these in electronic form. The

YSM welcomes comments and feedback. Letters to the editor should

be under 200 words and should include the author’s name and contact

information. We reserve the right to edit letters before publication.

Please send questions and comments to ysm@yale.edu.


NEWS

in brief

Banned Drug Repurposed for Diabetes

By Cheryl Mai

PHOTO BY CHERYL MAI

Rachel Perry, lead author of this

study, is a postdoctoral fellow in the

Shulman lab.

The molecule behind a weight-loss pill

banned in 1938 is making a comeback.

Professor Gerald Shulman and his research

team have made strides to reintroduce 2,4-dinitrophenol (DNP), once a toxic weight-loss

molecule, as a potential new treatment for type

2 diabetes.

Patients with type 2 diabetes are insulin

resistant, which means they continue to

produce insulin naturally, but their cells

cannot respond to it. Previous research by the

Shulman group revealed that fat accumulation

in liver cells can induce insulin resistance, nonalcoholic

fatty liver disease, and ultimately

diabetes. Shulman’s team identified DNP, which

reduces liver fat content, as a possible fix.

DNP was banned because it was causing

deadly spikes in body temperature due to

mitochondrial uncoupling. This means the

energy in glucose, usually harnessed to produce

ATP, is released as heat instead. Shulman’s

recent study offers a new solution to this old

problem: CRMP, a controlled-release system that limits DNP’s harmful effects on the body.

CRMP is an orally administered bead of

DNP coated with polymers that promote the

slow-release of DNP. When the pace of DNP

release is well regulated, overheating is much

less likely to occur. Thus, patients could benefit

from the active ingredients in the drug without

suffering potentially fatal side effects.
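The benefit of a slow-release coating can be illustrated with a standard one-compartment absorption model: the slower the release rate, the lower the peak exposure for the same dose. The rate constants below are made up for illustration and are not values from the Shulman study.

```python
# Toy pharmacokinetic comparison (rate constants are illustrative, not from
# the study): immediate release of DNP versus a slow-release CRMP-style bead.

import math

def plasma_level(t_hours, dose=1.0, k_release=10.0, k_elim=0.5):
    """Amount in plasma for first-order release followed by first-order
    elimination (the classic Bateman equation)."""
    if abs(k_release - k_elim) < 1e-9:
        return dose * k_release * t_hours * math.exp(-k_elim * t_hours)
    return (dose * k_release / (k_release - k_elim) *
            (math.exp(-k_elim * t_hours) - math.exp(-k_release * t_hours)))

for label, k_rel in [("immediate release", 10.0), ("slow release (CRMP-like)", 0.3)]:
    peak = max(plasma_level(t / 10, k_release=k_rel) for t in range(480))
    print(f"{label}: peak level ~{peak:.2f} (arbitrary units)")
```

Lower peaks are the point of the coating: the same amount of drug arrives, but spread out over time instead of all at once.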

So far, findings have been promising: no

toxic effects have been observed in rats with

doses up to 100 times greater than the lethal

dose of DNP.

“When giving CRMP, you can’t even pick up

a change in temperature,” Shulman said.

Results also include a decrease in liver fat

content by 65 percent in rats and a reversal of

insulin resistance. These factors could be the

key to treating diabetes.

“Given that a third of Americans are projected

to be diabetic by 2050, we are greatly in need of

agents such as this to reverse diabetes and its

downstream sequelae,” said Rachel Perry, lead

author of the study.

ArcLight Illuminates Neuronal Networks

By Archie Rajagopalan

IMAGE COURTESY OF PIXABAY

With ArcLight, real-time imaging

of neuronal networks could lead to a

major breakthrough in understanding

the brain’s many components.

Scientists have engineered a protein that will

more accurately monitor neuron firing. The

protein, called ArcLight, serves as a fluorescent

tag for genes and measures voltage changes in

real time, offering new insight on how nerve

cells operate and communicate.

Neuron firing involves the rapid influx of

calcium ions from outside of the neuron’s

membrane. Proteins that illuminate in the

presence of increased calcium levels can

therefore track the completion of an action

potential. For this reason, calcium sensors are

commonly used as a proxy for action potential

measurements. However, because calcium

changes occur more slowly than voltage

changes, calcium sensors do not provide

precise measurements of neuron signaling.
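The temporal mismatch is easy to see in a toy model: a fast, voltage-like fluorescence transient stays narrow, while a calcium-like transient smears out over hundreds of milliseconds. The time constants below are invented for illustration and are not measurements from the study.

```python
# Toy kinetics illustrating why calcium is a blurred proxy for voltage.
# Time constants are made up for the example.

import math

def transient(t_ms, rise_ms, decay_ms):
    """Difference-of-exponentials response to a spike at t = 0."""
    return math.exp(-t_ms / decay_ms) - math.exp(-t_ms / rise_ms)

def width_at_half_max(rise_ms, decay_ms):
    samples = [transient(t / 10, rise_ms, decay_ms) for t in range(20000)]
    peak = max(samples)
    return sum(1 for s in samples if s >= peak / 2) / 10.0  # milliseconds

print(f"voltage-like signal width: ~{width_at_half_max(0.5, 2.0):.1f} ms")
print(f"calcium-like signal width: ~{width_at_half_max(5.0, 300.0):.1f} ms")
```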

In a recent study by Yale postdoctoral fellow

Douglas Storace, ArcLight was shown to be

a more efficient candidate for this job. In the

experiment, either ArcLight or a traditional

calcium-based probe was injected into the

olfactory bulb of a mouse. Simultaneously,

using an epifluorescence microscope, Storace

observed changes in fluorescence triggered

by the mouse sniffing an odorant. Because

ArcLight reports rapid changes in the electrical

activity of neurons, Storace and his colleagues

were able to obtain more direct measurements

of neuron firing with ArcLight compared to

ordinary calcium sensors.

In addition to monitoring voltage changes

directly, ArcLight is genetically encoded and

can be targeted to specific populations of cells.

This allows scientists to monitor the electrical

activity of different cell types and may provide

more information on how different neuronal

pathways interact.

“A more accurate way of monitoring

the voltage in neurons gives us a lot more

information about their activity,” Storace said.

“Potentially, this discovery will give us enough

information about neurons to lead to a major

breakthrough.”




New Startup Develops Organ Preservation Unit

By Newlyn Joseph

An organ transplant comes with a slew of

complications. Perhaps the most commonly

overlooked problem is the preservation of

donor tissue prior to transplant. Current means

of storing intestines before they are transplanted

involve a simple container filled with ice. Until

now, there has been little progress in developing

more effective, efficient preservation strategies.

The nascent company Revai, the result of

a collaboration between the Yale Schools of

Engineering, Medicine, and Management,

addresses the challenge of preserving intestines

for transplant. Company leaders John Geibel,

Joseph Zinter, and Jesse Rich have developed

the Intestinal Preservation Unit, a device

that perfuses the intestine’s lumen and blood

supply simultaneously and independently, at a

rate determined by the surgeon. This “smart”

device collects real-time data on temperature,

perfusion time, and pump flow rates, allowing

doctors to monitor all vital storage parameters.

The technology has the potential to extend

the lifetime of intestines in between the organ

donor and the transplant recipient.

“It’s the first time we have something new for

this particular organ,” Geibel said.

Revai has demonstrated that the preservation

unit decreases the rate of necrosis, or massive

cell death, in pig intestinal tissue. This exciting

result held up when the unit was tested on

human samples through partnerships with

New England organ banks.

“We’re the only team currently presenting

peer-reviewed data on testing with human

tissue,” said CEO Jesse Rich, proud that Revai is

a frontrunner in this area of exploration.

Students in a Yale class called Medical Device

Design and Innovation built the first functional

prototype of the Intestinal Preservation Unit

for testing. The device went on to win the

2014 BMEStart competition sponsored by the

National Collegiate Inventors and Innovators

Alliance. Revai plans to continue product

development and testing for the unit, and

will seek FDA approval to commercialize the

device.

PHOTO BY HOLLY ZHOU

Joseph Zinter and Jesse Rich look at

a model of their Intestinal Preservation

Unit.

Geologists Find Surprising Softness in Lithosphere

By Danya Levy

As a student 40 years ago, Shun-ichiro

Karato learned of the physical principles

governing grain boundaries in rocks, or

the defects that occur within mineral structures.

Now, as a Yale professor, he has applied

these same concepts to a baffling geophysical

puzzle. Karato has developed a new

model to explain an unexpected decrease in

the stiffness of the lithosphere.

Earth’s outer layers of rock include the

hard lithosphere — which scientists previously

assumed to be stiff — and the softer

asthenosphere. Seismological measurements

performed across North America

over the past several years have yielded a

surprising result.

“You should expect that the velocities [of

seismological waves] would be high in the

lithosphere and low in the asthenosphere,”

Karato said. Instead, a drop was observed

in the middle of the lithosphere, indicating

softness. With the help of colleagues Tolulope

Olugboji and Jeffrey Park, Karato came

up with a new explanation for these findings.

Recalling from his studies that grain

boundaries can slide to cause elastic deformation,

Karato made observations at a

microscopic level and showed that mineral

weakening occurs at lower temperatures

than previously thought.

Even if mineral grains themselves are

strong, the grain boundaries can weaken

at temperatures slightly below their melting

point. As a result, seismic wave observations

show increased softness even while

the rock retains large-scale strength.

Karato noted that there is still work to be

done in this area. But his research is a significant

step forward in understanding the

earth’s complex layers.

“This is what I love,” he said. “Looking at

the beauty of the earth and then introducing

some physics [sometimes] solves enigmatic

problems.”

PHOTO BY DANYA LEVY

Professor Karato, who works in

the Kline Geology Laboratory building,

makes use of some of the most advanced

high-pressure equipment.




NEWS

medicine

AN UNEXPECTED DEFENSE

Lupus-causing agent shows potential for cancer therapy

BY ANSON WANG

Some of the world’s deadliest diseases occur when the body

begins to betray itself. In cancer, mutated cells proliferate

and overrun normal ones. Lupus, an autoimmune disease,

occurs when the body’s immune system begins to attack its

own cells. But what if the mechanisms of one disease could

be used to counteract another?

This thought inspired recent work by James Hansen, a Yale

professor of therapeutic radiology. Hansen transformed

lupus autoantibodies — immune system proteins that target

the body’s own proteins to cause lupus — into selective

vehicles for drug delivery and cancer therapy.

His focus was 3E10, an autoantibody associated with

lupus. Hansen and his team knew 3E10 could penetrate

a cell’s nucleus, inhibiting DNA repair and sparking

symptoms of disease. What remained a mystery was the

exact mechanism by which 3E10 accomplishes nuclear

penetration, and why the autoantibody is apparently

selective for tumor cells. Unlocking these scientific secrets

opened up new possibilities to counteract disease, namely,

by protecting against cancer.

What Hansen’s team found was that 3E10’s ability to

penetrate efficiently into a cell nucleus is dependent on

the presence of DNA outside the cell. When solutions without DNA were added to cells incubated with 3E10,

no nuclear penetration occurred. With the addition of

purified DNA to the cell solution, nuclear penetration by

3E10 was induced immediately. In fact, the addition of

solutions that included DNA increased nuclear penetration

by 100 percent. The researchers went on to show that the

actions of 3E10 also rely on ENT2, a nucleoside transporter.

Once bound to DNA outside of a cell, the autoantibody can

be transported into the nucleus of any cell via the ENT2

nucleoside transporter.

“Now that we understand how [3E10] penetrates into

the nucleus of live cells in a DNA dependent manner, we

believe we have an explanation for the specific targeting of

the antibody to damaged or malignant tissues where DNA

is released by dying cells,” Hansen said.

Because there is a greater presence of extracellular DNA

released by dying cells in the vicinity of a tumor, antibody

penetration occurs at a higher rate in cancerous tissue. This

insight holds tremendous meaning for cancer therapies.

If a lupus autoantibody were coupled with an anti-cancer

drug, scientists would have a way of targeting that drug to

tissue in need. In this way, what causes one disease could be

harnessed to treat another.

The primary biological challenge for cancer therapy

is to selectively target cancer cells while leaving healthy

ones alone. The 3E10 autoantibody is a promising solution

because it offers this specificity, a direct path to the tumor

cells that will bypass all cells functioning normally. The

molecule could carry therapeutic cargo, delivering anticancer

drugs to unhealthy cells in live tissue.

The Yale researchers were pleased with their next step as

well — they showed that these engineered molecules were

in fact tumor-specific. Tissue taken from mice injected with

fluorescently tagged autoantibodies showed the presence

of the antibody in tumor cells, but not normal ones after

staining.

Now, Hansen and his colleagues are looking into using

the 3E10 and their engineered molecules to kill cancer

cells. Since some cancer cells are already sensitive to DNA

damage, inhibition of DNA-repair by 3E10 alone may

be enough to kill the cell. Normal cells with intact DNA

repair mechanisms would be likely to resist these effects,

making 3E10 nontoxic to normal tissue. The researchers are

working to optimize the binding affinity of 3E10 so that it

can penetrate cells more efficiently and can exert a greater

influence on DNA repair. The goal is to conduct a clinical

trial within the next few years.

In the search for more effective drugs against cancer,

answers can emerge from the most extraordinary places.

“Our discovery that a lupus autoantibody can potentially be

used as a weapon against cancer was completely unexpected.

3E10 and other lupus antibodies continue to surprise and

impress us, and we are very optimistic about the future of

this technology,” Hansen said.

The recent study was published in the journal Scientific

Reports.

IMAGE COURTESY OF JAMES HANSEN

James Hansen (left) pictured with postdoctoral research

associate Philip Noble.



technology

NEWS

NEW DEVICE FOR THE VISUALLY IMPAIRED

Collaboration yields innovative navigation technology

BY AARON TANNENBAUM

Despite its small size and simple appearance, the Animotus

is simultaneously a feat of engineering, a work of art, and a

potentially transformative community service project.

Adam Spiers, a postdoctoral researcher in Yale University’s

department of mechanical engineering, has developed a

groundbreaking navigational device for both visually impaired

and sighted pedestrians. Dubbed Animotus, the device can

wirelessly locate indoor targets and changes shape to point

its user in the right direction towards these targets. Unlike

devices that have been created for visually impaired navigation

in the past, Spiers’ device communicates with its users by way

of gradual rotations and extensions in the shape of its body.

This subtlety allows the user to remain focused on his or her

surroundings. Prior iterations of this technology communicated

largely through vibrations and sound.
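Conceptually, the device needs just two cues from the navigation system: which way to rotate and how far to extend. A minimal sketch of that geometry, not Spiers' actual firmware and with an arbitrary maximum-range parameter, might look like this:

```python
# Minimal navigation-cue sketch (not the Animotus firmware): compute a signed
# turn toward an indoor target and a 0-1 "extension" cue for remaining distance.

import math

def navigation_cues(user_xy, user_heading_deg, target_xy, max_extension_m=5.0):
    dx = target_xy[0] - user_xy[0]
    dy = target_xy[1] - user_xy[1]
    bearing = math.degrees(math.atan2(dy, dx))              # direction to target
    turn = (bearing - user_heading_deg + 180) % 360 - 180   # signed turn, (-180, 180]
    distance = math.hypot(dx, dy)
    extension = min(distance / max_extension_m, 1.0)        # how far to extend
    return turn, extension

turn, ext = navigation_cues(user_xy=(0, 0), user_heading_deg=90, target_xy=(3, 4))
print(f"turn {turn:.0f} degrees, extension cue {ext:.2f}")  # turn -37 degrees, 1.00
```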

Spiers created Animotus in collaboration with Extant, a

visually impaired British theater production company that

specializes in inclusive performances. The device has already

been successful in Extant’s interactive production of the novel

“Flatland,” and with further research and development the

Animotus may be able to transcend the realm of theater and

dramatically change the way in which the visually impaired

experience the world.

Haptic technology, systems that make use of our sense of

touch, is most widely recognized in the vibrations of cell phones.

The potential applications of haptics, however, are far more

complex and important than mere notifications. Spiers was

drawn to the field of haptics for the implications on medical and

assistive technology. In 2010, he first collaborated with Extant to


IMAGE COURTESY OF ADAM SPIERS

Animotus has a triangle imprinted on the top of the device to

ensure that the user is holding it in the proper direction.

apply his research in haptics to theater production.

To facilitate a production of “The Question,” an immersive

theater experience set in total darkness, Spiers created a

device called the Haptic Lotus, which grew and shrank in the

user’s hands to notify him when he was nearing an intended

destination. The device worked well, but could only alert

users when they were nearing their targets, instead of actively

directing them to specific sites. As such, the complexity of the

show was limited.

Thanks to Spiers’ newly designed Animotus, Extant’s 2015

production of “Flatland” was far more complex. Spiers and

the production team at Extant sent four audience members at

a time into the pitch-black interactive set, which was built in

an old church in London. Each of the four theatergoers was

equipped with an Animotus device to guide her through the set

and a pair of bone-conduction headphones to narrate the plot.

The Animotus successfully guided each audience member on a

unique route through the production with remarkable accuracy.

Even more impressive, Spiers reported that the average

walking speed was 1.125 meters per second, which is only 0.275

meters per second slower than typical human walking speed.

Furthermore, walking efficiency between areas of the set was

47.5 percent, which indicates that users were generally able to

reach their destinations without excessive detours.

The success of Animotus with untrained users in “Flatland” left

Spiers optimistic about future developments and applications for

his device. If connected to GPS rather than indoor navigational

targets, perhaps the device will be able to guide users outdoors

wherever they choose to go. Of course, this introduces a host of

safety hazards that did not exist in the controlled atmosphere of

“Flatland,” but Spiers believes that with some training, visually

impaired users may one day be able to confidently navigate

outdoor streets with the help of an Animotus.

Spiers is particularly encouraged by emails he has received

from members of the visually impaired community, thanking

him for his research on this subject and urging him to continue

work on this project. “It’s very rewarding to know that you’re

giving back to society, and that people care about what you’re

doing,” Spiers said.

Though the majority of Spiers’ work has been in the realm

of assistive technologies for the visually impaired, he has also

worked to develop surgical robots to allow doctors to remotely

feel tissues and organs without actually touching them.

Spiers cautions students who focus exclusively on one area

of study, as he would not have accomplished what he has with

the Animotus without an awareness of what was going on in a

variety of fields. Luckily, for budding professionals in all fields,

opportunities for collaborations akin to Spiers’ with Extant have

never been more abundant.



NEWS

health

LESSONS FROM THE HADZA

Modern hunter-gatherers reveal energy use strategies

BY JONATHAN GALKA

The World Health Organization attributes obesity in

developed countries to a decrease in exercise and energy

expenditure compared to our hunter-gatherer ancestors, who

led active lifestyles. In recent research, Yale professor Brian

Wood examined total energy expenditure and metabolism in

the Hadza population of northern Tanzania — a society of

modern hunter-gatherers.

The Hadza people continue traditional tactics of hunting

and gathering. Every day, they walk long distances to

forage, collect water and wood, and visit neighboring

groups. Individuals remain active well into middle age. Few

populations today continue to live an authentic hunter-gatherer

lifestyle. This made the Hadza the perfect group

for Wood and his team to research total energy expenditure,

or the number of calories the body burns per day, whether an individual leads a sedentary, moderately active, or strenuous life. This total energy expenditure is a vital metric

used to determine how much energy intake a person needs.

The researchers examined the effects that body mass, fat-free

mass, sex, and age have on total energy expenditure.

They then investigated the effects of physical activity and

daily workload. Finally, they looked at urinary biomarkers

of metabolic stress, which reflect the amount of energy the

body needs to maintain normal function.

Wood was shocked by the results he saw. Conventional

public health wisdom associates total energy expenditure

with physical activity, and thus blames lower exercise rates

for the western obesity epidemic. But his study found that

fat-free mass was the strongest predictor of total energy

expenditure. Yes, the Hadza people engage in more physical

activity per day than their western counterparts, but when

the team controlled for body size, there was no difference

in the average daily energy expenditure between the two

groups. “Neither sex nor any measure of physical activity or

workload was correlated with total energy expenditure in

analyses for fat-free mass,” Wood said.
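The kind of adjustment Wood describes can be mimicked with an ordinary least-squares regression on toy data: once fat-free mass is in the model, a seemingly important activity term can shrink toward zero. The numbers below are fabricated for illustration and are not the Hadza measurements.

```python
# Toy regression (fabricated data, not the Hadza study): total energy
# expenditure (TEE) regressed on fat-free mass and an activity score.

import numpy as np

rng = np.random.default_rng(0)
n = 200
fat_free_mass = rng.normal(45, 8, n)               # kg
activity = rng.normal(0, 1, n) + 0.05 * fat_free_mass
tee = 30 * fat_free_mass + rng.normal(0, 80, n)    # kcal/day, driven by mass only

# Ordinary least squares: TEE ~ intercept + fat-free mass + activity.
X = np.column_stack([np.ones(n), fat_free_mass, activity])
coef, *_ = np.linalg.lstsq(X, tee, rcond=None)
print(f"kcal/day per kg fat-free mass: {coef[1]:.1f}")
print(f"kcal/day per unit activity:    {coef[2]:.1f}")  # near zero after adjustment
```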

Moreover, despite their similar total energy expenditure,

Hadza people showed higher levels of metabolic stress

compared to people in western societies today. The overall suggestion of these data is that there is more to the obesity story than a decline in physical

exercise. Wood and his colleagues have come up with an

alternative explanation.

“Adults with high levels of physical activity may adapt by

reducing energy allocation to other physical activity,” Wood

said.

It would make sense, then, that total energy expenditure

is similar across wildly different lifestyles — people who

participate in strenuous activity every day reorganize their

energy expenditure so that their total calories burned stays

in check.

To account for the higher levels of metabolic stressors

in Hadza people, Wood and his research team suggested

high rates of heavy sun exposure, tobacco use, exposure to

smoke from cooking fires, and vigorous physical activity, all

characteristic of the average Hadza adult.

Daily energy requirements and measurements of physical

activity in Hadza adults demonstrate incongruence with

currently accepted models of total energy expenditure: despite

their high levels of daily activity, Hadza people show no

evidence of a greater total energy expenditure relative to

western populations.

Wood said that further work is needed in order to

determine if this phenomenon is common, particularly

among other traditional hunter-gatherers.

“Individuals may adapt to increased workloads to keep

energy requirements in check,” he said, adding that these

adaptations would have consequences for accepted models

of energy expenditure. “Particularly, estimating total energy

expenditure should be based more heavily on body size and

composition and less heavily on activity level.”

Collaborators on this research project included Herman

Pontzer of Hunter College and David Raichlen of the

University of Arizona.

IMAGE COURTESY OF BRIAN WOOD

Three hunter-gatherers who were subjects of Wood’s study

stand overlooking the plains of Tanzania, home to the Hadza

population.



cell biology

NEWS

THE PROTEIN EXTERMINATORS

PROTACs offer alternative to current drug treatments

BY KEVIN BIJU

IMAGE COURTESY OF YALE UNIVERSITY

Craig Crews, Yale professor of chemistry, has developed a

variation on a class of proteins called PROTACs, which destroy

rogue proteins within cancerous cells. Crews has also founded

a company to bring his treatment idea closer to industry.

Your house is infested with flies. The exterminators try

their best to eliminate the problem, but they have terrible eyesight. If you had the chance to give eyeglasses to the

exterminators, wouldn’t you?

In some ways, cancer is similar to this insect quandary.

A cancerous cell often becomes infested with a host of

aberrant proteins. The cell’s exterminators, proteins called

E3 ubiquitin ligases, then attempt to destroy these harmful

variants, but they cannot properly identify the malevolent

proteins. The unfortunate result: both beneficial and

harmful proteins are destroyed.

How can we give eyeglasses to the E3 ubiquitin ligases?

Craig Crews, professor of chemistry at Yale University, has

found a promising solution.

According to the National Cancer Institute, some 14

percent of men develop prostate cancer during their lifetime.

This common cancer has been linked to overexpression and

mutation of a protein called the androgen receptor (AR).

Consequently, prostate cancer research focuses on reducing

AR levels. However, current inhibitory drugs are not specific

enough and may end up blocking the wrong protein.

Crews and his team have discovered an alternative. By

using PROTACs (proteolysis targeting chimeras), they have

been able to reduce AR expression levels by more than 90

percent.

“We’re hijacking E3 ubiquitin ligases to do our work,”

Crews said.

PROTACs are heterobifunctional molecules: one end

binds to AR, the bad protein, and the other end binds to

the E3 ligase, the exterminator. PROTACs use the cell’s own

quality-control machinery to destroy the harmful protein.

Crews added that PROTACs are especially promising

because they are unlikely to be needed in large doses. “The

exciting implication is we only need a small amount of the

drug to clear out the entire rogue protein population,” he

said. A lower required dose could lessen the risk of negative

side effects that accompany any medication. It could also

mean that purchasing the drug is economically feasible for

more people.

To put his innovative research into action, Crews founded

the pharmaceutical company Arvinas. Arvinas and the

Crews Lab collaborate and research the exciting potential

of PROTACs in treating cancer. PROTACs have been

designed to target proteins associated with both colon and

breast cancer.

In addition to researching PROTACs, Crews has

unearthed other techniques to exterminate proteins.

“What I wanted to do is take a protein [AR] and add a

little ‘grease’ to the outside and engage the cell’s degradation

mechanism,” Crews said. This grease technique is called

hydrophobic tagging and is highly similar to PROTACs

in that it engages the cell’s own degradation machine to

remove the harmful protein.

Having given eyeglasses to the E3 ligases, Crews is

looking for new ways to optimize his technique.

“My lab is still trying to fully explore what is possible with

this technology,” he said. “It’s a fun place to be.”

IMAGE COURTESY OF WIKIPEDIA

Enzymes work with ubiquitin ligases to degrade aberrant

proteins in cells.




NEWS

immunology

MOSQUITOES RESISTANT TO MALARIA

Scientists investigate immune response in A. gambiae

BY JULIA WEI

Anopheles gambiae is professor Richard Baxter’s insect of

interest, and it is easy to see why: The mosquito species found

in sub-Saharan Africa excels at transmitting malaria, one of

the deadliest infectious diseases. “[Malaria] is a scourge of the

developing world,” said Baxter, a professor in Yale’s chemistry

department. Discovering a cure for malaria starts with

understanding its most potent carrier.

This is one research focus of the Baxter lab, where scientists

are probing the immune system of A. gambiae mosquitoes

for answers. Despite being ferocious in their transmission of

malaria to human populations, these insects show a remarkable

immunity against the disease themselves. With ongoing research

and inquiry, scientists could one day harness the immune power

of these mosquitoes to solve a global health crisis — rampant

malaria in developing countries.

The story of Baxter’s work actually starts with a 2006 study, a

pioneering collaboration led by professor Kenneth Vernick at

the University of Minnesota. Vernick and his team collected and

analyzed samples of this killer bug in Mali. The researchers were

surprised by what they found. Not only did offspring infected with

malaria-positive blood carry varying numbers of Plasmodium,

the parasite that causes malaria, but a shocking
22 percent of the mosquitoes sampled carried no parasite at all.

Since then, scientists have turned their attention to the complex

interplay between malaria parasites and A. gambiae’s immune

system. Vernick’s group correlated the mosquitoes’ genomes with

their degree of parasite infection, and identified the gene cluster

APL1 as a significant factor in the insect’s ability to muster an

immune response.

Now, nearly a decade following Vernick’s research in Mali,

A. gambiae’s immune mechanism is better understood. Three

proteins are key players in the hypothesized immune chain of

response: APL1, TEP1, and LRIM1. TEP1 binds directly to

malaria parasites, which are then destroyed by the mosquito’s

immune system. Of course the molecule cannot complete the job

alone. TEP1 only works in combatting infection when a complex

of LRIM1 and APL1 is present in the mosquito’s blood and is

available as another line of defense.

To complicate matters, this chain of response is a mere outline

for the full complex mechanism. Gene cluster APL1 codes for

three homologous proteins named APL1A, APL1B, and APL1C.

According to Baxter, this family of proteins may serve as “a

molecular scaffold” in the immune response. Though they all

belong to the APL1 family, each individual protein may serve a

distinct purpose within A. gambiae’s immune system. Herein lies

one of Baxter’s goals — uncover the functions and mechanisms of

the individual proteins.

Prior research has elucidated the role of protein C in this

family. Scientists have observed that LRIM1 forms a complex

with APL1C, and this complex then factors into the immune

response for the mosquito. How proteins A and B contribute to

the immune response was poorly understood.

In Baxter’s lab, confirming whether LRIM1 forms similar

complexes with APL1A and APL1B posed a challenge, namely

because both proteins are unstable. Through trial and error,

Baxter’s team found that LRIM1 does indeed form complexes

with APL1A and APL1B. The scientists also observed modest

binding to TEP1, the protein that attaches itself directly to the

malaria parasite. This finding could further explain how the

mosquito’s immune system is able to put up such a strong shield

against malaria.

Within the APL1 family, the APL1B protein still presents the

most unanswered questions. Previous studies have shown that

APL1A expression leads to phenotypes against human malaria,

and APL1C to phenotypes against rodent malaria. The role of

APL1B remains cloudy. “Being contrarian people, we decided

to look at the structure of APL1B because it’s the odd one out,”

Baxter said.

His lab discovered that purified proteins APL1A and APL1C

remain monomers in solution, while APL1B becomes a

homodimer, two identical molecules linked together. Brady

Summers, a Yale graduate student, went on to determine the

crystal structure of APL1B.

This focus on tiny molecules is all motivated by the overarching,

large-scale issue of malaria around the globe. The more

information that Baxter and other scientists can learn in the lab,

the closer doctors will be to reducing the worldwide burden of

malaria.

“Vast amounts of money are spent on malaria control, but

our methods and approaches have not changed a lot and are

susceptible to resistance by both the parasite and the mosquito

vector,” Baxter said. A better understanding of A. gambiae in

the lab is the first step towards developing innovative, effective

measures against malaria in medical practice.

IMAGE COURTESY OF RICHARD BAXTER

The APL1B protein, here in a homodimer, remains elusive.



BLACK HOLE

WITH A GROWTH

PROBLEM

a supermassive black hole

challenges foundations

of modern astrophysics

by Jessica Schmerler

art by Ashlyn Oakes


A long, long time ago in a galaxy far, far away… one of the largest black holes discovered to date was formed. While working on a project to map out ancient, moderate-sized black holes, a team of international researchers stumbled across an unusual supermassive black hole (SMBH). This group included C. Megan Urry, Yale’s Israel Munson professor of astrophysics. Urry was surprised to learn that certain qualities of the black hole seem to challenge widely accepted theories about the formation of galaxies.

The astrophysical theory of co-evolution

suggests that galaxies pre-date their

black holes. But certain characteristics

of the supermassive black hole located in

CID-947 do not fit this timeline. As Urry

put it: “If this object is representative [of the

evolution of galaxies], it shows that black

holes grow before their galaxies — backwards

from what the standard picture says.”

The researchers published their remarkable

findings in July in the journal Science. Not only

was this an important paper for astrophysicists

everywhere, but it also reinforced the mysterious

nature of black holes. Much remains unknown

about galaxies, black holes, and their place

in the history of the universe — current theories

may not be sufficient to explain new observations.

The ordinary and the supermassive

Contrary to what their name might suggest,

black holes are not giant expanses of empty space.

They are physical objects that create a gravitational

field so strong that nothing, not even light, can escape.

As explained by Einstein’s theory of relativity,

black holes can bend the fabric of space and time.

An ordinary black hole forms when a star reaches

the end of its life cycle and collapses inward — this

sparks a burst of growth for the black hole as it absorbs

surrounding masses. Supermassive black holes,

on the other hand, are too large to be formed by a single

star alone. There are two prevailing theories regarding

their formation: They form when two black

holes combine during a merging of galaxies, or they are

generated from a cluster of stars in the early universe.

If black holes trap light, the logical question that follows is

how astrophysicists can even find them. The answer: they find

them indirectly. Black holes are so massive that light around

them behaves in characteristic, detectable ways. When orbiting

masses are caught in the black hole’s gravitational field,

they accelerate so rapidly that they emit large amounts of radiation

— mainly X-ray flares — that can be detected by special

telescopes. This radiation appears as a large, luminous circle

known as the accretion disc around the center of the black hole.

Active galactic nuclei are black holes that are actively growing,

and they show a high concentration of circulating mass in their

accretion discs, which in turn emits high concentrations of light.

The faster a black hole is growing, the brighter the accretion disc.

With this principle in mind, astrophysicists can collect relevant information

about black holes, such as size and speed of formation.

The theory of co-evolution

Nearly all known galaxies have at their center a moderate to supermassive

black hole. In the mid-1990s, researchers began to notice

that these central black holes tended to relate to the size and shape of



their host galaxies. Astrophysicists proposed

that galaxies and supermassive

black holes evolve together, each one

determining the size of the other. This

idea, known as the theory of co-evolution,

became widely accepted in 2003.

In an attempt to explain the underlying

process, theoretical physicists have proposed

that there are three distinct phases of

co-evolution: the “Starburst Period,” when

the galaxy expands, the “SMBH Prime

Period,” when the black hole expands,

and the “Quiescent Elliptical Galaxy,”

when the masses of both the galaxy and

the black hole stabilize and stop growing.

The supermassive black hole at the center

of the galaxy CID-947 weighs in at seven

billion solar masses — seven billion times the mass of our sun (which, for reference, has

a radius 109 times that of the earth). Apart

from its enormous size, what makes this

black hole so remarkable are the size and

character of the galaxy that surrounds it.

Current data from surveys observing

galaxies across the universe indicates that

the total mass distributes between the

black hole and the galaxy in an approximate

ratio of 2:1,000, called the Magorrian

relation. A typical supermassive black

hole is about 0.2 percent of the mass of its

host galaxy. Based on this mathematical

relationship, the theory of co-evolution

predicts that the CID-947 galaxy would

be one of the largest galaxies ever discovered,

but in reality CID-947 is quite ordinary

in size. This system does not come

close to conforming to the Magorrian relation,

as the central black hole is a startling

10 percent of the mass of the galaxy.
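A quick back-of-the-envelope check, using only the figures quoted above, shows how far CID-947 strays from that relation:

```python
# Quick check of the numbers in the article.

black_hole_mass = 7e9    # solar masses
typical_ratio = 0.002    # a typical SMBH is ~0.2 percent of its host galaxy

expected_galaxy_mass = black_hole_mass / typical_ratio
print(f"Host galaxy expected from the relation: ~{expected_galaxy_mass:.1e} solar masses")

observed_ratio = 0.10    # the article's "startling 10 percent"
observed_galaxy_mass = black_hole_mass / observed_ratio
print(f"Host galaxy implied by observation:     ~{observed_galaxy_mass:.1e} solar masses")
# Relative to its galaxy, this black hole is ~50 times heavier than expected.
```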

An additional piece of the theory of

co-evolution is that the growth of the

supermassive black hole prevents star

formation. Stars form when extremely cold gas clusters together and collapses under its own gravity. But when

a supermassive black hole is growing,

the energy from radiation creates a tremendous

amount of heat — something

that should interrupt star formation.

Here, CID-947 once again defies expectations.

Despite its extraordinary size, the

supermassive black hole did not curtail the

creation of new stars. Astrophysicists clearly

observed radiation signatures consistent

with star formation in the spectra captured

IMAGE COURTESY OF WIKIMEDIA COMMONS

The W.M. Keck Observatory rests at the

summit of Mauna Kea in Hawaii.

at the W.M. Keck Observatory in Hawaii.

The discovery of the CID-947 supermassive

black hole calls into question

the foundations of the theory of co-evolution.

That stars are still forming indicates

that the galaxy is still growing,

which means CID-947 could eventually

reach a size in accordance with the Magorrian

relation. Even so, the evolution of

this galaxy still contradicts the theory of

co-evolution, which states that the growth

of the galaxy precedes and therefore dictates

the growth of its central black hole.

New frontiers in astrophysics

Astrophysicists are used to looking

back in time. The images in a telescope

are formed by light emitted long before

the moment of observation, which means

the observer views events that occurred

millions or even billions of years in the

past. To see galaxies as they were at early

epochs, you would have to observe them

at great distances, since the light we see

today has traveled billions of years, and

was thus emitted billions of years ago.

The team of astrophysicists that discovered

the CID-947 black hole was observing

for two nights at the Keck telescope in order

to measure supermassive black holes in active

galaxies as they existed some 12 billion

years ago. The researchers did not expect

to find large black holes, which are rare

for this distance in space and time. Where

they do exist, they usually belong to active

galactic nuclei that are extremely bright.

But of course the observers noticed the

CID-947 supermassive black hole, which

is comparable in size to the largest black

holes in the local universe. Its low luminosity

indicates that it is growing quite slowly.

“Most black holes grow with low accretion

rates such that they gain mass slowly.

To have this big a black hole this early

in the universe means it had to grow very

rapidly at some earlier point,” Urry said.

In fact, if it had been growing at the observed

rate, a black hole this size would

have to be older than the universe itself.
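The "older than the universe" argument can be sketched with the standard formula for Eddington-limited black hole growth. The seed mass, radiative efficiency, and Eddington ratios below are assumptions chosen for illustration, not values from the Science paper:

```python
# Back-of-envelope growth times for a 7-billion-solar-mass black hole.
# Seed mass, efficiency, and Eddington ratios are illustrative assumptions.

import math

T_EDD_YR = 4.5e8             # Eddington timescale, roughly 450 million years
AGE_OF_UNIVERSE_YR = 1.38e10

def growth_time_yr(final_mass, seed_mass=1e4, efficiency=0.1, eddington_ratio=1.0):
    """Years to grow from seed_mass to final_mass (solar masses) while
    accreting at a fixed fraction of the Eddington limit."""
    e_fold = (efficiency / (1.0 - efficiency)) * T_EDD_YR / eddington_ratio
    return e_fold * math.log(final_mass / seed_mass)

for ratio in (1.0, 0.02):    # near-Eddington growth vs. slow growth like today's
    t = growth_time_yr(7e9, eddington_ratio=ratio)
    verdict = "fits" if t < AGE_OF_UNIVERSE_YR else "exceeds the age of the universe"
    print(f"Eddington ratio {ratio:>4}: ~{t:.1e} yr ({verdict})")
```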

What do these contradictions mean for

the field of astrophysics? Urry and her

colleagues suggest that if CID-947 does in

fact grow to meet the Magorrian relation

relative to its supermassive black hole, this

ancient galaxy could be a model for the

precursors of some of the most massive

galaxies in the universe, such as NGC 1277

of the Perseus constellation. Moreover, this

research opens doors to better understanding

black holes, galaxies, and the universe.

ABOUT THE AUTHOR
JESSICA SCHMERLER is a junior in Jonathan Edwards College majoring in molecular, cellular, and developmental biology. She is a member of the magazine’s staff and contributes to several on-campus publications.


FOCUS

biotechnology

By Emma Healy • Art by Christina Zhang

What constitutes a protein? At

first, the answer seems simple

to anyone with a background

in basic biology. Amino acids join together

into chains that fold into the unique

three-dimensional structures we call proteins.

Size matters in proteomics, the scientific

study of proteins. These molecules

are typically complex, comprised of hundreds,

if not thousands, of amino acids.

A protein with demonstrated biological

function usually contains no fewer than

300 amino acids. But findings from a recent

study conducted at the Yale School of

Medicine are challenging the notion that

proteins need to be long chains in order

to serve biological roles. Small size might

not be an end-all for proteins.

The recent research was headed by the

laboratory of Yale genetics professor Daniel

DiMaio. First author Erin Heim, a PhD

student in the lab, and her colleagues conducted

a genetic screen to isolate a set of

functional proteins with the simplest set of amino acids ever described.

The chains are short and simple, yet they

exert power over cell growth and tumor

formation. Few scientists would have predicted

that such simple molecules could

have such huge implications for oncology,

and for our basic understanding of proteins

and amino acids.

Engineering the world’s simplest

proteins

There are 20 standard amino acids,

and their order in a chain determines

the structure and function of the resulting

protein. Most proteins consist of many different

amino acids. In contrast, the proteins

identified in this study, aptly named

LIL proteins, were made up entirely of two

amino acids: leucine and isoleucine.

Both of these amino acids are hydrophobic,

meaning they fear water. The scientists

at DiMaio’s lab were deliberately searching

for hydrophobic qualities in proteins. An

entirely hydrophobic protein is limited in

where it can be located within the cell and

what shapes it can assume. To maintain a

safe distance from water, a hydrophobic

protein would situate itself in the interior

of a cell membrane, protected on both

sides by equally water-fearing molecules

called lipids. Moreover, the hydrophobic

property reduces protein complexity by

limiting the potential for interactions between

the polar side chains of hydrophilic,

or water-loving, amino acids. These polar

side chains are prone to electron shuffling

and other modifications, adding considerable

complexity to the protein’s function.

Heim and her group wanted to keep

things simple — a protein that is completely

hydrophobic is more predictable, and is

thus easier to investigate as a research focus.

“It’s rare that a protein is composed

entirely of hydrophobic amino acids,” said

Ross Federman, another PhD student in

the DiMaio lab and another author on the

recent paper.

The LIL proteins were rare and incredibly

valuable. “[Using these proteins] takes

away most of the complication by knowing

where they are and what they look like,”

Heim said. In terms of both chemical reactivity

and amino acid composition, she

said the LIL proteins truly are the simplest

to be engineered to have a biological function.

Small proteins, big functions

What was the consequential biological

function? Through their research, the

scientists were able to link their tiny LIL

proteins to cell growth, proliferation, and

cancer.

The team started with a library of more

than three million random LIL sequences

and incorporated them into retroviruses,

or viruses that infect by embedding their

viral DNA into the host cell’s DNA. “We

manipulate viruses to do our dirty work,

essentially,” Heim said. “One or two viruses

will get into every single cell, integrate

into the cell’s DNA, and the cell will make

that protein.”
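For a sense of the scale involved, the sketch below enumerates random leucine/isoleucine sequences in Python. The three-million figure comes from the article; the 26-residue length, the in silico framing, and every name in the code are illustrative assumptions rather than details of the lab's actual screen.

```python
import random

# Illustrative sketch only: a toy generator for random leucine/isoleucine
# ("LIL") peptide sequences, loosely mirroring a randomized library.
# The 26-residue length is an assumption for illustration, not a detail
# taken from the study.

AMINO_ACIDS = ("L", "I")  # leucine and isoleucine, single-letter codes

def random_lil_sequence(length: int = 26) -> str:
    """Return a random peptide made only of leucine (L) and isoleucine (I)."""
    return "".join(random.choice(AMINO_ACIDS) for _ in range(length))

def build_library(n_clones: int = 3_000_000, length: int = 26) -> set:
    """Sample a library of random LIL sequences (duplicates collapse in the set)."""
    return {random_lil_sequence(length) for _ in range(n_clones)}

if __name__ == "__main__":
    library = build_library(n_clones=10_000)  # small sample for a quick demo
    print(f"{len(library)} unique LIL sequences, e.g. {next(iter(library))}")
    # Theoretical diversity of an all-L/I sequence of this length: 2**26
    print(f"Theoretical sequence space: {2**26:,}")
```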


As cells with embedded viral DNA started

to produce different proteins, the researchers

watched for biological functions.

In the end, they found a total of 11

functional LIL proteins, all able to activate

cell growth.

Of course this sounds like a good thing,

but uncontrolled cell growth can cause a

proliferation of cancerous cells and tumors.

The LIL proteins in this study affected

cell growth by interacting with the

receptor for platelet-derived growth factor

beta, or PDGFβ. This protein is involved

in the processes of cell proliferation, maturation,

and movement. When the PDGFβ receptor gene is mutated, the protein’s

involvement in cell growth is derailed, resulting

in uncontrolled replication and tumor

formation. By activating the PDGFβ

receptor, the LIL proteins in this study

grant cells independence from growth factor,

meaning they can multiply freely and

can potentially transform into cancerous

cells.

While this particular study engineered

proteins that activated PDGFβ, Heim said

that other work in the lab has turned similar

proteins into inhibitors of the cancer-causing

receptor. By finding proteins

to block activation of PDGFβ, it may be

possible to devise a new method against

one origin of cancer. Even though the biological

function in their most recent paper

was malignant, Heim and her group are

hopeful that these LIL proteins can also be

applied to solve problems in genetics.

Reevaluating perceptions of a protein

No other protein is known to exist with

sequences as simple as those within the

LIL molecules. Other mini-proteins have

been discovered, but none on record have

been documented to display biological activity.

For example, Trp-cage was previously

identified as the smallest mini-protein

in existence, recognized for its ability to

spontaneously fold into a globular structure.

Experiments on this molecule have

been designed to improve understanding

of protein folding dynamics. While Trp-cage

and similar mini-proteins serve an

important purpose in research, they do

not measure up to LIL proteins with regard

to biological function.

The recent study at the DiMaio lab pursued

a question beyond basic, conceptual

science: The team looked at the biological


function of small proteins, not just their

physical characteristics.

The discovery of LIL molecules and the

role they can play has significant implications

for the way scientists think about

proteins. In proteomics, researchers do

not usually expect to find proteins with

extraordinarily short or simple sequences.

For this reason, these sequences tend

to be overlooked or ignored during genome

scans. “This paper shows that both

[short and simple proteins] might actually

be really important, so when somebody is

scanning the genome and cutting out all

of those possibilities, they’re losing a lot,”

Heim said.

Additionally, by limiting the amino acid

diversity of these proteins, researchers

were able to better understand the underlying

mechanisms of amino acid variation.

“If you want to gain insight into the heart

of some mechanism, the more you can isolate

variables, the better your results will

be,” Federman said.

This is especially true for proteins. These

molecules are highly complex, possessing

different energetic stabilities, varying conformations,

and the potential for substantial

differences in amino acid sequence. By

studying LIL proteins, researchers at the

DiMaio lab were able to isolate the effects

of specific amino acid changes at the molecular

level. This is critical information

for protein engineers, who tend to view

most hydrophobic amino acids similarly.

This study contradicted that notion: “Leucine

and isoleucine have very distinct activities,”

Heim said. “Even when two amino

acids look alike, they can actually have

very dissimilar biology.”

Daniel DiMaio is a professor of genetics at

the Yale School of Medicine.

Another ongoing project at the lab involves

screening preexisting cancer databases

in search of short-sequence proteins.

According to Heim, it is possible that scientists

will eventually find naturally occurring

cancers containing similar structures

to the LIL proteins isolated in this

study. This continuing study would further

elucidate the cancer-causing potential

of tiny LIL molecules.

To take their recent work to the next

step, researchers in this group are looking

to create proteins with functions that did

not arise by evolution. The ability to build

proteins with entirely new functions is an

exciting and promising prospect. It presents

an entirely new way of approaching

protein research. The extent of insight into

proteins is no longer bound by the trajectory

of molecular evolution. Instead, scientific

knowledge of proteins is being expanded

daily in the hands of researchers

like Heim and Federman.


IMAGE COURTESY OF YALE SCHOOL OF MEDICINE

ABOUT THE AUTHOR

EMMA HEALY

EMMA HEALY is a sophomore in Ezra Stiles College and a prospective

molecular, cellular, and developmental biology major.

THE AUTHOR WOULD LIKE TO THANK the staff at the DiMaio laboratory,

with a special thanks to Erin Heim and Ross Federman for their time and

enthusiasm.

FURTHER READING

Cammett, T. J., Jun, S. J., Cohen, E. B., Barrera, F. N., Engelman, D. M., &

DiMaio, D. (2010). Construction and genetic selection of small transmembrane

proteins that activate the human erythropoietin receptor. Proceedings of the

National Academy of Sciences, 107(8), 3447-3452.




FOCUS biotechnology

EAST MEETS WEST IN CANCER TREATMENT

Ancient herbal remedies prove their worth in clinical trials

By Milana Bochkur Dratver

Art By Alex Allen

Does grandma’s chicken soup really

chase a cold away? Do cayenne and

lemon really scare away a sore throat?

Some doctors and scientists are skeptical of

old wives’ tale remedies, and some of this

skepticism is justified. But Yung-Chi Cheng

advocates for an open-minded approach to

medical treatment. The end goal, after all, is

to give patients the best care possible, and

that means no potential treatment should be

overlooked.

Had Cheng been close-minded to eastern

medicine, ignoring remedies from ancient

China, he would not have come across a new

drug with exciting promise for cancer therapy.

With PHY906 — a four-herb recipe that for

2,000 years treated diarrhea, nausea, and

vomiting in China — Cheng is transforming

the paradigm of cancer treatment. Cancer treatment is a relatively modern medical undertaking, but it could use some support from traditional recipes.

Cheng is a Henry Bronson professor of

pharmacology at the Yale School of Medicine.

Members of his lab have standardized the

PHY906 concoction, emphasizing good

manufacturing practice to circumvent some

of the criticisms of traditional herbal remedies.

In 2003, Cheng started the Consortium

for the Globalization of Chinese Medicine,

linking 147 institutions and 18 industrial

companies. “It is the biggest non-political

nonprofit with no bias or discrimination in

promoting traditional medicine,” Cheng said.

The story of Cheng’s career showcases this

belief in learning from some of the earliest

approaches to fixing human ailments.

PHY906 is currently awaiting FDA approval,

but researchers at multiple institutions across

the country are collecting data to support

its effectiveness. Results so far have been

promising: this herbal remedy seems not only to diminish the nasty side effects of cancer treatments, but also to enhance their efficacy.

Passing tests with flying colors

PHY906 is based on the historic Huang

Qin Tang formula, pronounced “hwong chin

tong.” The four ingredients are Chinese peony,

Chinese jujube, baikal skullcap, and Chinese

licorice. Some of the strongest evidence has

come out of studies in mouse models, which

show that all four ingredients are necessary in

order for PHY906 to be maximally effective.

When any one ingredient was left out, the

recipe was not as successful in treating mice

with liver cancer.

Even without the tools of modern scientific

research at their disposal, healers in ancient

China devised a scientifically sound solution

for a health problem. Cheng and his colleagues

advocate examining many different types of historical texts for cultural remedies. Despite

scientific advancement, researchers can still

learn from the past.

Still, scientists want more than an interesting

idea grounded in history and culture — they

expect reproducible and quantifiable results.

PHY906 has so far measured up to the stringent

standards of modern research. The experiment

in mice used PHY906 in combination with

Sorafenib, the only FDA-approved drug for

the treatment of this specific liver cancer. Not

only did the addition of PHY906 decrease

unwanted side effects, but it also enhanced the

efficacy of Sorafenib.

Ongoing research involves deciphering if

PHY906 produces similar results in treating

other cancers and in different treatment

combinations. The drug is proceeding through

several levels of clinical trials as it seeks

FDA approval for human use. This involves

testing the compound in combination with

other conventional cancer therapies, such as

radiation treatment.

The ultimate goal is to bring PHY906 to

the U.S. as a prescription drug to supplement

chemotherapy, which is notorious for its

terrible side effects.

Early inspiration

Cheng has long been interested in side

effects, and in ways to eliminate them. His

independent scientific career began in 1974,

when he investigated viral-specified replication

systems. At the time, few people thought there

would be a way to selectively target viruses using

specialized compounds, but this soon became a

reality. Using the herpes virus as a model system,

Cheng found that virus-specific proteins could

in fact be targeted selectively by drugs. That is, a compound could be introduced that attacks viruses without harming other cells in its wake.

Less than a decade later, Cheng discovered

a compound to combat cytomegalovirus, a

major cause of infant mortality. The same

compound worked to treat people infected

with HIV in the 1980s. This breakthrough

in treatments for viral diseases motivated

many scientists to search for drug-based

treatments for a variety of health conditions,

including cancer. (In fact, this was the point

when medicine saw the early development

of cancer drugs.) The new drug compounds

were effective in targeting diseases, but they

simultaneously caused detrimental side effects.

The question that followed was how to

eliminate negative side effects without reducing

the beneficial effects of a drug. Cheng decided

to probe the mechanism of action of these

drugs. He found that side effects stemmed

from toxicity to the mitochondria — these organelles are the cell’s energy powerhouses, and

they were suffering damage and declining in

number. Cheng’s findings made the next step in

drug development exceedingly clear: treating

diseases would require a targeted approach,

one that attacked the disease-causing agents

but kept all other cell parts working normally.

When he turned his focus to cancer,

Cheng realized it was unlikely that a single

chemical would be sufficient. Cancerous

tissue is heterogeneous, which means one

compound is unlikely to affect an entire


population of cancer cells. “It was clear that

a new paradigm needed to be developed

as to how to fundamentally address cancer

treatment,” Cheng said.

His solution was to turn to the human body’s

immune system, which shifted the focus from

treating cancer from without to exploiting the

body’s own internal mechanisms of healing

and defense.

With this more holistic view of cancer,

Cheng thought that multiple targeting agents

would be needed in combination, since it was

improbable that one compound would succeed

on its own in killing the mixed cancer tissue.

Identical treatments often had varying degrees

of effectiveness in different patients, leading

Cheng to look to historical medical practices

for clues that would hint at better treatment

options. While reading about ancient remedies

still in use today, Cheng discovered that

Chinese medicine had been using the multiple

target approach for generations.

Armed with this insight, he investigated

roughly 20 herbal combinations that are

still in use but have ancient roots. These

home remedies have been used to address

symptoms such as diarrhea and nausea, and

Cheng believed they could also prove useful

in reducing the side effects of cancer treatment

without disturbing the important work of

chemotherapy.

Mechanism behind the magic

Cheng’s lab is also working to understand

why and how the combination of herbs in

PHY906 is so effective. Current data points to

two primary mechanisms.

The first proposal suggests that the recipe

works as an inflammation inhibitor. All three

major inflammatory pathways in the body

seem to be affected by the presence of PHY906,

suggesting that the herbs have multiple sites

of action within the body. By addressing all

three pathways, PHY906 produces results better than any anti-inflammatory drug on the market today.

Interestingly, ancient Chinese medicine recognized one class of diseases whose name translates to “heat.” Diseases in this subset were related to

inflammation. When traditional remedies

prescribed to treat “heat” diseases were

screened against six well-characterized

inflammation pathways, more than 80 percent

of the herbs in this treatment category showed

activity against at least one pathway. When

a different class of herbs was tested against

inflammatory pathways, only 20 percent


showed any relation. The oral tradition seems

to correlate with strong scientific results.

The other mechanism of action for PHY906

could be that the herbal combination enhances

the recovery of damaged tissue by increasing

the rate of propagation for stem cells and

progenitor cells. Both kinds of cells tend to differentiate

into different target cell types depending on

what is needed. By activating the expression of

genes in charge of the stem cell and progenitor

cell pathways, PHY906 can accelerate the

proliferation of new cells to fix damaged tissue.

These scientific suggestions for the magic

behind PHY906 offer some hope that the drug

could one day be applied to treat other diseases

besides cancer. In keeping inflammation in

check, for example, the ancient herbal remedy

could prove useful in mitigating symptoms of

inflammatory diseases of the colon.

Bridging the East-West gap

Another significant impact of PHY906, one

that Cheng hopes continues growing in the

future, is its role in the convergence of modern

and traditional medicine. An integrative

approach to treatment considers all options

and explores the potential of compounds

new and old. Cheng’s work is one example of

a shift in perspective that may be essential in

unlocking mysteries of modern medicine.

Traditional remedies cannot be discounted.

They would not have survived generations

without proving efficacy time and time

again. As medicine is forced to confront

increasingly complicated diseases — from

neurodegeneration, to diabetes, to metabolic

syndromes — it is imperative that medical

professionals explore all avenues, including

those that already exist but need to resurface.

“The etiology of these complex syndromes

and illnesses is not singular; they are caused

by many different genetic and environmental

factors,” Cheng said. “Thus, it is impractical to

[have a singular focus] as we pursue solutions.”

There are certainly concerns to be raised

over the application of ancient home remedies

in medical practice, but Cheng’s lab keeps all

research up to stringent standards. The team

operates under good manufacturing practice,

ensuring that each batch of PHY906 maintains

the same chemical properties. One of the key

issues with natural medicine is variation in

ingredients — each is grown from the earth

and not in the lab, which means it may have

a slightly different chemical composition every

time it is used. Good manufacturing practice is

a precise process that specifies exact minutes,

concentrations, and temperatures for each step

in drug development.

What makes Cheng’s product unique is

that no other pharmacological institution has

created such precise rules and regulations for

the manufacture of a traditional remedy.

And it is a remedy he truly believes in. “It is

our job to figure out the right combinations

to solve our problems,” Cheng said. PHY906

could be a leap forward in cancer treatment,

and it only came to light through open-minded

research. Cheng’s Consortium for the

Globalization of Chinese Medicine emphasizes

collaboration. Consortium members come

together in dealing with the challenges of

working with traditional treatments and share

quality control regulations as well as sources of

herbs.

Cheng is devoted to PHY906 and to

integrating eastern medical remedies with

western research practices. “Moving forward,”

he said, “scientists need to take advantage of

the naturally occurring library of chemicals

that Mother Nature has provided us.”

ABOUT THE AUTHOR

MILANA BOCHKUR DRATVER

MILANA BOCHKUR DRATVER is a sophomore molecular, cellular, and

developmental biology major in Jonathan Edwards College. She serves

as the volunteer coordinator for Synapse.

THE AUTHOR WOULD LIKE TO THANK professor Cheng for his time

and enthusiasm about sharing his research.

FURTHER READING

Lam, et al. “PHY906(KD018), an Adjuvant Based on a 1800-year-old Chinese

Medicine, Enhanced the Anti-tumor Activity of Sorafenib by Changing the




Nature's

BLUEPRINT

BY GENEVIEVE SERTIC

Solar cells inspired by plant cells

Art by Chanthia Ma

At first glance, nature and technology

may seem like opposites. Leaves

stand in contrast to circuits, birds to

airplanes, and mountains to skyscrapers. But

technology has a history of taking cues from

nature. Velcro was inspired by burdock burrs,

while aircraft were modeled after bird wings.

The Shinkansen Bullet Train was constructed

with the kingfisher’s beak in mind. A closer

lens on nature unlocks tremendous potential

for technological innovation, and plant cells

are no exception.

Yale researchers are now looking to plant

cells in order to improve the design of solar

power, touted as a carbon-free alternative energy

source. At the heart of solar power are solar

cells, which, like plant cells, aim to absorb sunlight

and turn it into a useable form of energy.

André Taylor, associate professor of chemical

and environmental engineering, and Tenghooi

Goh, a graduate student in the School of Engineering

and Applied Science, worked with

their team to develop an organic solar cell that

mimics the chemistry of plant cells.

Most solar power today relies on silicon solar

cells, which do not precisely parallel plant

cells. When sunlight hits a silicon solar cell, an

electron jumps across the material and moves

through a wire to generate electricity. Plant

cells instead take the light energy and transfer

it to a protein through a chemical process.

These cells from nature can inform optimal

materials for use in organic solar cells, as the

Yale group discovered. Organic solar cells are

relatively new in the field of solar energy. There

are many different types, but generally speaking,

organic solar cells are lighter, less costly,

and have more environmentally-friendly manufacturing

processes than their traditional silicon

counterparts.

At this point, the choice of a solar cell probably

seems obvious. But organic solar cells come

with one major drawback: efficiency. Solar cell

efficiency refers to the amount of electricity

generated relative to the input of sunlight energy.

While silicon cells have achieved efficiencies

of more than 20 percent, organic cells are

lagging behind.

Taylor and his team sought to increase this

efficiency while maintaining the advantages

of organic solar cells. They blended together

two polymers with complementary properties,

aligning them to make them compatible. Together,

these polymers can absorb light from

much of the visible spectrum, which explains

their greater combined efficiency. The Yale

researchers managed to increase efficiency of

this particular type of solar cell by almost 25

percent.

The key to better solar energy, as it turns out,

lies in nature.

A three-part design

To turn light energy into electrical energy,

organic solar cells need a material that gives up

an electron — a donor — and a material that

takes that electron — an acceptor. However,

the donor polymer can only absorb a certain

range of light wavelengths. Wavelengths outside

of this range are wasted. The recent development

from Yale scientists allows an organic

solar cell to absorb a wider range: Adding another

donor that accepts a different but complementary

range of light wavelengths gets at

the efficiency problem directly.

These new types of solar cells are called ternary

cells, and they have three components:

two donors, one acceptor. Unfortunately, more

often than not, the two donors conflict with

each other and lower the overall efficiency of

energy conversion.

Polymers P3HT and PTB7 are two such incompatible

donors. They align in completely

different directions, with P3HT standing vertically

and PTB7 lying horizontally. In poorly-aligned

structures, charge recombination

occurs, wherein an electron meant to generate

electricity is reabsorbed into a hole in the material,

or a place where an electron could exist

but does not.

But not all hope was lost for P3HT and

PTB7. Taylor’s team noticed that the wavelengths

of light absorbed by the polymers are

in fact complementary — P3HT takes in blue-green

light, while PTB7 is best at absorbing

light in the yellow-red spectrum. Overcoming

their incompatibility would allow for a much

more efficient ternary cell, and this is exactly

what Taylor’s team set out to do.

Finding agreement in incompatibility

In order to reduce the interference between

the two donor polymers, the team focused on



a couple of methods, including Förster resonance

energy transfer (FRET). FRET is a mechanism

by which two light-sensitive molecules, or

chromophores, transmit energy. This process

helps primarily in biological studies to trace

proteins as they travel through cells. It is also

one of the primary mechanisms in energy

conversion within a plant cell, and in fact, is

one of the reasons that leaves are so efficient

in converting sunlight into chemical energy.

FRET is not a topic normally brought up when

discussing solar technology, however. “It’s been

heavily used in biology, but never in polymer

solar cells,” Taylor said.
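For readers unfamiliar with FRET, the standard textbook expression for its efficiency, given here as general background rather than a formula reported in the Yale study, is

$$ E = \frac{1}{1 + (r/R_0)^6} $$

where r is the distance between the donor and acceptor chromophores and R_0 is the Förster radius, the separation at which half the energy is transferred. The sixth-power dependence is why the two polymers must be intimately blended for the effect to matter.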

In this study, the researchers focused on

FRET between their two chromophores,

polymers P3HT and PTB7. Individually,

neither polymer is particularly efficient. However, combining

the polymers facilitates FRET

and allows them to complement

each other, resulting in an efficiency

of 8.2 percent — quite high

for a ternary organic solar cell.

Other groups have also

used various polymers in

conjunction, but never in

a way that forces the

polymers to interact.

Taylor’s team combined

P3HT and

PTB7 and created

a collaboration.

One polymer

picks up emissions

from the

other. “We’re

the first group

to show that you can actually put these components,

these multiple donors, together, and

have them act synergistically,” Taylor said. The

polymers are complementary — one can recover

lost energy from the other, and together,

they can take in a much wider range of light.

This was among the most pivotal findings in the Yale study.

PHOTO BY GENEVIEVE SERTIC: The red solar cell, incorporating P3HT, absorbs blue-green light, while the blue solar cell, made using PTB7, best takes in light from yellow to red on the visible spectrum.

To improve efficiency further, the researchers

focused on adjusting the incompatible

alignment of the polymers. Electron flow is impeded

between P3HT and PTB7. “If organics

align in a conflicting way, they will not allow

electrons to flow in a favorable direction,” Goh

said. A second method the team used, called

solvent vapor annealing, can fix that. The researchers

exposed the solar films containing

the incompatible polymers to vapor to help

the structures relax and smooth out. With

this technique on top of the special attention

to FRET, the organic solar cells achieved a remarkable

efficiency of 8.7 percent.

Strategizing for the future

This research is not only significant because

of increased efficiency. It also describes an innovative

process for overcoming mechanical

difficulties within organic solar cells. Even

after Taylor’s improvements to ternary cells,

organic-based solar power does not match the

efficiency of silicon-based solar power. However,

using their methods as a launching pad,

there is great potential to increase efficiency of

organic solar cells even further in the future.

“As people develop newer polymers, they

can use this study as a road map to create higher-efficiency

devices,” Taylor said.

This study shows that polymers labeled as

incompatible can be re-engineered to complement

each other and to increase solar cell efficiency.

It also illustrates that nature’s answers to

technological challenges are as relevant as ever.

Beyond their basic function of turning sunlight

into useable energy, plant and solar cells

might not seem related at first. Plant cells

convert sunlight into chemical energy, while

solar cells convert sunlight into electricity. But

the mechanisms by which plant cells absorb a

wide range of solar radiation are, as it turns out,

readily applicable to the choice of polymers in

organic solar cells. In fact, plant cells provide a

model that the Yale group found to be incredibly

helpful. The story of solar cells inspired by

plant cells introduces not only new technology,

but a new way of thinking about solar cell

efficiency that reflects our natural world.

ABOUT THE AUTHOR

GENEVIEVE SERTIC

GENEVIEVE SERTIC is a sophomore prospective electrical engineering

major and Energy Studies Undergraduate Scholar in Pierson College. She is

a copy editor for this magazine and works through Project Bright to promote

solar power on Yale’s campus.

THE AUTHOR WOULD LIKE TO THANK Dr. André Taylor and Tenghooi

Goh for their time and enthusiasm about their research.

FURTHER READING

Huang, Jing-Shun et al. 2013. “Polymer bulk heterojunction solar cells

employing Förster resonance energy transfer.” Nature Photonics 7: 479-485.

doi: 10.1038/nphoton.2013.82




Computers master medieval texts

By Amanda Buckingham

Art By Chanthia Ma

Reading a medieval manuscript is like

getting a glimpse at another reality. Like a

window into another time, words written

centuries ago teleport the reader into the past. But

merely looking at words on a page would barely

scratch the surface of all there is to learn from a

medieval manuscript. How did the scribe write?

What inks were used? What is in the foreground,

versus the background? What makes studying

these texts especially challenging is the fact

that worn and aged manuscripts are extremely

delicate.

Bridging the gap between past and present,

however, is a thoroughly modern field: computer

science. Now, by merely opening another sort

of window — a web browser — you can access

millions of digitized images of manuscripts.

Advances in machine learning have allowed

computers to move beyond simply presenting

images of texts to quite literally reading them.

With a tool like optical character recognition,

a computer program can identify text within

images.

Still, computers are not medievalists. Medieval

manuscripts pose a particular problem for

computer-assisted research — the handwriting

style and state of preservation of the text

both limit the accuracy of optical character

recognition. In addition, recording the material

properties of a medieval manuscript is incredibly

time-consuming. The materiality of manuscripts

may obscure text over time, but it also betrays

the secrets of books: how they were made, and

by whom. Scientists and historians alike are thus

interested in discerning material properties of

old texts, and they need efficient, non-invasive

techniques that can handle the sheer size of the

medieval corpus.

To this end, Yale researchers have developed an

algorithm capable of sifting through thousands

of images to discern how many inks were used

in the scribing of each and every page of a

manuscript. Led by Yale professor of computer

science Holly Rushmeier, this project is one

component of an interdisciplinary collaboration

with Stanford University, known as Digitally

Enabled Scholarship with Medieval Manuscripts.

This algorithm in particular is driven by the

fundamental principle of clustering, which

groups pixels into specific categories. It gets at the

number of inks used in individual manuscripts,

but also makes it possible to analyze large databases of images quickly and accurately.

While there are other computer platforms relevant

to the topic of medieval manuscripts, most focus

on simple methods such as word spotting and few

can efficiently capture material properties.

The question might seem simple on its surface

— how many colors did this scribe use centuries ago — but the answer is quite telling.

A better understanding of what went into the

creation of medieval manuscripts can reveal

new details about cultures and societies of the

past. Rushmeier’s research takes a technical

approach to history, using computer science

and mathematical formulas to reach conclusions

about medieval texts. She hopes her findings will

aid both scientific and historical scholarship in


years to come. Her work stands at the

intersection of science and history, and

tells a compelling story about texts of the

past and programs of the future.

Uncovering a manuscript’s true colors

Scholars have long been interested in

the colors used in medieval manuscripts.

In the past, researchers discerned variations

in the colors used in individual

and small groups of pages. But for an entire

manuscript, or in large comparative

studies, quantifying color usage by visual

inspection is not feasible. Computers, on

the other hand, can wade through thousands

of images without tiring.

Computers can assign values to different

colors, and can then group similar colors

into clusters. The K value, or number

of distinct inks on a page, can then be

determined. For many years, scientists

have been able to manually count

independent inks to obtain approximate

K values. In contrast, the algorithm

developed by Rushmeier’s team is an

automatic method of estimating K, which

is more efficient than prior eyeballing

techniques.
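To make the idea concrete, here is a minimal Python sketch of automatic K estimation, assuming off-the-shelf k-means clustering and a silhouette score to pick the best K. It illustrates the general approach only; the function names and the scoring choice are this sketch's assumptions, not the Yale team's published algorithm.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score

# Minimal sketch of the general idea only (k-means plus a silhouette score),
# not the Yale team's actual algorithm: cluster pixel colors and pick the
# number of clusters K that scores best.

def estimate_k(pixels: np.ndarray, k_min: int = 2, k_max: int = 8) -> int:
    """pixels: (n, 3) array of RGB values for candidate (non-text) pixels."""
    best_k, best_score = k_min, -1.0
    for k in range(k_min, k_max + 1):
        labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(pixels)
        score = silhouette_score(pixels, labels,
                                 sample_size=min(5000, len(pixels)), random_state=0)
        if score > best_score:
            best_k, best_score = k, score
    return best_k

# Example with synthetic "ink" colors: two inks with slight pigment variation.
rng = np.random.default_rng(0)
ink_a = rng.normal([60, 30, 20], 5, size=(500, 3))   # dark brown ink
ink_b = rng.normal([150, 20, 30], 5, size=(300, 3))  # red ink
print(estimate_k(np.vstack([ink_a, ink_b])))          # expected: 2
```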

The computer scientists clustered three

types of pixels: decorations, background

pixels, and foreground pixels. Decorations

included foreground pixels that were not part of the text itself, while the remaining foreground pixels were the words

written on the page. To test the quality

of their clustering method, namely its

accuracy in determining the number of

inks used per manuscript, the researchers

practiced on 2,198 images of manuscript

pages from the Institute for the Preservation

of Cultural Heritage at Yale.

To evaluate accuracy, the researchers

compared K values produced by the

algorithm to K values obtained manually.

In an analysis of 1,027 RGB images

of medieval manuscripts, which have

red, green, and blue color channels, 70

percent of the initial K values produced

by the computer matched the number

of inks counted manually. When the

value of K was updated after checking

for potential errors, the algorithm’s value

either matched the value determined

by eye or deviated by only one color 89

percent of the time. The scientists were

pleased to see such high accuracy in

their algorithm, and also realized the

importance of updating K to produce

results closer to reality.

Checking for errors is necessary because

even computers make mistakes,

and finding the K value for a medieval

manuscript page is no small feat. For

one, even a single ink color can display

a tremendous amount of variation. The

degradation of organic compounds in

the ink causes variations in the intensity

of pigment to multiply over time. Even

at the time of writing, the scribe could

have applied different densities of ink to

the page. “There’s the potential for seeing

differences that could just be from using

a different bottle of the same ink,” Rushmeier

said.

A computer runs the risk of overestimating

the number of distinct color

groups on a page. Without a proper

check, Rushmeier’s algorithm would produce

a K value higher than what is truly

reflected in the manuscript. Natural

variations in pigment color should not be

construed as separate inks.

What constitutes a cluster?

Medieval manuscripts have a high proportion

of background and foreground

text pixels relative to decorations. Before

the computer carried out clustering,

only the non-text, foreground pixels were

isolated. Differentiating between foreground

and background pixels required a

technique called image binarization. This

was the crucial first step in designing an

algorithm to calculate a K value, according

to postdoctoral associate Ying Yang,

who worked on the project.

The color image of the manuscript

page was converted into a gray scale

that had 256 different color intensities.

The number of pixels for each of the

intensities was sorted into a distribution,

and pixel values within the peak of

the distribution were deemed to be

foreground, while the rest were labeled as

background noise. In the resulting binary

image, foreground pixels were assigned

a zero, while background pixels were

assigned a one.
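The histogram step can be approximated with a conventional global threshold such as Otsu's method, which separates dark ink from light parchment. The sketch below is a simplified stand-in rather than the published pipeline, and it follows the article's convention of marking foreground pixels 0 and background pixels 1.

```python
import numpy as np

# Simplified stand-in for the binarization step, not the published pipeline:
# a global Otsu threshold on a 0-255 grayscale image, with foreground (ink)
# pixels set to 0 and background (parchment) pixels set to 1.

def otsu_threshold(gray: np.ndarray) -> int:
    """gray: uint8 grayscale image (0-255). Returns the threshold that
    maximizes the between-class variance."""
    hist = np.bincount(gray.ravel(), minlength=256).astype(float)
    prob = hist / hist.sum()
    best_t, best_var = 0, 0.0
    for t in range(1, 256):
        w0, w1 = prob[:t].sum(), prob[t:].sum()
        if w0 == 0 or w1 == 0:
            continue
        mu0 = (np.arange(t) * prob[:t]).sum() / w0
        mu1 = (np.arange(t, 256) * prob[t:]).sum() / w1
        var_between = w0 * w1 * (mu0 - mu1) ** 2
        if var_between > best_var:
            best_t, best_var = t, var_between
    return best_t

def binarize(gray: np.ndarray) -> np.ndarray:
    """Dark pixels (ink) -> 0, light pixels (parchment) -> 1."""
    return (gray > otsu_threshold(gray)).astype(np.uint8)
```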

After the foreground had been differentiated

from the background, text had

to be separated from non-text pixels.

Incidentally, the handwriting in medieval

manuscripts lends itself to this task.

Yang noted that in medieval Western

Europe, text was written in straight bars.

“It’s as if they deliberately tried to make

each letter look like every other letter,”

Rushmeier said. Though this makes computer-assisted

research more difficult in

some respects, the team of Yale scientists

used the similarity of text strokes to their

advantage.

IMAGE COURTESY OF HOLLY RUSHMEIER: In one step of the Yale study, foreground text pixels were detected and eliminated so that only the non-text pixels remained.

IMAGE COURTESY OF HOLLY RUSHMEIER: These red rectangles indicate text ornamentation that was located and extracted from the images.

Since the bar-like writing technique

of medieval scribes makes for fairly

uniform letters, the scientists used a resizeable,

rectangular template to match

and identify each pen stroke. First,

they gathered information about text

height and width from the binary image.

Once the size of the template had been

established, it was used to match with

text. Only strokes of a similar size to

the rectangle were given high matching

scores. Since ornately designed capital

letters were not of a similar size compared

to the rest of the text, they received low

matching scores.
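One standard way to score how well an image patch matches a rectangular stroke template is normalized cross-correlation; the rough Python sketch below uses OpenCV's matchTemplate for that purpose. The solid-bar template and its light framing border are assumptions for illustration, since the article does not spell out the study's exact scoring.

```python
import cv2
import numpy as np

# Rough illustration only, not the study's exact scoring: score how well each
# location in a binarized page matches a rectangular "pen stroke" template
# using normalized cross-correlation.

def stroke_match_scores(binary_page: np.ndarray,
                        stroke_h: int, stroke_w: int) -> np.ndarray:
    """binary_page: uint8 image with ink = 0, background = 1 (as in the article)."""
    img = (binary_page * 255).astype(np.uint8)
    # A dark bar framed by a light border, so the template has contrast
    # (a perfectly constant template would make the normalized score undefined).
    template = np.full((stroke_h + 2, stroke_w + 2), 255, dtype=np.uint8)
    template[1:-1, 1:-1] = 0
    # The result is a map of correlation scores in [-1, 1]; high values mark
    # regular, bar-like strokes, while ornate initials score low.
    return cv2.matchTemplate(img, template, cv2.TM_CCOEFF_NORMED)
```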

Pixels with low matching scores that

were also valued at zero in the binary

image were deemed to be foreground,

non-text pixels that were candidates for

clustering. Once the candidates were

identified, they could finally be classified

into clusters. Of course, this method meant that high-matching text pixels were left out of the clustering.

The algorithm had a built-in remedy:

the computer automatically added one

to the total number of clusters derived

from candidate pixels, which resulted

in the initial value of K. This ensured

that the text-cluster, itself representative

of the primary ink used in writing the

manuscript, was counted.

Of course, this addition would have led to an overestimation of the K value

whenever any text pixels were erroneously

considered candidates for clustering. The

Yale team devised a clever solution to this

problem. The scientists compared the

color data for each of the K clusters with

the color of the text. A striking similarity

between one of these clusters and the

text would indicate that the cluster was

derived from misrouted text pixels.

The color of the text had yet to be

determined. To obtain this piece, the team

performed another round of clustering.

This time, all foreground pixels — text

and non-text — were deemed to be

candidates. Given the large quantity of

text pixels, the text-cluster was fairly easy

to spot. While the only new information

generated in this round of clustering was

the pixel color values of the text-cluster,

this detail was essential in ensuring an

accurate count of inks used on a page.
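Put together, the add-one-then-check bookkeeping reads roughly like the schematic below. The distance threshold standing in for "striking similarity" is an assumption of this sketch, not a number from the study.

```python
import numpy as np

# Schematic paraphrase of the K-adjustment step described in the article.
# The distance threshold used to judge "striking similarity" between a
# cluster color and the text color is an assumption for illustration.

def adjust_k(cluster_colors: np.ndarray, text_color: np.ndarray,
             similarity_threshold: float = 20.0) -> int:
    """cluster_colors: (k, 3) mean RGB of each non-text cluster.
    text_color: (3,) mean RGB of the text cluster from the second pass."""
    k = len(cluster_colors) + 1  # add one for the primary text ink itself
    # If any non-text cluster is essentially the same color as the text,
    # it was probably built from misrouted text pixels: subtract it back out.
    distances = np.linalg.norm(cluster_colors - text_color, axis=1)
    return k - int(np.count_nonzero(distances < similarity_threshold))

# Example: two decoration clusters, one of which matches the text ink color.
clusters = np.array([[150.0, 20.0, 30.0],   # red decoration ink
                     [62.0, 31.0, 19.0]])   # nearly the same as the text ink
text = np.array([60.0, 30.0, 20.0])
print(adjust_k(clusters, text))  # -> 2 (red ink + text ink)
```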

Importantly, the computer algorithm

had checks in place to add and subtract

from the K value depending on risk of

over or underestimation. It worked efficiently,

but did not sacrifice thoroughness.

In the end, the computer revealed

a well-kept secret of the medieval manuscript

by outputting an accurate value

for K.

The bigger picture

The algorithm was used to analyze

more than 2,000 manuscript images, including

RGB images and multispectral

images, which convey data beyond visible

light in the electromagnetic spectrum.

By calculating K more quickly, this program

offers a more directed research experience.

For example, scholars curious

about decorative elements — say, elaborately

designed initials and line fillers

within a manuscript — can focus on pages

with relatively high K values instead of

spending copious amounts of time filtering

through long lists of manuscripts. In

general, once K has been determined, the

non-text clusters can be used for further

applications. In detecting features such

as ornately drawn capital letters and line

fillers, the team achieved 98.36 percent accuracy, a remarkable result.

Though the team is nearing the end of

current allotted funding, provided by the

Mellon Foundation, Rushmeier said the

group has more ideas regarding the impact

K could have on scholarly research. For

instance, with some modifications, the

algorithm could reach beyond books and

be repurposed for other heritage objects.

According to Rushmeier, in exploring

the material properties of medieval

manuscripts with computer science, we

have only “scratched the surface.”

ABOUT THE AUTHOR

AMANDA BUCKINGHAM

A junior in Berkeley College, Amanda Buckingham is double majoring in

molecular biology and English. She studies CRISPR/Cas9 at the Yale

Center for Molecular Discovery and oversees stockholdings in the healthcare

sector for Smart Woman Securities’ Investment Board. She also manages

subscriptions for this magazine.

THE AUTHOR WOULD LIKE TO THANK Dr. Holly Rushmeier and Dr.

Ying Yang for their enthusiastic and lucid discussion of a fascinating,

interdisciplinary topic!

FURTHER READING

Yang, Ying, Ruggero Pintus, Enrico Gobbetti, and Holly Rushmeier.

“Automated Color Clustering for Medieval Manuscript Analysis.”



environment

FEATURE

ICELAND’S VOLCANIC ACTIVITY

TO INCREASE WITH CLIMATE CHANGE

BY ELLIE HANDLER

PHOTO BY STEPHEN LE BRETON

The Vatnajökull ice cap is the largest glacier in Iceland,

covering eight percent of the country’s landmass.

In 2010, there was a buzzworthy eruption of the Icelandic

volcano, Eyjafjallajökull. Its ash cloud caused a huge disruption

to air traffic, cancelling thousands of European flights for five

days.

In Iceland, the legacies of volcanoes and glaciers are

largely intertwined. Telling a story about one depends on an

understanding of the other. This was certainly true for the 2010

volcanic eruption, and it has great implications for the future. As

climate change accelerates, a rise in Iceland’s magma levels could spike volcanic activity.

How do volcanoes and glaciers — a dichotomy of hot and cold

— affect one another? Scientists can look at levels of magma,

or melted rock inside the earth, to predict whether a volcano

will erupt. Magma levels are a key indicator of underground

unrest. Although rocks melt at different temperatures based on

composition, rocks held at low pressures tend to melt at lower

temperatures. The massive weight of glaciers causes significant

pressure on the earth below, compressing the crust and pressing down on the mantle, where magma forms. As glaciers melt

and their volumes decrease, they exert less downward pressure,

which allows the rock beneath to melt into magma more quickly.

Then, the increase in magma beneath the earth’s surface can have

a substantial impact on the volcanic activity above ground.

Several studies over the past decade have examined the rate of

magma formation as a result of deglaciation in Iceland. Located

along an Atlantic Ocean fault line and above a hot spot, Iceland

is a powerful source of volcanic activity. Glaciers are prominent

above its volcanic areas, posing a complicated geological problem

as deglaciation pushes forward with climate change. Warming

contributes to a faster melting of glaciers, a subsequent faster

melting of rock into magma, and the potential for more volcanic

eruptions. Climate change could spark such a series of events.

The first suggestion of a connection between deglaciation and

increased magma production in Iceland came in 1991. Two

scientists, Hardarson and Fitton, looked into deglaciation of the

late Pleistocene age and found a distinct correlation between ice

melting and magma formation. Another of the earlier studies,

published in 2008, focused on the Vatnajökull ice cap, the

largest ice cap in Iceland. The researchers found that the glacier’s

thinning and retreating caused roughly 0.014 cubic kilometers

of magma to form each year. As a result of this magma growth,

the researchers predicted an increase in volcanic activity under

the ice cap.

More recently, a 2013 study examined a larger area of the

mantle under Iceland’s crust. Led by Peter Schmidt of Uppsala

University in Sweden, the team used updated mathematical

models to understand how the mantle melts. The scientists

concluded that 0.2 cubic kilometers of magma melts each year

under Iceland’s crust — a figure that correlates to 0.045 cubic

kilometers of magma melting per year under the Vatnajökull

ice cap. These studies are not necessarily inconsistent. Rather,

the increase between their figures is due to an improved

understanding of how the mantle melts into magma.
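To put those two Vatnajökull figures side by side, the newer estimate is roughly three times the older one:

$$ \frac{0.045\ \mathrm{km^3/yr}}{0.014\ \mathrm{km^3/yr}} \approx 3.2 $$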

According to Yale geology and geophysics professor Jeffrey

Park, the explosivity of a volcano is determined by the magma’s

chemical composition. Rocks rich in volatile compounds, such as water, carbon dioxide, and sulfur, melt and form magma containing pockets of gas or liquid; silica-rich magma is viscous enough to trap those pockets, causing explosive eruptions

with large ash clouds. Eyjafjallajökull’s eruption was dramatically

explosive because it included silica-rich magma that had been

sitting in the crust of the earth for hundreds of years. In contrast,

this year’s eruption of Bardarbunga, a volcano underneath the

Vatnajökull ice cap, has emitted about eight times as much

magma as Eyjafjallajökull, but without a massive ash cloud or an

explosive eruption due to variations in magma composition.

Many unanswered questions remain about how Iceland’s

volcanoes react to deglaciation. “We don’t know how much of the

magma being generated is reaching the surface,” Schmidt said,

referencing the difficulty of estimating the probability of future

eruptions. Moreover, the distribution of magma underneath

Iceland is still unclear to researchers, who know how much

magma is produced, but not where it goes. How long magma

remains magma is also uncertain, as it will eventually solidify to

become part of the earth’s crust. Finally, researchers are unable

to thoroughly predict the composition of magma in a chamber,

making it challenging for them to know which types of eruptions

to anticipate.

Deglaciation in Iceland is causing the melting of more magma

and increasing the likelihood of volcanic activity in Iceland. But

researchers are not sure exactly how the increase in magma

volume will affect the frequency or power of eruptions. For

now, scientists remain uncertain whether an eruption with the

magnitude of Eyjafjallajökull’s in 2010 will happen again in a few

years or a few decades. What they do know is that another major

eruption is surely on its way.




An estimated 0.7 percent of power plants today use nuclear

power to sustain a whopping 5.7 percent of the world’s energy

and 13 percent of the world’s electricity. Despite the clear importance

of nuclear power plants, they do not operate without

risk. Indeed, on-site explosions can release radiation equivalent

to that of multiple atomic bombs — radiation that persists,

seemingly without end, for thousands of years.

Though nuclear plant explosions are uncommon, several

have occurred in recent decades. One of the most infamous

examples occurred in 1986, when a nuclear reactor exploded

in Chernobyl, spewing massive amounts of radioactive material

into the atmosphere. Shockingly, scientists estimate that

this explosion alone released a hundred times more radiation

than the atomic bombs dropped on Hiroshima and Nagasaki.

The explosion at Chernobyl is well known, but what is less

clear is exactly what makes radioactive matter so deadly. Radioactive

materials often emit radiation in the form of high-energy photons, or tiny packets of energy, that can pass through matter impermeable to ordinary, less-energetic photons. Human

health is at stake when radioactive elements work their

way into cells. This dangerous material can cause deaths, deformities,

and cancers when it encounters bodily tissue, depending

on the type and amount of radiation released. The infamous

Chernobyl disaster, the deadliest unintentional release of radioactive matter in history, is blamed by some estimates for close to one million deaths to date.

IMAGE COURTESY OF REBECCA ABERGEL: Rebecca Abergel stands with her team of researchers.

Until recently, scientists have known little to nothing about

how cells take in high-energy radioactive materials. This past

July, a team led by Rebecca Abergel of the Lawrence Berkeley

National Laboratory in collaboration with Roland Strong

of the Fred Hutchinson Cancer Research Center discerned a

pathway for the cellular uptake of radioactive matter. With

this new insight, the researchers hope to bring a drug counteracting

radioactive health effects to the clinic. A solution that

mitigates the bodily damage caused by high-energy photons

would aid those suffering from the aftermath of disasters like

Chernobyl. While it may be impossible to eliminate all radioactive

catastrophes, the goal for Abergel, Strong, and their colleagues

is to find better ways to respond to future disasters.

During nuclear reactions, many heavy elements autonomously

emit radiation. The fact that these heavy metals spontaneously

spew out photons of energy makes them highly

dangerous and intractable. The pathway identified by Abergel’s

team concerns heavy metals such as americium and plutonium,

classified in a group called actinides. The researchers

made a cluster of new discoveries, but most importantly, they

determined that a known antibacterial protein called siderocalin

is capable of carrying actinides into the cell.

Abergel and her group have a history of achievement in this

area. Before their discoveries about siderocalin, they had already

developed a molecule to isolate and subsequently remove

actinides from the body, which currently awaits approval

from the FDA. In a pill, the molecule may help to remove

some actinides from the body. However, its efficacy is limited

because it works mostly for metals that are still circulating,

not for metals that have already been imported into the intracellular

space. There was a gap in scientific knowledge of how

radioactive elements enter the inside of a cell, and Abergel was

determined to fill it. The limitation she noticed in her drug

helped motivate her group’s efforts to decipher a more mechanistic

understanding of how cells are contaminated with radioactivity.

Determining the precise role that siderocalin plays in the

cascade of events leading to actinide absorption was a challenging

task. The researchers combined experimental techniques

spanning different disciplines, from heavy-metal inorganic chemistry to structural biology. They hypothesized that

siderocalin might be a good protein to investigate because of

its known role in the sequestration of iron, a lighter metal,

in the cell. However, they were uncertain whether siderocalin

could carry heavier metals such as actinides — no structures

of protein complexes with such heavy metal ions had ever been reported in the scientific literature.

But the team hypothesized correctly, and found that siderocalin

can indeed transport metals heavier than iron. First,

Abergel’s group created crystals that each contained many

identical snapshots of siderocalin in the action of carrying an

actinide ion. Next, the team took its crystals to the Advanced

Light Source, a synchrotron X-ray source owned by the Department

of Energy and located at Berkeley Lab. There, the

researchers fired X-rays — high-energy photons — at their

crystals.

Because the wavelength of an X-ray is approximately the distance

between the atoms in these crystals, X-rays were unable

to pass through untouched. Instead, they were bent, or diffracted,

by the varying electron densities at different points in

the crystal. The extent to which these rays were bent created

what is known as a diffraction pattern that contained an abundance

of exploitable information about the crystal’s structure.

With further mathematical analysis of their diffraction patterns,

Abergel and her team inferred the original regions of

high and low electron density in their crystals. From this data,

the group constructed atomic models that specify the original

structures of siderocalin attached to different heavy metal ion

complexes. These atomic models help to explain the mechanism

for cellular uptake of actinides. In general, the structures

suggest that first, smaller molecules recognize actinides in the

cell and form complexes around the heavy metal ions. Then,

siderocalin recognizes these complexes and shuttles them further

into the cell to be absorbed.

The group’s discoveries did not stop there. While searching

for a mechanism for the cellular uptake of heavy metals, Abergel

and her team also found a way to readily identify the

presence of these metals in vitro, or in a test tube rather than

a living cell. It was truly a testament to science and serendipity.

The researchers discovered that the crystals they originally

prepared actually luminesced under exposure to ultraviolet


light. Through a series of follow-up tests, the team demonstrated

that siderocalin can also act as a synergistic antenna

that causes heavy metals to glow much more brightly than they

would if exposed to ultraviolet light in their bare form. This

discovery highlights potential applications for siderocalin in

the field of bioimaging, which relies on luminescent signals in

a variety of scenarios.

With new knowledge about siderocalin and actinides, Abergel’s

team hopes to improve the lives of many who have been

exposed to radioactive materials.




FEATURE

robotics

Robots with Electronic Skin

By Caroline Ayinon

Art By Ashlyn Oakes

The race to develop viable, efficient robotic skin is on.

Such a technological triumph could make robots more

durable for use in a variety of settings, and could

even pave the way for improvements in human prosthetics.

Currently, an innovative and versatile material called graphene

appears to be the front-runner in this race. A research team at

the University of Exeter has developed a new way to produce

graphene that could allow for the creation of electronic skin.

Graphene is an incredibly versatile material that is just one

carbon atom thick — so thin that researchers consider it two-dimensional.

An ultra-thin graphene sheet is transparent,

absorbing just 2.3 percent of the light that hits it. Graphene

is also an excellent conductor of electricity. Since electrons

are able to travel through it with virtually no interruption,

it conducts electricity up to 200 times faster than silicon, a

material it commonly substitutes. And while graphene can be

easily engineered into a soft powdery substance such as pencil

graphite, its flat honeycomb pattern also makes it the strongest

material in the world.
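That 2.3 percent figure is not arbitrary. A well-known result in graphene optics, noted here as general background rather than a claim from the Exeter study, is that a single suspended layer absorbs a fraction of visible light set by the fine-structure constant α:

$$ \pi\alpha \approx \pi \times \frac{1}{137} \approx 0.023 $$

or about 2.3 percent per layer.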

While scientists began to study the concept of graphene as

early as the 1940s, many then believed that the isolation of a

two-dimensional material was physically impossible. Graphene

did not come to the forefront of research initiatives until 2004

and 2005, when papers from the University of Manchester and

Columbia University published descriptions of its versatile

properties. Soon after, a team at Manchester isolated layers

of the material 10 carbon atoms thick from graphite using

a mundane product: tape. Later, the same team refined this

method to isolate a single layer using more advanced tools.

With the ability to synthesize graphene into layers, researchers

began to discover rich possibilities for the material. Graphene

layers rolled into cylinders form carbon nanotubes, which are starting to appear in tennis rackets, bicycles,

and 3D printed organs. When these same layers are wrapped

around each other, graphene can form spherical carbon

molecules called fullerenes, which are currently the focus of


many research studies because of their use in drug delivery.

Since graphene’s structure contains flexible carbon bonds, it

can bend and stretch in a multitude of ways without breaking,

opening up further possibilities for its use in devices such as

phone screens and plasma televisions.

Now, a group of University of Exeter researchers led by


Monica Craciun has discovered a new technique for graphene

synthesis that could revolutionize the use of this material.

Recently published in Advanced Materials, the new method —

called resistive-heating cold-wall chemical vapor deposition —

is an improved version of the currently used regular chemical

vapor deposition technique, or CVD. Traditional CVD relies

on the use of a specific substrate to collect a deposit of gaseous

reactants. This process involves heating coils of copper inside

a quartz furnace to about 1,000 degrees Celsius for several

hours, which requires a lot of energy and produces a lot of

methane gas. CVD has been used and modified for several

years, but up to this point, the process has been too costly and

painstaking to be widely used.

The new resistive-heating cold-wall CVD is a simplified

version of the time-tested CVD method. Craciun and her team

were able to modify the process to selectively heat just the

copper foils, eliminating the need for hydrocarbons required

in the older version. This method shortens the entire reaction and eliminates the release of methane, a flammable greenhouse gas.

Since resistive-heating cold-wall CVD hinges on a concept

that has already been used with much of the same equipment

to manufacture other materials, it could be employed

economically. Manufacturers entering the graphene industry

would not have to spend money on new facilities and would

instead be able to mass-produce the material with machinery

that is already available. Furthermore, Craciun’s technique is

a much simpler process, producing graphene of the same quality 100 times faster and at roughly 99 percent lower cost.

Using their improved graphene synthesis technique, Craciun

and her colleagues developed the world’s first flexible,

transparent touch sensor. Working with another Exeter team

led by Saverio Russo, they found that molecules of ferric

chloride inserted between two layers of graphene enable a

transparent conductor system that could replace silicon and

other materials in flexible electronics such as touch screens

and LCDs. In these devices, touch sensors provide the main

interface for detecting human input. When compared to the

touch sensors widely used today, the graphene-based sensors

developed by Craciun’s teams have exhibited some of the fastest

response times and most acute sensitivity to human touch.

Improvements in graphene synthesis could also enable

researchers to create flexible, sensitive skin that would

transform robotics technology. The machines that we associate

with the term “robot” most typically have rigid, metal shells.

While these hard-skinned robots have enormous capabilities

in a wide range of fields such as space exploration and warfare,

their inflexibility makes them susceptible to damage such as

breaks and scratches. To avoid such damage, researchers have

recently begun to develop robots made from softer materials

such as plastic and rubber, which allow robots greater

flexibility in avoiding obstacles and navigating through tight

spaces. However, these softer materials are fragile and have

also proven to be relatively ineffective at protecting robots

from damage.

This is where the graphene-based touch sensor skin would

come in. Similar to human skin, it would offer a great balance

between protection and flexibility and would allow the robots

a vast range of movement. Additionally, it could respond to

external stimuli from the environment and could guide the

robot’s responses just as neurons in our skin do. Specific

algorithms would govern the robot’s responses to various

physical stimuli, extending its perceptual capabilities. The

algorithms would interpret and analyze the information

received by touch sensor skin and would use it to guide the

robot’s resulting actions.

With soft, electronic skin, robots could prove more useful

in areas such as search-and-rescue missions, where hazardous

and unpredictable environments pose a threat to both humans

and currently available robots. Robots with tough artificial

skin could survive large jumps or falls, bend or stretch as

necessary to make it through difficult openings, and avoid

major harm in the process. For similar reasons, these next

generation robots could also see a potential application in the

exploration of the moon and space.

Another even more powerful application of Craciun’s

discovery is the potential use of her new method in research

pertaining to the development of artificial skin in human

prosthetics. Materials currently used in prosthetics have been

unable to replicate the hysteresis curve of human skin — the

way skin reacts to pressure forces. Graphene-based touch

sensor technology may just hold the answer.

An innovative concept, resistive-heating cold-wall CVD

has attracted a lot of attention from engineers and scientists

around the world. With its simplified production process, it

may just prove to be the future of many fields of technology

and engineering. What awaits is a world of extremely precise

touch screen electronics and robots with skin as sensitive and

intelligent as ours.




FEATURE

computer science

COMPUTER ANALYSES PREDICT

ONSET OF PSYCHOSIS

BY KENDRICK MOSS UMSTATTD

Many view mathematics and language as two distinct areas of

study. But what if math could shed light on the significance of

the speech patterns of someone at risk of developing psychosis?

A recent computer algorithm developed by Guillermo Cecchi

of IBM and Cheryl Corcoran and Gillinder Bedi of Columbia

University demonstrates that mathematical speech analysis

can lead to some fascinating findings.

Schizophrenia, which afflicts approximately one percent of

Americans, is one disease that can be better understood

with the use of speech analysis. The condition is characterized

by a number of symptoms, including psychosis — a feeling

of being detached from reality — and speech that deviates

from normal patterns. People with schizophrenia often have a

difficult time staying on one train of thought, instead jumping

from one topic to another as they speak.

Although psychologists have made great strides to better

understand the composition of a brain with schizophrenia,

there has been a comparative lack of information about the

behavior of those at risk of developing psychosis later in life.

Currently, the primary interviewing method for predicting

psychosis relies on human analysis of speech patterns. With

a 79 percent accuracy rate, this method is fairly reliable — but

what if its accuracy could be increased to 100 percent?

As the most objective and meticulous of analyzers, computers

could achieve this perfect record in predicting psychosis from

speech. Corcoran, who has a background in schizophrenia

prognosis, said that although a researcher speaking to a group

of teenagers cannot tell who will develop schizophrenia, a

computer can pick up subtle language differences among the

group. In a study of 34 people, computer analyses of speech

patterns in interviews perfectly predicted which five of the

patients would later develop psychosis.

To conduct this computer analysis of speech, researchers

first had to establish a paradigm of normal speech patterns.

They studied word relations from famous works of literature,

including Charles Darwin’s On the Origin of Species and Jane

Austen’s Pride and Prejudice. For example, the words “chair”

and “table” were classified as related because they often

appeared in close proximity in writing and speech, whereas

“chair” and “dragon” were not related because these words

almost never appeared together.

Using this understanding of word relations, a computer

could analyze the speech from patient interviews to examine

complexity and semantic coherence, or the relation of adjacent

words in a sentence. The computer analysis then created what

Cecchi describes as a syntactic tree of the patients’ speech

patterns. The more cohesive and complex the speech, the more

elaborate the tree — and the more likely that the patient would

continue to behave normally. However, choppy, tangential

speech — represented by a short tree with underdeveloped

branches — indicated that the patient had a relatively high

likelihood of later developing psychosis. This speech analysis,

coupled with examination of the patient’s behavior, could

provide researchers with a more holistic understanding of

psychosis.
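As a toy illustration of the idea, and not the IBM and Columbia researchers' actual code, word vectors can be built from co-occurrence counts in a small reference text and the similarity of adjacent words averaged into a coherence score; the corpus, window size, and scoring choices below are invented for the example.

```python
# Toy sketch of semantic-coherence scoring, loosely modeled on the idea
# described above; the corpus, window size, and scoring are illustrative only.
from collections import defaultdict
from math import sqrt

corpus = "the chair stood by the table while the dragon slept far away".split()
WINDOW = 2

# Build co-occurrence vectors: vec[w][c] = how often c appears near w.
vec = defaultdict(lambda: defaultdict(int))
for i, w in enumerate(corpus):
    for j in range(max(0, i - WINDOW), min(len(corpus), i + WINDOW + 1)):
        if i != j:
            vec[w][corpus[j]] += 1

def cosine(a, b):
    """Cosine similarity between two sparse count vectors."""
    dot = sum(a[k] * b.get(k, 0) for k in a)
    norm = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def coherence(transcript):
    """Average similarity of adjacent words: higher means more cohesive speech."""
    words = transcript.lower().split()
    pairs = list(zip(words, words[1:]))
    return sum(cosine(vec[a], vec[b]) for a, b in pairs) / len(pairs)

print(coherence("the chair by the table"))   # relatively high
print(coherence("chair dragon away stood"))  # lower: these words rarely co-occur
```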

The next step for these researchers is to validate the

results with a larger sample size. Once this is completed, the

possibilities for implementing the research are broad. The

study’s results not only shed light on the condition of those who

suffer from psychosis, but also provide a better understanding

of the general population’s mental state. “[Psychosis is] just

one end of the spectrum,” Cecchi said. “We all express these

conditions, and they form part of our mental life.”

With this knowledge, artificially intelligent robots could be

designed to more accurately represent the way people think

and act. The research could also be applied in medical care:

While search engines are optimized for individuals and social

media pages offer streams of personalized updates, there is not

yet an app that provides diagnoses for users based on whether

their speech is slurred. Beyond behavioral tracking, cell

phones could also be equipped with physiological-monitoring

capabilities to better track users’ heart rates or record their

brainwave activity.

This research could be meaningful in scientific efforts to

understand other elements of the human condition. The next

step is to determine what questions about speech patterns

need to be answered, and which speech variables can answer

these questions. Intonation or cadence, for example, may

be missing links in our understanding of a psychological

condition. Where will the results take us? If math continues to

be used as a key to unlocking the patterns behind behavior, the

possibilities seem endless.

ART BY ALEX ALLEN



DEBUNKING SCIENCE

BY RAUL MONRAZ

Last spring, an M9.6 earthquake wreaked havoc in California. The

long overdue, gargantuan quake leveled the cities along the infamous San

Andreas Fault line. Los Angeles, San Francisco, and their surroundings

were plunged into chaos. Unleashing violent tremors from deep beneath

the earth, the disaster triggered fires, power outages, and the mother of

all tsunamis.

Rather than being petrified in fear, our Californian peers can assure

us that they witnessed this catastrophe over popcorn and soda from the

safe vantage points of darkened movie theaters, confident that Dwayne

“The Rock” Johnson would save the day. San Andreas, Hollywood’s

latest natural disaster blockbuster, played on the anxieties of many West

Coast denizens by offering a glimpse of what is to come when the next

anticipated mega-earthquake actually hits.

Not counting Johnson’s unlikely stunts, the film got most of the

generalities of emergency protocol right. As disasters strike throughout the

film, characters know to drop immediately to the ground and hide below

sturdy objects. Characters recognize the sea’s drawing in as a predictor of

an incoming tsunami. Early warning systems blare across the coast, saving many lives by urging people to higher ground. Fans watching

San Andreas get a rudimentary course in emergency management:

“What to do when Seismic Hazards, Inundations, and Tsunami hit you.”

Nevertheless, this film would probably not be a box office hit without

some well-done, albeit hugely exaggerated, CGI. The dramatic implications

of unrealistic events are enough to cause moviegoers to gawk in awe.

With the aid of movie magic, the film perpetuates three big scientific

inaccuracies: the magnitude of the earthquake and its consequences, the

size and very occurrence of the tsunami, and the existence of a high-tech

magnetic pulse model for predicting earthquakes.

In the movie, even the first earthquakes — between 7.0 and 8.0 on the

Richter scale — produce much more damage than they would in reality.

Additionally, seismic waves in the film violently shake and collapse the

majority of city buildings; with gross inaccuracy, an M7.1 quake obliterates

the Hoover Dam. In 2008, a panel of U.S. Geological Survey experts

modeled the impact of a big earthquake in the southern California area.

The project predicted major structural damage, but mostly to buildings that fail to comply with building codes or that have not been retrofitted to

withstand earthquakes. In all, few buildings would come to the point of

total collapse, and most would be within 15 miles of the San Andreas

Fault, rather than spread far and wide.

Still, viewers who have not experienced a major earthquake themselves

may take the destruction simulated in San Andreas at face value, since

real-world media outlets similarly dramatize disaster damage. In their

coverage, the buildings shown are typically those that have sustained

the most damage during earthquakes, rather than those that have been

left mostly unscathed. Even the most devastating earthquakes, such

as an M7.9 one that afflicted Nepal last April, did not cause a majority

of buildings to collapse. A survey by the Nepali Engineers Association

found that only 20 percent of buildings sustained major damage from the quake. About 60 percent of the buildings that collapsed in the area were built of masonry without steel reinforcement, construction methods

outlawed in California since 1933.

The film really starts to wander into fiction when Paul Giamatti’s

character, a purported geological expert, goes on national television to

announce the onset of a “swarm event,” a string of unfolding earthquakes

rippling from Nevada to San Francisco. According to his “magnetic

earthquake prediction model,” the geologist warns Americans that “The

Big One” will ultimately strike San Francisco with magnitude 9.6. Its force,

he says, will be such that “the earth will literally crack open.”

Earthquake swarms are real: several earthquakes can indeed occur within a relatively short period of time. However, swarm earthquakes typically fall within a narrow magnitude range and lack a single, distinguishable main shock. While a swarm could account for the multiple quakes

in the movie, the San Andreas quakes have magnitudes far higher than

those typical of real-life swarm earthquakes. For comparison: a swarm of

101 earthquakes took place from July to November in Nevada last year,

with a maximum magnitude of 4.6 — far milder than the M9.6 quake

predicted in San Andreas.
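The gap is even starker in energy terms. Using the standard rule of thumb that seismic energy scales as ten to the power of 1.5 times the magnitude, a quick back-of-the-envelope calculation (not part of any study cited here) shows what M9.6 would really mean:

```python
# Back-of-the-envelope comparison using the standard Gutenberg-Richter
# energy relation, E proportional to 10^(1.5 * M).
def energy_ratio(m_big: float, m_small: float) -> float:
    """How many times more seismic energy the larger quake releases."""
    return 10 ** (1.5 * (m_big - m_small))

print(f"M9.6 vs. the M4.6 Nevada swarm quake: ~{energy_ratio(9.6, 4.6):,.0f} times the energy")
print(f"M9.6 vs. the M7.9 Nepal quake:        ~{energy_ratio(9.6, 7.9):,.0f} times the energy")
```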

When it comes to tsunamis, even a small one is extremely unlikely to

happen. The San Andreas Fault is located inland, far away from the coast.

To produce a tsunami, an earthquake must displace the ocean floor, or at least strike very near the coast. Finally, the magnetic-pulse prediction model is so far

— unfortunately — only science fiction. If such a model existed, it would

have been implemented already, as predicting these natural disasters

would surely save many lives.

While we can appreciate San Andreas’ wake up call for preparedness,

its science is implausible. The movie crosses into science fiction by

greatly exaggerating the destructive power of a natural phenomenon and

blatantly conjuring up impossible scenarios. Californians, you need not be

Hollywood superstars to weather The Big One — just educate yourselves

on earthquake safety and be ready.

IMAGE COURTESY OF NEW LINE CINEMA

Dwayne “The Rock” Johnson stars in the 2015 summer blockbuster

San Andreas. The movie was a dramatic take on what would

happen if a major earthquake struck the West Coast.





FEATURE

electronics

Sørensen’s and his team were after a substance that naturally

organizes into well-defined layers. They wanted something that

would not only sandwich thin films of electronic components, but

would also align these components in the same direction. Here,

they turned to soap.

This may seem like a surprising choice, since day-to-day experiof

electronics

BY NAAMAN MEHTA

Throw a potpourri of transistors into the bathtub, add some

soap, and out comes a fully formed nanocomputer. Science

fiction? Maybe not. Nanoscientists dream of coaxing

electronic components to self-assemble into complex systems.

In fact, researchers at the University of Copenhagen have taken a

major step towards making self-assembling electronics a reality.

In August, the researchers — many of whom were first-year

undergraduate students at the time of the work — reported that

they had successfully induced randomly oriented molecular

components to organize themselves into uniform sheets. At a time

when electronic components are so small that it is a formidable

challenge to position them accurately, self-assembly presents an

elegant solution. Soap was the key ingredient to their success,

forming thin films that sandwich the target molecules and precisely

guide their orientation.

“Imagine you have a billion nanocomputers but they are all

randomly oriented. You can’t harness the incredible computing

power, nor can you ‘plug in’ the keyboard, the mouse, or the screen,”

said Thomas Just Sørensen, leading investigator on the study and an

associate professor at the University of Copenhagen. “We need [the

nanocomputers] to be orientated in the right way to each other, and

that’s what our work seeks to accomplish.”

The promise of self-assembly

Nature provides inspiration for the flurry of work on self-assembly.

From the aggregation of phospholipid molecules

into cell membranes to the association of protein subunits into

nanomachines that churn out energy when we metabolize sugars,

nature creates elegant and intricate structures. These structures

form spontaneously, without outside intervention — the tendency

to self-assemble derives from the nature of the materials themselves.

Self-assembly holds great appeal given that electronic components

have become incredibly small. Currently, the transistors that make

up computer chips are patterned and wired together using light, a process known as photolithography, but this top-down approach is limited by the

light’s wavelength. With bottom-up nanoscience and the right

materials, the building blocks could do the hard work of assembly

themselves.
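To put a number on that wavelength limit, the Rayleigh criterion gives the smallest feature a projection system can print as roughly k1 times the wavelength divided by the numerical aperture; the sketch below plugs in typical textbook values, not figures from any particular chipmaker.

```python
# Rough sketch of the optical limit on top-down patterning, using the
# Rayleigh resolution criterion. k1 and NA are typical textbook values.
def min_feature_nm(wavelength_nm: float, numerical_aperture: float, k1: float = 0.4) -> float:
    """Approximate smallest printable half-pitch for a given exposure wavelength."""
    return k1 * wavelength_nm / numerical_aperture

# 193 nm deep-UV light with water-immersion optics (NA about 1.35):
print(f"{min_feature_nm(193, 1.35):.0f} nm")   # on the order of tens of nanometers
```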

Besides, self-assembling materials are more resilient than their

traditional counterparts. If they can self-assemble once, it is

generally safe to assume that they can self-assemble again upon

suffering any damage. “If you break part of the material, there will

be some kind of self-healing effect,” Sørensen said.

According to Sørensen, display technology is one field where self-assembling

electronics promise a big splash. “All the technology in

our smartphone is remarkably robust except for the screen,” he said.

“There are no movable parts really. So you can hit it with a hammer,

and if the screen doesn’t break, probably nothing else will.”

Soap: The magic ingredient


Sørensen and his team were after a substance that naturally organizes into well-defined layers. They wanted something that would not only sandwich thin films of electronic components, but would also align these components in the same direction. Here, they turned to soap.

This may seem like a surprising choice, since day-to-day experience suggests that mixing soap and circuitry is a bad idea. But the

molecules that make up soap are excellent at forming layers (think:

soap films). The water-loving ends of these molecules tend to stick

together, as do their water-fearing tails. These films, the researchers

hoped, would provide a regular template to guide the orientation of

all molecular components added.

Not just any type of soap will work. As the team found, soap

molecules found in common items such as shampoo and toothpaste

lack the required rigidity to hold the molecular components tightly

in place. Eventually, the group settled on a more grease-loving soap,

benzalkonium chloride, which also happens to be an anti-fungal

drug.

The team produced impeccably organized structures simply by

mixing these soap particles with a range of dye molecules. The soap

molecules quickly sought out other soap molecules and organized

into thin films that effectively glued together layers of dye molecules.

Even more impressive: the dye molecules oriented themselves in a

common direction, lying flat on their sides just as a layer of bricks

would pave a walkway.

Still, Sørensen estimates that self-assembling electronics may be

more than ten years away. In this proof-of-concept experiment,

the researchers did not work with actual electronic components.

Instead, they substituted similarly sized dye molecules. The

nanomaterials they produced do provide insight into how soap

organizes other molecular components. These materials may have

interesting conducting properties in their own right — but they are

not functioning electronic parts.

Even so, finding a material that can interact with other molecules

to produce these elegant sheets is a leap forward, Sørensen said.

Better yet, the scientists have already replicated their results.

Working with a range of dyes with many different shapes, the team

has observed self-assembly in 16 different nanomaterials. The

Copenhagen scientists, among other researchers, are now chasing

the next big break: translating these results to make functioning

electronic parts.

A different philosophy of science education

The Copenhagen team’s work represents a breakthrough in

science education as much as it does a breakthrough in science.

This research grew out of coursework completed by first-year

undergraduates as part of a laboratory class. Instead of conducting

run-of-the-mill experiments, these freshmen enrolled in the

university’s nanoscience program and had the opportunity to

dedicate themselves to a modern engineering problem.

For Ida Boye, who took part in the fourth year of this research

and who will be graduating this year, it was thrilling to realize that

no one yet knew the answer to the questions that the team was

tackling. “You have to think for yourself and try to come up with

ideas, because there is no textbook telling you what is right and

wrong,” Boye said.

Aske Gejl, one of the second batch of students now completing

his master’s degree in nanoscience, credited this experience for his

continued passion for research and inquiry. “The project still stands

as one of the most important during my time at the university,

as this was the first time I was trusted and enabled to aid in the

progress of real scientific work. This only fueled my desire to strive

for an academic career,” Gejl said.





Getting first-year students involved in research, pushing them

to confront important questions, and having them see their work

published in journals was a major achievement for the university

staff, Sørensen said.

“The university is not just a teaching institute but also a research

institute, and it’s important that [students] get to see this other side

of the university,” he said.

Towards functional electronics

IMAGE COURTESY OF JES ANDERSEN/UNIVERSITY OF COPENHAGEN

Sørensen is already looking ahead. As the classroom experiment

moves into its sixth iteration, he hopes that the incoming batch of

students will be able to build upon the existing work and produce

functional self-assembling electronics.

Research teams elsewhere are hard at work trying to produce

these layered devices by self-assembly. These groups typically work

with larger compounds known as polymers instead of the smaller

soap molecules that have worked so well for the Copenhagen teams.

Sørensen explained that it might be easier to plug electrodes into

materials made using long polymer strands, which take the form of

boiled angel hair pasta: Each strand serves as a conducting wire, and

it suffices to use an alligator clip that contacts the material at any

two points. Soap films, on the other hand, require contacts small

enough to pinch each individual film layer — just nanometers thick

— and engineers have not yet developed electrodes that small.

Sørensen’s class, however, will continue to work with soap. He

believes that there is value in pursuing this different path, especially

for the first-year students who can afford to take bigger risks because

they have less at stake.

One way or another, Sørensen said, self-assembly will deliver.

“One day, we’ll be able to spread a thin layer of solar cells on the

window and start generating solar power,” he said.

Self-assembling computers may still be a nanoscientist’s fantasy for

now, Sørensen conceded. But as these remarkable films effortlessly

organize tiny components with a dexterity that has eluded man’s

best efforts, he is content to look on in wonder, marveling at how

soap and self-assembly could shape the future of our most advanced

electronics.



FEATURE

engineering

WHO LIVES ON A DRY SURFACE

UNDER THE SEA?

BY AVIVA ABUSCH

April showers bring May flowers — and a host of other

problems. After donning what is marketed as a water-resistant rain jacket and wielding an umbrella to battle

the elements, there is nothing quite as disheartening as

feeling rain soak into a dry layer of clothing, or knowing

your thin backpack containing several textbooks and a

computer is slowly being saturated. What if rain gear were engineered so that it could not get wet at all, and could actually repel water? Researchers at Northwestern

University are exploring this question as they work to

develop a material that stays dry underwater.

The research team, led by Northwestern mechanical

engineering professor Neelesh Patankar, began by looking

for properties that would allow a surface to be immersed

in water but emerge completely dry. Drawing inspiration

from water bugs, whose fine leg hairs repel water by

retaining gas pockets, the scientists began constructing

a material that keeps water away using microscopic or

nanoscopic ridges. The goal of their research was to

harness this fantastic feat of natural engineering.

First, they had to find the critical roughness scale for

gas trapping in a man-made material — the correct width

and spacing for ridges on a textured surface such that

they could trap gaseous air and water vapor in between.

This design would force water to rest on the ridge peaks and the trapped gas, rather than wetting the material beneath.

Of course nothing in science is ever quite so simple,

as the team found in attempting to engineer a perfect

texture. There was little to no research available on how

to create an effective surface roughness to deflect water.

And for the material to stay dry, the gas contained in

the valleys of the ridges would have to remain trapped

indefinitely. As pioneers in their field, Patankar and his

fellow researchers went through a series of experiments

to find the optimal distance between ridges.

Initially, their sample materials containing microscopic

ridges lost their ability to deflect water after only a few

days. By putting several samples through aging and

degassing trials, they discovered that their ideal material

needed even smaller ridges — on the nanoscopic scale.

In fact, the material that successfully withstood the

degassing trials had ridges roughly 10 times smaller than

the width of a strand of spider silk.

The discovery that they needed to work on the nanoscale

was a turning point for the researchers. According to

Patankar, they noticed that once the valleys dipped below

one micron in width, the pockets of water vapor created

due to underwater evaporation and effervescence finally

withstood the test of time. The trapped gas continued to

successfully deflect water, even after the scientists made

multiple attempts to dislodge it.
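A rough order-of-magnitude estimate, not taken from the Northwestern team's analysis, hints at why the sub-micron threshold matters: the capillary pressure a trapped gas pocket can resist scales inversely with the gap width, and only for gaps below about a micron does it exceed atmospheric pressure (roughly 101 kPa).

```python
# Order-of-magnitude sketch of gas trapping in surface ridges, assuming a
# strongly non-wetting surface; numbers are illustrative, not the team's data.
SURFACE_TENSION = 0.072  # N/m, water at room temperature

def capillary_pressure_kpa(gap_width_m: float) -> float:
    """Approximate pressure (kPa) a gas pocket spanning a gap of this width can resist."""
    return 2 * SURFACE_TENSION / gap_width_m / 1000

for width in (10e-6, 1e-6, 100e-9):   # 10-micron, 1-micron, and 100-nanometer valleys
    print(f"{width * 1e9:>6.0f} nm gap -> about {capillary_pressure_kpa(width):,.0f} kPa")
```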

Beyond a future of water-repellant backpacks and

umbrellas, the material created by Patankar’s team has

the potential to change and economize major world

industries. Because water cannot stick to this surface,

it could revolutionize plumbing, especially in big cities.

In pipes lined with the material, the drag that currently

occurs due to the interaction between the interior of

the pipe and the fluids within it would be sharply reduced,

meaning water and liquid waste could be transported

much faster. Additionally, the material could be used to

make more weatherproof roof tiles and house sidings.

This would greatly reduce the frequency with which

homeowners have to undergo costly renovations for

basic maintenance, and could have a lasting impact on

both architecture and realty.

These tiny ridges offer tremendous possibilities. Their

applications are limited only by engineers’ imaginations.

Understanding water deflection could improve footwear,

kitchen appliances, outdoors supplies, aquatic sports

equipment, underwater research capabilities, and more.

Researchers can use ever-dry surfaces to achieve big

projects — provided that they first remember to think

small.

IMAGE COURTESY OF DARTMOUTH COLLEGE

Patankar’s research ideas were derived from the water

deflection capabilities of water bug legs. These bugs are one

example of how nature can inspire next generation technology.



Science or

Science Fiction?

BY AMANDA MEI

Telepathy and Mind Control

Imagine stepping into a room and catching the eye of

someone inside. You exchange no words, you give no

smile. But somehow, you both know you’re saying “hi.”

It’s like telepathy — your brains are in sync.

What if you were in India, and the other person in

France? Would brain-to-brain communication still be

possible?

According to research done by scientists in

Barcelona, Boston, and Strasbourg, the answer is yes.

The study marked the first time conscious thoughts

were transmitted directly between individuals. By

recording the brain signals of one person in India with a

computer system, converting them into electrical brain

stimulations, and relaying them to recipients in France,

the research team developed a noninvasive method of

brain-to-brain communication. The transmissions were

simple greetings: “hola” in Spanish and “ciao” in Italian.

“This represented the first time a human knew what

another was thinking in such a direct way,” said Giulio

Ruffini, CEO of Starlab Barcelona and author of this

study.

To achieve brain-to-brain communication, the team relied

on a process called synaptic transmission. Chemical

signals are transmitted between neurons through spaces

called synapses, generating electric impulses in the receiving

neurons. These impulses drive brain function for

activities including motor skills and sensory perception.

In the experiment, the non-invasive technologies electroencephalography

(EEG) and transcranial magnetic

stimulation (TMS) were used as interfaces with neuronal

synaptic signaling. EEG works with the sender of a message:

the technology uses a helmet-like device with electrodes

to record electrical activity from firing neurons

in a participant’s brain. Then, TMS takes this communication

to the recipient: the technology electrically stimulates

parts of the recipient’s brain to produce impulses

that can be perceived.

On the sending side, otherwise known as the brain

computer interface, researchers encoded the words on

a computer in binary code. The computer cued one

subject in Thiruvananthapuram, India to think about

moving either his hands for transmission of one or his

feet for zero. Then, the subject’s conscious thoughts were

recorded by EEG, decoded by a computer as a one or a

zero, and emailed to researchers in Strasbourg, France.

PHOTO BY AYDIN AKYOL

At the recipient computer brain interface, the EEG signals received by email were converted for real time use

in TMS. This stimulation was then delivered to at least

three subjects, all healthy and between the ages of 28 and

50. TMS technology applied pulses to a recipient’s right

occipital cortex, which processes visual information. Then,

with her vision darkened by a blindfold, the recipient was

able to perceive flashes of light called phosphenes in her

peripheral vision. For the binary signal one, TMS induced

phosphenes, whereas for the binary signal zero, TMS was

manipulated so that there were no visual signals. Finally,

the recipients and the research team could decode the

binary signals back into the original words.
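As a rough illustration of that chain, and not the coding scheme actually used in the study, the sender-side and recipient-side steps might look like the sketch below, with plain 8-bit characters standing in for the experiment's own encoding.

```python
# Illustrative sketch of the encode/transmit/decode chain described above.
# The 8-bit character encoding is an invented stand-in, not the study's scheme.
def word_to_bits(word: str) -> list[int]:
    """Sender side: turn each character into 8 bits (imagined hand/foot movements)."""
    return [int(b) for ch in word for b in format(ord(ch), "08b")]

def bits_to_word(bits: list[int]) -> str:
    """Recipient side: group phosphene observations (1) and blanks (0) into characters."""
    chunks = [bits[i:i + 8] for i in range(0, len(bits), 8)]
    return "".join(chr(int("".join(map(str, chunk)), 2)) for chunk in chunks)

bits = word_to_bits("hola")        # EEG would decode each imagined movement as a bit
print(bits_to_word(bits))          # TMS-evoked phosphenes decoded back to "hola"
```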

Some previous studies have also used electrical

impulses in brain-to-brain contact, and researchers

have demonstrated human-to-rat and rat-to-rat brain

communication. In 2013, researchers at the University of

Washington induced human-to-human brain interfacing

for the first time. One man playing a video game imagined

pushing a button, causing another man in a different

room to subconsciously press the button. The results

from this experiment suggest new research directions for

noninvasive brain-to-brain communication, including

the transmission of emotions and feelings. “As we see it,

brain-to-brain interfaces are full of possibilities…. Most

of the world-changing tech innovations in mankind were

innovations of communication,” said Andrea Stocco, one

of the authors of the University of Washington paper.

Still, scientists who conduct brain-to-brain research

warn the public not to interpret brain-to-brain communication

as either telepathy or mind control. Telepathy

implies the exchange of information without any of our

sensory channels or physical interactions — we usually

imagine people sending thoughts to each other through

thin air. Scientists talk instead of hyperinteraction, or the

transmission of information from one brain to another

using non-invasive but still physical mechanisms.

As for mind control, Ruffini said he has no idea how

we could begin to achieve it. “There is no magic in our

work. It is based on hard-core science and technology

and mainly on Maxwell’s equations,” he said.

Despite what science fiction says, you cannot influence

the minds of other people or exchange thoughts with

them without both your senses and technology. People

might be 5,000 miles away or only a few steps across

the room, but unless they agree to wear helmets and

blindfolds, “hi” (or “hola” or “ciao”) is just a fantasy.



UNDERGRADUATE PROFILE

GREG MEYER (MC ‘16)

CLIMBING MOUNTAINS, CONQUERING PHYSICS

BY GLORIA DEL ROSARIO CASTEÑEDA

Growing up surrounded by the beautiful landscapes of Vermont,

Greg Meyer (MC ’16) had a passion for motion. He remembers

how, as a child, he would play with sand for hours, watching it

change shape in his fingers and developing a basic intuition for

how things work. High school brought outdoor sports loaded

with terrifying thrills: kayak racing, 40-foot dives, and mountain

biking with his friends. “A basic understanding for a lot of

scientific things can come from just experiencing them and just

playing with them,” Meyer said. His early passion for hands-on

encounters would propel him into physics research at Yale, CERN,

and the National Institute of Standards and Technology.

When he started at Yale, Meyer's dedication to FOOT, Yale's Freshman Outdoor Orientation Trips program, took root easily. He participated as a freshman and would continue to

be involved throughout his four years. At first, he took courses

spanning a smorgasbord of scientific disciplines — engineering,

neuroscience, and physics. Meyer soon settled on physics: It was

the most fun, an extension of his childhood curiosity about how

things work.

During his first two summers as an undergraduate, Meyer

conducted research in high-energy particle physics at CERN,

in Meyrin, Switzerland. While working at the laboratory, Meyer

lived in a small French town called St. Genis-Pouilly at the base

of the Jura Mountains. Living in France, it turned out, was not

only more affordable than Switzerland, but also presented some

interesting options for hiking.

While at CERN, Meyer conducted his thesis research on

supersymmetry — a system of mathematical predictions used to

anticipate and fix problems that arise with the Standard Model.

The Standard Model represents our current understanding of how

physical matter is made. Supersymmetry suggests that particles

already known in the Standard Model have partner particles. The

properties of these partner particles lead to cancellations that

could fix problems with the Standard Model, such as the question

of why the Higgs boson has the mass that it does.

In his thesis research, Meyer was searching for the stop quark, the hypothesized supersymmetric partner of the Standard Model's top quark. To investigate

the stop quark’s existence, researchers examine the products

of decay events — processes in which unstable particles are

converted into energy or lighter particles. They look for events that

could be attributed to the quark. Still, many more trials are needed

to see sufficient statistical evidence that the stop quark exists. And

if it does, it would be rare.

Although Meyer no longer works with the supersymmetry

researchers at CERN, their project is ongoing: Another trial at a

higher energy is currently underway in search of better evidence


for the stop quark and for supersymmetry. Meyer’s thesis predicts

that if support for supersymmetry is not found during this higher-energy

run, the evidence may actually contradict the theory

behind supersymmetry. Thus far, supersymmetry has been elusive

even to the most dedicated physicists, and all evidence in support

of the theory has been indirect.

IMAGE COURTESY OF GREG MEYER

During a trip to Volcanoes National Park in Hawaii, Meyer hiked to a volcano crater.

This past summer, Meyer sought a more hands-on project to

continue his work in physics. At the National Institute of Standards

and Technology (NIST) in Colorado, his research concerned

atomic clocks, which use electromagnetic radiation from atomic

transitions to keep track of time. Meyer created a computerprogrammed

device to account for the effects of magnetic fields

on the clocks at NIST. This research reprised a familiar theme —

his determination to put his talents towards better understanding

and experiencing the world.

Meyer continues to expand his knowledge of physics at Yale. As

a junior, he joined the Drop Team, which conducts microgravity

research. As he considers the future, he weighs his many areas

of interest, and looks forward to attending graduate school in

physics.

Of course, Meyer also makes time for outdoor activities —

Ultimate Frisbee, mountain biking, FOOT, and slacklining. “I love

doing physics and thinking, but sometimes it’s just nice to let your

cerebellum take over,” he said.

He did have one more thing to add, summing up his lively

nature: “Shout-out to my FOOTies!”



ALUMNI PROFILE

MICHELE SWANSON (YC ‘82)

MICROBIOLOGIST AND MENTOR

BY PATRICK DEMKOWICZ

As a young woman, Michele Swanson ’82 did not anticipate

attending an Ivy League college. “I was one of six kids growing

up in Ohio. My dad was the first in his family to go to college,”

she said. Now, Swanson is a professor of microbiology at the

University of Michigan Medical School and a leader in the

American Society for Microbiology.

Swanson credits her success to the mentors who saw her

potential as a young adult. Following their examples, she now

mentors and advocates for young scientists herself. She believes

in the importance of public education on topics in science, and

even co-hosts a podcast to spread knowledge of microbiology.

Swanson’s journey to New Haven began when she met the

Yale field hockey coach at a summer camp in Michigan. Soon,

she was playing varsity field hockey and softball at Yale. She

also held a campus job and served as a freshman counselor for

Davenport College. Her senior year, she took an inspirational

class in developmental biology. “Professor John Trinkaus taught

with such passion that I really got interested in thinking like an

experimentalist,” Swanson said. Although it was too late for her

to get involved in research on campus, she secured a job as a lab

technician at Rockefeller University upon graduation.

Swanson reflects warmly on her early years in the laboratory.

At the time, she was content to assist graduate students, but

realized many of their exciting scientific discussions were

beyond her reach. She remembers the day when she asked her

laboratory head, Samuel C. Silverstein, for a recommendation

letter to apply for master’s degree programs, only to have her

initial request denied. “Instead, he sat me down and said, ‘I want

you to apply for PhD programs. I want you to think big and get

the best training you can at each stage of your career,’” she said.

Shortly thereafter, Swanson began graduate school at Columbia

University before moving with her husband, also a graduate

student, to Harvard. There, she would earn a PhD in genetics

with Fred Winston while also starting a family.

In 1986, Harvard was a difficult place to be a mother and

scientist. Swanson recalled the social pressure she faced as she

tried to excel at the lab bench while raising two children: “People

have a tendency to measure how deeply you are committed to

your research by the number of hours you spend at the lab. Any

working parent knows we have to care twice as much about our

careers to put in the same number of hours.”

In spite of the obstacles she faced as a mother, Swanson

persisted, with encouragement from her thesis advisor and other

faculty. After taking a year off to spend time with her children,

Swanson completed her postdoctoral training and was recruited


to the faculty at the University of Michigan, which jointly hired

her husband. There, she began a research program on how

Legionella pneumophila, the bacterium that causes Legionnaires'

disease, thrives in immune cells. Her lab continues to make

significant contributions to our understanding of microbial

infections and immunity.

Remembering her own mentors, Swanson works to support

other young scientists. To this end, she has served in many

leadership positions at the University of Michigan. She is

currently the director of the Office of Postdoctoral Studies. She

has also been a member of the President’s Advisory Commission

on Women’s Issues, which develops policies, practices, and

procedures to enhance gender and racial equity. “I want to make

sure that other talented people, and women in particular, have

the same opportunities I had,” Swanson said.

Apart from her work at the University of Michigan, Swanson

is involved in the American Society for Microbiology, which

publishes journals, hosts professional events, and guides public

outreach efforts to advance the microbial sciences. She was

recently appointed as chair of the Board of Governors of the

American Academy of Microbiology. She also co-hosts a podcast

entitled This Week in Microbiology. The podcast has aired since

2011, garnering 1.2 million downloads over the course of 111

episodes. Swanson sees this podcast as an important effort to

educate the public on how microbes influence our lives.

Swanson believes that her experience at Yale reinforced in her

the values she lives by today, especially her desire to give back.

“I really believe the culture

at Yale strives to instill

that spirit in the community,

that we’re privileged to

be there but also have an

obligation to step up and

take leadership roles and

give back,” she said. Swanson

models these values

through her mentorship,

leadership, and commitment

to public outreach.

Her path to Yale and academia

shows that the difference

between chance

and fate is often decided by

one’s own passion and persistence.


IMAGE COURTESY OF MICHELE SWANSON

Swanson is a professor at the University

of Michigan Medical School.




Q&A

Do you eat with your ears?

BY ISABEL WOLFE

The satisfying crunch that accompanies

the first bite into a crisp apple is a quintessential

fall experience. Although we may

not realize it, this crunch affects the delicious

flavor we perceive.

Do we eat with our ears? Perhaps. Recent

research from Oxford University explores

how sounds impact our perception and enjoyment

of flavor.

Scientists have recognized that flavor is

a multi-sensory experience in which taste,

appearance, scent, texture, and sound are

all important ingredients. Indeed, Yale epidemiology

and psychology professor Lawrence

Marks acknowledges that it is difficult

to separate the components influencing flavor.

“The different [sensory] systems are always

integrating information,” Marks said.

IMAGE COURTESY OF FREESTOCKPHOTOS

Experience the satisfying crunch of a fall apple. Research shows that sound may influence taste.

Sounds perceived by the ear are converted to electrical signals and are processed in the auditory cortex of the brain. However, scientists

are unsure exactly how the brain associates

these sensory signals with flavor.

In recent research, Oxford psychology

professor Charles Spence investigated this

phenomenon. Study participants ate and

described the taste of uniformly-flavored

chips while listening to crunching sounds.

Surprisingly, 75 percent thought the chips

tasted differently depending on which

sounds were played. When the volume of

crunching sounds increased, participants

rated potato chips as crispier and fresher.

Sounds produced by “quiet” foods and

drinks can also affect the perception of flavor

— participants reported that soda tasted

better when the volume and frequency

of bubbles was increased to produce a more

rapid fizzing sound.

So, the next time your mother tells you

to chew more softly, tell her the apple tastes

better when you make noise!

How do some organisms glow in the dark?

BY SUZANNE XU

Organisms have evolved to possess a

wide range of useful abilities: flight, poison

production, and even light emission.

Although humans never evolved the necessary

mechanisms to glow themselves,

some bioluminescent species can in fact

emit their own light. The trick? A specific

type of chemical reaction, which happens

to have many practical applications.

The basic mechanism for bioluminescence

is the same for most glowing species.

An enzyme, generically called luciferase,

interacts with luciferin, a molecule

that organisms may ingest or produce

themselves. This interaction yields two

products: a substance called oxyluciferin

and a photon of light, which can be observed

as glow.

Not all creatures stop there. Crystal jellies,

for example, emit photons of blue

light that are absorbed by their own green fluorescent proteins and re-emitted at a longer wavelength. This re-emission of

light produces a secondary type of glow

called biofluorescence.

IMAGE COURTESY OF WIKIMEDIA COMMONS

The fungus Panellus stipticus is an example

of a bioluminescent species.

To glow in the dark may seem impressive

in itself, but bioluminescence has

many practical uses as well. Approximately

80 percent of bioluminescent species

live in the deep sea, where they may glow

to attract prey or to distract predators.

Above water, species can use light to entice

mates or to make themselves seem larger

to predators. Bioluminescence also has applications

in the laboratory. For example,

professor Vincent Pieribone at the Yale

School of Medicine works on bioluminescent

methods to study action potentials in

neurons, which enable brain cells to communicate

with one another. He hopes that

these techniques will help scientists study

neural pathways in living subjects.

The next time you see a firefly or jellyfish

glowing in the dark, be sure to appreciate

the chemical processes that give them this

special talent.



50 Whitney Ave NEW HAVEN, CT 06510 (203) 787-0400

865 Whalley Ave NEW HAVEN, CT 06525 (203) 389-5435

ONLINE ORDERS also available at SUBWAY.COM

WE DELIVER for orders OVER $25
