The SRA Symposium - College of Medicine

Papers

The more troubling question is what impact, if any, does such mandatory training have on future behaviors and decision-making? Do our trainees make more ethical choices? Are we creating and supporting an institutional culture that encourages, nurtures and enables ethical science? Or is it an illusion?

I like to think that we work hard to create educational and professional venues that support "doing the right thing because it is simply the right thing to do." Yet sometimes perception and reality are two separate things.

As Fuchs and Macrina (2005) suggest, once we enter the realm of moral reasoning, we base judgments on what we ought to do and we assume that this is the right thing. Yet acting morally is not always akin to acting legally. History is filled with indiscretions that, at the time, were legal but today would be considered highly unethical, if not outright unlawful. Fuchs and Macrina state that, in order to avoid repeating these historical breaches, scientists must strive to carefully examine the moral dimensions of current research practices. It is no longer merely a matter of abiding by the regulations. The regulations do not always cover every nuance, and, in the opinion of some, regulators are suspect at best because of questionable political and/or religious affiliations (Lower, 2005; Sweet, 2002; Crewdson, 1995). This is illustrated not only by the Gallo case, orchestrated under the Reagan administration's watch, but most certainly by the ongoing interference of the current Bush administration as well.

The public trust is fragile; once shattered, it is difficult to repair. Perception is the key: sometimes the mere hint of a possibility of wrongdoing can be the catalyst for disaster. Hoey (2003) notes that the problem in the Gelsinger case was not one of uninformed consent, misleading protocols, publication gag clauses, or the like. Rather, the issue became one of public perception. Once the family filed suit against the University of Pennsylvania (UPenn), it was learned that James Wilson, the lead investigator, was also the president and major shareholder of the private company that held patents for the procedure in question and funded the research. It was also discovered that the university and some members of the board of governors owned stock in the firm. Although Hoey (2003) asserts that these factors had no direct bearing on Jesse Gelsinger's death, UPenn quickly settled the suit and Wilson resigned his university post. UPenn went a step further by instituting strict guidelines for clinical research that prohibit faculty from participating if they or close family members have a material financial interest in a private company whose product(s) they are evaluating. Many other institutions have followed with similar guidelines.

Conflicts of interest and commitment become a slippery slope for many researchers in search of funding. Guterman (2005) reports that over the past twenty years the proportion of research studies funded publicly has decreased significantly, while the proportion supported privately has increased. This seems especially true in occupational and environmental health, a field dedicated to studying dangers to the public's health and safety from the workplace and/or the environment. Academic scientists in this field know that business interests increasingly drive research agendas (Guterman, 2005). The government seems to give this work low priority, and so scientists are forced to look to the private sector for research funding. How does the responsible scientist balance the paycheck against findings that the sponsor's product is actually a menace to public health and welfare? Guterman quotes Daniel T. Teitelbaum, a Denver doctor specializing in medical toxicology and occupational epidemiology, who says, "Industry doesn't give you money to do research. Industry gives you money to do research that favors them."

Are corporate sponsors, therefore, the villains? The money has to come from somewhere, but at what price to the public trust? Many feel that journal editors are equally culpable. Guterman argues that some journals in occupational and environmental health do not require authors to reveal their funding source or any other possible conflicts of interest. Shuchman and Wilkes (1997) also

