Medical Reporting

Helping Reporters Play the Medical Numbers Game

A journalist reminds us about how tricky putting ‘facts’ into perspective can be.

By Lewis Cope

As medical reporters, we laugh at the tale about the drug-treatment researcher who said, “Thirty-three percent were cured, 33 percent died—and the third mouse got away.”

We know that the more patients (or mice) in a study, the better. Big numbers help make a study’s findings “statistically significant.” This term simply means it’s unlikely that the study’s key statistical findings are due to chance alone. But merely obtaining statistical significance doesn’t prove that the study’s conclusions are medically significant or correct. So, as reporters, we must probe further and be alert for the numbers games and other things that might lead us astray.

Two journalistic instincts—healthy skepticism and good questioning—come in handy on the medical beat. And, if you don’t report in this area, a peek into what we do will make you a more astute consumer of medical news—and a more careful viewer of medical claims on the Internet.

Hints About Medical Coverage

What follows are thoughts I have about things that scientists and reporters must consider.

Remember the rooster who thought that his crowing made the sun rise? Even with impressive numbers, association doesn’t prove causation. A virus found in a patient’s body might be an innocent bystander, rather than the cause of the illness. A chemical in a town’s water supply might not have caused illnesses there, either. More study and laboratory work are necessary to certify cause-and-effect links.

Let me cite one current case in which precisely this caution is needed. News reports have speculated about whether some childhood immunizations might be triggering many cases of autism. To a reporter, this has the sound of coincidence, not causation. Autism tends to appear in children about the time they get a lot of their vaccines. Is additional study warranted? Probably.
But there is concern that in the meantime parents will delay having children immunized against measles and other dangerous diseases. In a lot of the press reports, the missing figures are the tolls these childhood diseases took before vaccines were available.

Always take care in reporting claims of cures. The snake-oil salesman said, “You can suffer from the common cold for seven days, or take my drug and get well in one week.” Patients with some other illness might be improving simply because their disease has run its natural course, not because of the experimental drug they’re taking. Care is needed to sort claims made about what has cured a particular ailment.

In covering stories about disease outbreaks and patterns, be cautious about case numbers. There was a story recently about how Lyme disease cases have soared in some states. The article cited statistics, but buried some important cautions. Improved diagnosis and reporting of Lyme cases might be behind much of this increase. The journalistic antidote: Refer to such numbers as reported cases and explain why you are doing so.

Sort through when you might be dealing with the power of suggestion. A large federal study examining quality-of-life issues recently concluded that hormone therapy for menopause doesn’t benefit women in many of the ways long taken for granted. How could so many women, for so long, have concluded that the hormone therapy made them more energetic and less depressed? A patient who wants and expects to see a drug work may mistakenly attribute all good feelings to the medication.

The “gold standard” of clinical research is a double-blind, placebo-controlled study, with patients randomly assigned to either a treatment group or to a comparison (no treatment) group. Blinding means that, until the study is completed, neither the researchers nor the patients know who is getting the experimental treatment and who is getting only a placebo. This keeps expectations and hopes from coloring reported results.
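The logic of that control group can be shown with a toy simulation. The numbers here—a 70 percent natural recovery rate and 1,000 patients—are invented for illustration; the point is that an utterly useless drug still "cures" most patients when the disease resolves on its own, and only the placebo arm reveals it.

```python
# A minimal sketch (hypothetical numbers): why the placebo-controlled
# "gold standard" matters when a disease runs its natural course.
import random

random.seed(42)

NATURAL_RECOVERY_RATE = 0.7   # assumed: 70% of patients get better on their own
N_PATIENTS = 1000

def run_trial():
    """Randomly assign patients to drug or placebo; the 'drug' does nothing."""
    recovered = {"drug": 0, "placebo": 0}
    counts = {"drug": 0, "placebo": 0}
    for _ in range(N_PATIENTS):
        group = random.choice(["drug", "placebo"])  # random assignment
        counts[group] += 1
        # Recovery depends only on the disease's natural course,
        # never on which group the patient landed in.
        if random.random() < NATURAL_RECOVERY_RATE:
            recovered[group] += 1
    return {g: recovered[g] / counts[g] for g in counts}

rates = run_trial()
print(f"Drug group recovery:    {rates['drug']:.0%}")
print(f"Placebo group recovery: {rates['placebo']:.0%}")
# Both hover near 70 percent. Reported alone, "7 in 10 patients on the
# drug recovered" sounds like a cure; next to the placebo arm, it's nothing.
```

Run repeatedly with different seeds and the two arms stay within a few points of each other—the comparison, not the raw recovery number, carries the story.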
Less rigorous studies still might be important, but findings from them require more questioning by journalists. When their findings are reported as news, the absence of the “gold standard” should be stated.

Side effects are a big part of medical coverage and need to be handled properly. Some drugs have been taken off the market after serious side effects were discovered, long after the original studies found no problems. A serious side effect that strikes only one in every 10,000 patients might have been missed in the original studies involving a few thousand patients. The problem becomes apparent only after the drug is marketed and then taken by more and more people.

Nieman Reports / Summer 2003

There is a danger in citing averages. Remember: People drown in lakes with an average depth of four feet when it’s nine feet deep in the middle. We hear a claim that the average person in a weight-loss study lost 50 pounds. But maybe there were only three people in the study. A 400-pound man took off 150 pounds, but the other two patients couldn’t shed a single pound. Still interesting. But the average figure didn’t tell you the story.

When a reporter does a story about a new medical treatment, find out what it costs and whether the cost will be covered by most insurance plans. I’ve answered many phone calls from readers after reporting about some new medical treatment and forgetting to deal with the dollar figures. In reporting on research, cost estimates are important to our readers and viewers. Some treatments might be so expensive that they are unlikely ever to see widespread use.

Remind readers, listeners and viewers about the certainty of some uncertainty. Experts keep changing their minds about whether we should cut back on fats or carbs to keep our waistlines trim. In the eyes of some, these and other flip-flops give science a bad name. Actually, this is science working just as it is supposed to work, and it helps if we, as reporters, include this in our stories.

Readers, listeners and viewers should also know that science looks at the statistical probability of what’s true. Few, if any, new treatments would ever reach patients if proof-positive were required. Many, many lives would be lost. Science builds on old research findings in seeking new advances. In the process, old ideas are continually retested and modified if necessary.

The Wisdom of Good Medical Journalists

Wise medical journalists tell their viewers and readers about the degree of uncertainty involved in what they are reporting.
They use more words like “evidence indicates” and “concludes” and fewer words like “proof.” I only wish I had been wise more often during my career.

Keep in mind that when a study’s findings agree with other scientific studies and knowledge, that’s a big plus. When they don’t fit with what’s already known (or thought to be known), caution flags must be raised. The burden rests with those seeking to change medical dogma. But when that burden is met, there is a hell of a story to tell.

Big numbers aren’t always needed to tell important stories. Small studies can open big research areas. It’s just that reporting on these smaller studies requires “early studies” warning labels. On the other hand, even the first few cases of a newly recognized disease (such as the mysterious respiratory illness called SARS) can be a concern. A single confirmed case of smallpox could be a looming disaster, signaling a new terrorist threat.

When we hear of a high number of cancer cases clustered in a neighborhood or town, more study might be needed, not panic spread. Statistically, there might be many more cases than expected. But wait. This could be happening by chance alone; with so many communities across our nation, a few will have more than their share of cancer cases. And with cancer, we hear about how experimental early-detection tests might find very tiny tumors. But is it early enough to make a difference? Or is treatment then the right approach? Extra caution, too, is needed in interpreting what treatment tests on lab animals tell us.

At this time, when so much medical information, so many scientific findings, and so many statistical claims are readily accessible on the Internet, there is even more of an obligation on reporters to help consumers evaluate the source and consider possible bias. Reporters do this by always rigorously looking for the numbers and thinking about the points I’ve raised above about how figures can mislead.

Medical reporters don’t have to know scientific answers.
Their job obligates them to ask the right questions. And it can be even easier than that. Frequently, I’ve ended an interview by asking, “What’s the question that I should have asked but didn’t?”

I’ve often been surprised by how much I then learned. ■

What makes a good medical reporter?

The late Victor Cohn, a former science editor of The Washington Post, said: “A good medical reporter is, first of all, a reporter after a story, not just a medical story but an interesting and important story. A good medical reporter also has fun, fun talking to some of the world’s most dedicated and interesting people, fun writing copy that zings and captures the reader, fun that injects passion into the job, for it is a job that needs passion. A good medical reporter reports for people, not for doctors, not for scientists, not even for editors or news directors. A good medical reporter is privileged to contribute to the great fabric of news that democracy requires. There is no more important job than giving people the information they need to work, to survive, to enjoy life, to participate in and maintain a free and democratic society.” ■

Source: Council for the Advancement of Science Writing.

Lewis Cope was a science reporter for the Star Tribune (Minn.) for 29 years and is a former president of the National Association of Science Writers. He is coauthor, with the late Victor Cohn, of the second edition of “News & Numbers: A Guide to Reporting Statistical Claims and Controversies in Health and Other Fields” (Iowa State Press, 2001). He is a board member of the Council for the Advancement of Science Writing.

lcope@mn.rr.com