13.2 The National Halothane Study

I take as a point of departure The National Halothane Study (Bunker et al., 1969). This was an investigation carried out under the auspices of the National Research Council. Unlike most NRC studies, it involved data collection and data analysis based on new methodology. The controversy about the safety of the anesthetic halothane was the result of a series of published cases involving deaths following the use of halothane in surgery. Mosteller offers the following example:

"A healthy young woman accidentally slashed her wrists on a broken windowpane and was rushed to the hospital. Surgery was performed using the anesthetic halothane with results that led everyone to believe that the outcome of the treatment was satisfactory, but a few days later the patient died. The cause was traced to massive hepatic necrosis — so many of her liver cells died that life could not be sustained. Such outcomes are very rare, especially in healthy young people." (Mosteller, 2010, p. 69)

The NRC Halothane Committee collected data from 50,000 hospital records that were arrayed in the form of a very large, sparse multi-way contingency table, for 34 hospitals, 5 anesthetics, 5 years, 2 genders, 5 age groups, 7 risk levels, type of operation, etc., and of course survival. There were 17,000 deaths. A sample of 25 cases per hospital, taken to estimate the denominator, made up the residual 33,000 cases.

When we say the data are sparse, we are talking about cells in a contingency table with an average count of less than 1! The common wisdom of the day, back in the 1960s, was that to analyze contingency tables one needed cell counts of 5 or more, and zeros in particular were anathema. You may even have read such advice in recent papers and books.

The many statisticians involved in the halothane study brought a number of standard and new statistical ideas to bear on this problem. One of these was the use of log-linear models, work done largely by Yvonne Bishop, who was a graduate student at the time in the Department of Statistics at Harvard. The primary theory she relied upon was, at that time, a somewhat obscure paper by an Englishman named Birch (1963), whose theorem on the existence of maximum likelihood estimates assumed that all cell counts are positive. But she needed a way to actually do the computations, at a time when we were still carrying boxes of punched cards to the computer center to run batch programs!

The simple version of the story — see Fienberg (2011) for a more technical account — was that Yvonne used Birch's results (ignoring the condition on positive cell counts) to derive connections between log-linear and logit models, and she computed maximum likelihood estimates (MLEs) using a version of the method of iterative proportional fitting (IPF), developed by Deming and Stephan.
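To give a flavor of what such a computation involves — and this is only a minimal sketch, not the halothane analysis itself — the following Python fragment applies iterative proportional fitting to a small two-way table under the independence log-linear model, alternately rescaling rows and columns of a table of ones until its margins match the observed margins. The data and the function name are invented for illustration; note that the table contains a zero cell, yet the fitted values come out strictly positive.

    import numpy as np

    def ipf_independence(observed, tol=1e-8, max_iter=100):
        # Start from a table of ones and rescale until margins match.
        fitted = np.ones_like(observed, dtype=float)
        row_target = observed.sum(axis=1)
        col_target = observed.sum(axis=0)
        for _ in range(max_iter):
            # Scale each row to reproduce the observed row totals.
            fitted *= (row_target / fitted.sum(axis=1))[:, None]
            # Scale each column to reproduce the observed column totals.
            fitted *= (col_target / fitted.sum(axis=0))[None, :]
            if np.max(np.abs(fitted.sum(axis=1) - row_target)) < tol:
                break
        return fitted

    observed = np.array([[10.0, 0.0],
                         [5.0,  3.0]])   # invented counts; one cell is zero
    print(ipf_independence(observed))    # MLE under independence: row_i * col_j / n

For a two-way independence model a single cycle already reproduces the closed-form MLE; the point of IPF in the halothane setting was that the same row-and-column style of rescaling extends to the margins of much larger, sparse multi-way tables for which no closed form exists.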
