1 Oscar Sheynin History of Statistics Berlin, 2012 ISBN 978-3 ...


contrasted to theoretical constructions. Thus, Proctor (1872) plotted 324 thousand stars on his charts, attempting to leave aside any theories on the structure of the stellar system, but the development of astronomy proved him wrong.

Calculation and adjustment of observations, and their reasonable comparison, have always been important for astronomy. Here, I again ought to mention, in the first place, Newcomb. Benjamin (1910) and many other commentators stated that he had to process more than 62 thousand observations of the Sun and the planets and that his work included a complete revision of the constants of astronomy. I add that he discussed and compared observations obtained at the main observatories of the world, and that he hardly had any aids except for logarithmic tables. In addition, he published some pertinent theoretical studies.

He was of course unable to avoid the perennial problem of deviating observations. At first he regarded them with suspicion; later (1895, p. 186), however, he became more tolerant. If a series of observations did not obey the normal law, Newcomb (1896, p. 43) preferred to assign a smaller weight to the remote observations, or, in the case of an asymmetric series, to choose the median instead of the arithmetic mean. He did not mention Cournot (§ 10.3-3), and, in two memoirs published at the same time, he (1897a; 1897b) called the median by two (!) other, nowadays forgotten, terms.

Mendeleev (§ 10.9.3) objected to combining different summaries of observations; Newcomb, however, had to do it repeatedly, and in such cases he (1872), hardly managing without subjective considerations, assigned weights to individual astronomical catalogues depending on their systematic errors.
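The weighted combination described above amounts to a weighted arithmetic mean; a minimal sketch, with purely hypothetical catalogue values and weights (Newcomb's actual weights were assigned case by case):

```python
# Combining several catalogue values by a weighted arithmetic mean.
# Values and weights below are illustrative assumptions, e.g. weights
# taken inversely proportional to the square of an assigned error.

def weighted_mean(values, weights):
    """Weighted arithmetic mean: sum(w_i * x_i) / sum(w_i)."""
    return sum(w * x for x, w in zip(values, weights)) / sum(weights)

# Hypothetical position offsets (arcseconds) from three catalogues.
positions = [1.20, 1.35, 1.28]
weights = [4.0, 1.0, 2.0]  # assumed relative reliabilities

print(weighted_mean(positions, weights))
```

The ordinary arithmetic mean is the special case of equal weights; unequal weights pull the combined value toward the catalogues judged more reliable.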
Interestingly enough, he then repeated such adjustments with weights depending on random errors.

After determining that the normal law cannot describe some astronomical observations made under unfavourable conditions, Newcomb (1886) proposed for them (and, mistakenly, for all astronomical observations altogether) a generalized law: a mixture of normal laws with differing measures of precision occurring with certain probabilities. The measure of precision thus became a discrete random variable, and the parameters of the proposed density had to be selected subjectively. He noted that his density led to the choice of a generalized arithmetic mean with weights decreasing towards the tails of the variational series, which was hardly better than the ordinary arithmetic mean (§ 6.3.1).

He also introduced some simplifications, and later authors noted that they led to the choice of the location parameter by the principle of maximum likelihood. Newcomb hardly knew that his mixture of normal laws was not normal (Eddington 1933, p. 277). In turn, two authors generalized Newcomb's law, but their work was of little practical importance.

Like Mendeleev (§ 10.9.3), Newcomb (1897b, p. 165) thought that the discrepancy between two empirical magnitudes was essential if it exceeded the sum of the two appropriate probable errors, and it seems that this rigid test had been widely accepted in the natural sciences. Here is Markov's relevant pronouncement from a rare source (Sheynin 1990b, pp. 453 – 454): he

Like[d] very much Bredikhin's rule according to which 'in order to admit the reality of a computed quantity, it should at least twice numerically
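Newcomb's generalized law can be sketched numerically. A minimal illustration, with assumed (not historical) parameter values: a mixture of normals sharing one mean but differing in precision, each component occurring with a set probability, and an observation weight (the expected precision of the component that produced the observation) that falls off in the tails:

```python
import math

def normal_pdf(x, mu, sigma):
    """Density of a normal law with mean mu and standard deviation sigma."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def mixture_pdf(x, mu, sigmas, probs):
    """Newcomb-style mixture: normal components with a common mean but
    differing measures of precision, component i occurring with probability p_i."""
    return sum(p * normal_pdf(x, mu, s) for p, s in zip(probs, sigmas))

def observation_weight(x, mu, sigmas, probs):
    """Expected precision (1/sigma^2) of the component behind observation x.
    Used as a weight, it decreases towards the tails of the series."""
    num = sum(p * normal_pdf(x, mu, s) / s ** 2 for p, s in zip(probs, sigmas))
    return num / mixture_pdf(x, mu, sigmas, probs)

# Illustrative parameters: 90% precise observations, 10% crude ones.
sigmas, probs = [1.0, 3.0], [0.9, 0.1]
w_center = observation_weight(0.0, 0.0, sigmas, probs)
w_tail = observation_weight(5.0, 0.0, sigmas, probs)
print(w_center > w_tail)  # weights decrease towards the tails
```

A generalized arithmetic mean built from such weights downweights remote observations automatically, which is what Newcomb found to be hardly better in practice than the ordinary mean.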
