described, from the viewpoint that interests us, by a definite number of parameters. In the simplest cases this set is finite. To illustrate: the set of the studied characteristics of a biological individual (stature, weight, volume, etc.); the set of the coordinates and impulses of a certain number of the particles of a physical system. In more complicated cases the set of parameters is infinite, as occurs, for example, for the field of velocities of a turbulent current of liquid, or for the field of pressure or temperature of the Earth's atmosphere. The exceptional complexity of the processes taking place in such systems compels us to apply statistical methods of research. Following the statistical approach, we consider each observed state of the system as a random representative, or specimen, selected by chance from an abstract general population of the states possible under identical general conditions. We assume that over this general population the random parameters possess some distribution of probabilities corresponding to certain conditions, usually formulated as a hypothesis. In the simplest cases this will be a multivariate distribution; for an infinite number of parameters, it will be a distribution of a random function or of a random field in a functional space.

The observed data can be either the registered states of a more or less vast population of specimens of the given system (the states comprising a sample from the general population), or only some mean (spatial or temporal) characteristics of the states of the system. The interrelations between the empirical material and the theoretically allowed distribution of the general population constitute the main subject of mathematical statistics.
Included problems are, for example: the fullest and most precise reconstruction of the law of distribution of the general population, given the sample; an adequate check of various hypotheses concerning this population; an approximate estimation of the parameters and of the theoretical means characterizing the theoretical distribution; an interpretation of various relations and dependences observed in samples; and many other practically and theoretically vital points originating in the applications of the statistical method. I shall now go over to characterizing the separate prominent achievements of Soviet scholars in solving the most important problems of mathematical statistics.

1. The Theory of the Curves of Distribution. Correlation Theory

The limit theorems of the theory of probability, which determine the applicability, under very general conditions, of the normal law to sums of independent or almost independent variables, ensure the suitability of the theoretical model of a normally distributed population for many concrete problems. Already the early statistical investigations made by Quetelet, and then widely developed by the British Galton – Pearson school, ascertained that the normal law was rather broadly applicable to biological populations. At the same time, however, it was also established that considerable deviations from the usual picture of the normal distribution, viz., an appreciable skewness and an excess of some empirically observed distributions, were also possible. To describe mathematically the distributions of such a type, Pearson introduced a system of distribution functions which were solutions of the differential equation

(1/y) dy/dx = (x − a)/(b0 + b1x + b2x²)

and worked out in detail the methods of determining the parameters of the appropriate curves given the empirical data.
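As a quick sanity check (my addition, not in the source): in the degenerate case b1 = b2 = 0 the Pearson equation integrates directly, which confirms that the normal law sits inside the Pearson system:

```latex
\frac{1}{y}\,\frac{dy}{dx} = \frac{x-a}{b_0}
\;\Longrightarrow\;
\ln y = \frac{(x-a)^2}{2b_0} + \text{const}
\;\Longrightarrow\;
y = K \exp\!\left(\frac{(x-a)^2}{2b_0}\right),
```

which, for b0 < 0, is the normal density with mean a and variance −b0 (the constant K being fixed by normalization).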
It turned out that the Pearsonian curves, very diverse in form, were applicable for interpolation in a broad class of cases. However, for a long time their stochastic nature was left unascertained; the substantiation that Pearson himself provided in some of his writings was patently unsound and led to just criticisms (Yastremsky [1]). The
problem remained unsolved until Markov [1] showed how it was possible to obtain limiting distributions of some of the Pearsonian types by considering an urn pattern of dependent trials (that of an added ball). […] Pólya (1930), who apparently did not know Markov's findings, minutely studied this scheme of contagion, as he called it. Bernstein [34], Savkevich [1] and Shepelevsky [2] considered some of its generalizations.

Kolmogorov [32] outlined another approach to theoretically justifying the Pearsonian curves. He obtained their different types as stationary distributions that set in after a long time in a temporal stochastic process, under some assumptions about the mean velocity and variance of the alteration of the evolving system's random parameter. Bernstein [27] proved that under certain conditions such a stationary distribution exists. Ambartsumian [1; 2] investigated in detail particular cases of stochastic processes leading to the main Pearsonian curves.

Romanovsky [20] generalized the Pearsonian curves to orthogonal series similar to the well-known Gram – Charlier series. Bernstein [12; 41] also studied another stochastic pattern admitting, in many practically important cases, a very concrete interpretation and leading to some transformations of the normal distribution.

Still more considerable are the achievements of Soviet mathematicians in the domain of correlation theory, which already has vast practical applications. The works of Bernstein [13] and Khinchin [8] on the limit theorems for sums of random variables ensured a solid theoretical foundation for the theory of normal correlation. Bernstein [41] discovered interesting applications of these propositions to the case of hereditary transmission of polymeric indications (depending on a large number of genes).
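The added-ball urn can be made concrete with a short sketch (my illustration; the parameter names are mine, not Markov's or Pólya's). Starting from r red and b black balls, each drawn ball is returned together with c extra balls of its colour; the draws are dependent but exchangeable, so the probability of a draw sequence depends only on how many red draws it contains:

```python
from fractions import Fraction
from math import comb

def polya_sequence_prob(r, b, c, n, k):
    """Probability of one particular draw sequence containing k red draws
    out of n, in the added-ball urn (start: r red, b black; each drawn
    ball is returned together with c extra balls of its colour).
    By exchangeability this depends only on k, not on the order."""
    num = Fraction(1)
    for i in range(k):            # the k red draws
        num *= Fraction(r + i * c)
    for i in range(n - k):        # the n - k black draws
        num *= Fraction(b + i * c)
    den = Fraction(1)
    for j in range(n):            # total balls grow by c each draw
        den *= Fraction(r + b + j * c)
    return num / den

def polya_pmf(r, b, c, n):
    """Distribution of the number of red draws in n trials."""
    return [comb(n, k) * polya_sequence_prob(r, b, c, n, k)
            for k in range(n + 1)]

pmf = polya_pmf(r=1, b=1, c=1, n=4)
assert sum(pmf) == 1
# With r = b = c = 1 the count of red draws is uniform on 0..n:
assert pmf == [Fraction(1, 5)] * 5
```

With r = b = c = 1 the number of red draws in n trials is uniform on 0, …, n; for other choices of r, b, c the fraction of red draws converges, as n grows, to a beta-type (Pearson Type I) limit, the simplest instance of the limiting laws this contagion scheme produces.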
His work led to a theoretical explanation of the law of hereditary regression empirically established by Galton. Bernstein's research [15; 16] into the geometric foundations of correlation theory is of paramount importance. He classified various surfaces of correlation according to simple geometric principles. If a change of one of the random variables only results in a translation of the conditional law of distribution (of the density), the correlation is called firm {French original: dure}. Normal correlation is obviously firm, and a firm and perfect correlation is always normal. If all the conditional laws of one variable corresponding to various values of the other one can be obtained by contracting (or expanding) one and the same curve, the correlation is elastic. A more general type, isogeneous correlation, is such that the elastic deformation of the conditional law is at the same time accompanied by a translation. Bernstein derived a differential equation that enabled him to determine all the types of firm correlation and some particular cases of the isogeneous type. Sarmanov [1; 2] definitively completed this extremely elegant theory. The surfaces of isogeneous correlation are represented as

F(x; y) = [Dx²y² + 2Gx²y + 2Exy² + Ax² + By² + 2Hxy + I]^c.

In some cases the conditional laws are expressed by the Pearson curves. In the general case isogeneous correlation is heteroscedastic (with a variable conditional variance). The regression curve of y on x has the equation

y = −(Gx² + Hx + I)/(Dx² + Ex + F),

and a similar equation exists for the regression of x on y.

Obukhov [1; 2] developed the theory of correlation for random vectors, first considered by Hotelling (1936). It is widely applied in meteorological and geophysical problems, in the theory of turbulent currents and in other fields.
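To make the shape of this regression concrete, here is a small numeric sketch (my addition; the coefficient values are hypothetical). It evaluates the curve y = −(Gx² + Hx + I)/(Dx² + Ex + F) and checks that when D = E = G = 0 it degenerates to the straight line y = −(Hx + I)/F, the familiar linear, homoscedastic regression:

```python
from fractions import Fraction

def regression_curve(D, E, F, G, H, I, x):
    """Regression of y on x for an isogeneous correlation surface:
    y = -(G x^2 + H x + I) / (D x^2 + E x + F)  (coefficients as in the text)."""
    return -(G * x * x + H * x + I) / (D * x * x + E * x + F)

# Hypothetical coefficients with D = E = G = 0: the curve is the line
# y = -(H x + I)/F, here y = (4 - x)/2.
xs = [Fraction(k) for k in range(-3, 4)]
ys = [regression_curve(0, 0, Fraction(2), 0, Fraction(1), Fraction(-4), x)
      for x in xs]
diffs = {ys[i + 1] - ys[i] for i in range(len(ys) - 1)}
assert len(diffs) == 1  # constant first differences: the regression is linear
```

With any nonzero D, E or G the same function traces a genuinely curved regression, the heteroscedastic general case mentioned above.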
Making use of tensor methods, he was the first to offer an exposition in an invariant form. He introduced tensors of regression and of […]