Darmois-Skitovic theorem and its proof
Theorem 1 (Lévy-Cramér) Let X_1 and X_2 be two independent random variables and Y = X_1 + X_2. If Y has a Gaussian distribution, then X_1 and X_2 are Gaussian, too.
Recall: the characteristic function of the random variable X is defined as

Φ_X(ω) = E{e^{jωX}}    (8)

and its second characteristic function is

Ψ_X(ω) = ln Φ_X(ω)    (9)
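As a quick numerical sanity check of definitions (8) and (9) (this simulation is not part of the original text; the sample size and test frequency are arbitrary choices): for a standard Gaussian X, the closed form is Φ_X(ω) = e^{-ω²/2}, and a Monte-Carlo estimate of E{e^{jωX}} should land close to it.

```python
import numpy as np

# Empirically estimate the characteristic function Phi_X(w) = E{e^{jwX}}
# of a standard Gaussian and compare against the closed form exp(-w^2/2).
rng = np.random.default_rng(0)
x = rng.standard_normal(500_000)

w = 1.3                                 # arbitrary test frequency
phi_hat = np.mean(np.exp(1j * w * x))   # empirical E{e^{jwX}}
phi_true = np.exp(-w**2 / 2)            # closed form for N(0, 1)

print(abs(phi_hat - phi_true))          # small Monte-Carlo error
```

The second characteristic function of the estimate, `np.log(phi_hat)`, is then close to -ω²/2, a degree-2 polynomial, which is the form Theorem 2 below characterizes.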
Theorem 2 (Marcinkiewicz-Dugué) The only random variables whose characteristic functions have the form e^{p(ω)}, where p(ω) is a polynomial, are the constant random variables and the Gaussian random variables (and hence the degree of p is at most 2).
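As a worked instance of Theorem 2 (an illustration added here, using the definitions in (8) and (9)), both admissible cases do have Ψ_X polynomial of degree at most 2:

```latex
% Gaussian X with mean \mu and variance \sigma^2:
\Phi_X(\omega) = e^{\,j\mu\omega - \frac{1}{2}\sigma^2\omega^2},
\qquad
\Psi_X(\omega) = j\mu\omega - \tfrac{1}{2}\sigma^2\omega^2
\quad (\text{degree } 2).

% Constant X = c:
\Phi_X(\omega) = e^{\,jc\omega},
\qquad
\Psi_X(\omega) = jc\omega
\quad (\text{degree } 1).
```

The content of the theorem is the converse: no other distribution can make Ψ_X a polynomial.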
Theorem 3 (Darmois-Skitovic) Let X_1, . . . , X_N be N independent random variables, and let:

Y_1 = a_1 X_1 + · · · + a_N X_N
Y_2 = b_1 X_1 + · · · + b_N X_N    (10)

Suppose that Y_1 and Y_2 are independent. Then, if for some i we have a_i b_i ≠ 0, X_i must be Gaussian.
This theorem, which is the basis of blind source separation (the separability of linear instantaneous mixtures follows immediately from it), states that a random variable which is not Gaussian cannot appear, with a nonzero coefficient, in two independent linear combinations.
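The contrast in Theorem 3 can be seen numerically (a hedged illustration, not part of the original text). Take Y_1 = X_1 + X_2 and Y_2 = X_1 - X_2, so every a_i b_i ≠ 0. If the X_i are Gaussian, Y_1 and Y_2 come out independent; for a non-Gaussian source (uniform here) they cannot be. Y_1 and Y_2 are uncorrelated in both cases, so the dependence is detected through the higher-order statistic cov(Y_1², Y_2²), which equals 2(E[X⁴] - 3E[X²]²) here and therefore vanishes exactly in the Gaussian case.

```python
import numpy as np

# Mix two i.i.d. sources as Y1 = X1 + X2, Y2 = X1 - X2 and measure the
# dependence between Y1 and Y2 via the covariance of their squares.
rng = np.random.default_rng(1)
n = 1_000_000

def cov_of_squares(x1, x2):
    y1, y2 = x1 + x2, x1 - x2
    return np.cov(y1**2, y2**2)[0, 1]

gauss = cov_of_squares(rng.standard_normal(n), rng.standard_normal(n))
unif = cov_of_squares(rng.uniform(-1, 1, n), rng.uniform(-1, 1, n))

print(gauss)  # near 0: Gaussian sources leave Y1, Y2 independent
print(unif)   # near -4/15: uniform sources make Y1, Y2 dependent
```

For the uniform case, E[X²] = 1/3 and E[X⁴] = 1/5, giving the analytic value 2(1/5 - 1/3) = -4/15 ≈ -0.267, which the simulation reproduces.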
Proof: Without loss of generality, we can assume a_i b_j − a_j b_i ≠ 0 for all i ≠ j (otherwise, we can combine the two random variables into a single one; the Gaussianity of this combined variable proves the Gaussianity of both, by the Lévy-Cramér theorem). Now, we write:
Φ_{Y_1 Y_2}(ω_1, ω_2) = E{e^{j(ω_1 Y_1 + ω_2 Y_2)}}
                      = E{e^{j Σ_i (a_i ω_1 + b_i ω_2) X_i}}
                      = Φ_{X_1}(a_1 ω_1 + b_1 ω_2) Φ_{X_2}(a_2 ω_1 + b_2 ω_2) · · · Φ_{X_N}(a_N ω_1 + b_N ω_2)    (11)
The last equality follows from the independence of the X_i's. But the independence of Y_1 and Y_2 implies that:
Φ_{Y_1 Y_2}(ω_1, ω_2) = Φ_{Y_1}(ω_1) Φ_{Y_2}(ω_2)    (12)

and hence:

Φ_{X_1}(a_1 ω_1 + b_1 ω_2) Φ_{X_2}(a_2 ω_1 + b_2 ω_2) · · · Φ_{X_N}(a_N ω_1 + b_N ω_2) = Φ_{Y_1}(ω_1) Φ_{Y_2}(ω_2)    (13)
Taking the logarithm of both sides gives:

Ψ_{X_1}(a_1 ω_1 + b_1 ω_2) + Ψ_{X_2}(a_2 ω_1 + b_2 ω_2) + · · · + Ψ_{X_N}(a_N ω_1 + b_N ω_2) = Ψ_{Y_1}(ω_1) + Ψ_{Y_2}(ω_2)    (14)
Now, if we first move all the terms on the left side for which a_i b_i = 0 to the right side, and then apply Lemma 1, we conclude that if a_i b_i ≠ 0 for some i, then Ψ_{X_i} must be a polynomial. Hence, by the Marcinkiewicz-Dugué theorem, X_i must be a Gaussian random variable.
□