STAT 801
Solutions: Asst 2

1. Suppose X has the Beta(α, β) density

   f(x; α, β) = [Γ(α + β)/(Γ(α)Γ(β))] x^(α−1) (1 − x)^(β−1) 1(0 < x < 1).

Find the distribution of Y = X/(1 − X).

Solution: the inverse transformation is x = y/(1 + y) for y ≠ −1. The
derivative dx/dy is (1 + y)^(−2). So the density of Y is

   f_Y(y) = f_X{y/(1 + y)}/(1 + y)^2
          = [Γ(α + β)/(Γ(α)Γ(β))] [y^(α−1)/(1 + y)^(α+β)] 1(0 < y/(1 + y) < 1).

The indicator simplifies to 1(0 < y). This is an F density up to scale:
(β/α)Y has the F distribution with 2α and 2β degrees of freedom.
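As a numerical sanity check, one can compare (β/α)Y against the F(2α, 2β) distribution by simulation. A minimal sketch using numpy and scipy; the particular values of α and β are arbitrary illustrative choices:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
a, b = 3.0, 5.0                      # illustrative alpha, beta

x = rng.beta(a, b, size=100_000)     # X ~ Beta(a, b)
y = x / (1 - x)                      # the transformed variable Y

# (b/a) * Y should follow the F distribution with (2a, 2b) degrees of freedom;
# the Kolmogorov-Smirnov statistic should be tiny for 100,000 exact draws
ks = stats.kstest((b / a) * y, stats.f(2 * a, 2 * b).cdf)
print(ks.statistic)
```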
2. In class I showed

   f(x1, x2) = 24 x1 x2 1(0 < x1) 1(0 < x2) 1(x1 + x2 < 1)

is a density. If (X1, X2) has this density what is the distribution of
X1 + X2?

Solution: make the change of variables Y1 = X1 + X2 and Y2 = X2.
The Jacobian is 1. The joint density of Y1 and Y2 is

   f_{Y1,Y2}(y1, y2) = 24 (y1 − y2) y2 1(0 < y1 − y2) 1(0 < y2) 1(y1 < 1).

The indicators simplify to 1(0 < y2 < y1 < 1). Integrate over y2. The
limits, from the indicator, for this integral are 0 and y1, so

   f_{Y1}(y1) = 24 ∫_0^{y1} (y1 y2 − y2^2) dy2 1(0 < y1 < 1)

which simplifies to

   f_{Y1}(y1) = 4 y1^3 1(0 < y1 < 1).
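The result can be checked by rejection sampling from the joint density and comparing the empirical cdf of the sum with y^4 (the integral of 4y^3). A sketch, assuming numpy is available:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200_000

# Rejection sampling from f(x1, x2) = 24 x1 x2 on the triangle
# {x1 > 0, x2 > 0, x1 + x2 < 1}; on that set the density is at most
# 24 * (1/2) * (1/2) = 6, so propose uniformly under the bound 6.
x1 = rng.uniform(0, 1, n)
x2 = rng.uniform(0, 1, n)
u = rng.uniform(0, 6, n)
keep = (x1 + x2 < 1) & (u < 24 * x1 * x2)
s = (x1 + x2)[keep]

# P(Y1 <= y) should be y**4 for 0 < y < 1
for y in (0.25, 0.5, 0.75):
    print(y, (s <= y).mean(), y**4)
```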
3. Suppose X and Y are iid N(0, σ^2).
(a) Show that X^2 + Y^2 and X/(X^2 + Y^2)^(1/2) are independent.

Take σ = 1 without loss of generality: replacing (X, Y) by (X/σ, Y/σ)
rescales U below by σ^(−2) and leaves T unchanged, so independence in
the σ = 1 case gives the general case.

The event U ≡ X^2 + Y^2 < u and −1 < T ≡ X/(X^2 + Y^2)^(1/2) ≤ t
can be written in polar co-ordinates as r^2 ≤ u and −1 < cos θ ≤ t.
Let A denote this set of (x, y) pairs in the xy plane. Then

   P((X, Y) ∈ A) = ∫_A [e^(−(x^2+y^2)/2)/(2π)] dx dy

which can be transformed to polar co-ordinates. The limits on r
will be 0 and u^(1/2). To get the limits on θ you have to remember
that the cosine is negative from π/2 to 3π/2 and so you integrate
from arccos(t) (a number between 0 and π as I define it here) to
2π − arccos(t). You get

   P((X, Y) ∈ A) = ∫_0^(u^(1/2)) ∫_(arccos(t))^(2π−arccos(t)) [e^(−r^2/2)/(2π)] dθ r dr

which becomes

   ∫_0^(u^(1/2)) 2(π − arccos(t)) r e^(−r^2/2) dr/(2π).

Do the integral to get

   P(U ≤ u, T ≤ t) = [1 − arccos(t)/π](1 − e^(−u/2))

for u > 0 and −1 ≤ t ≤ 1. This factors so that U and T are
independent. The cdf of T is thus

   F_T(t) = 1 − arccos(t)/π

for −1 ≤ t ≤ 1.
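Both the factorization and the cdf of T can be checked by simulation. A minimal sketch, assuming numpy; the evaluation points u0, t0 are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 200_000
x = rng.standard_normal(n)
y = rng.standard_normal(n)
u = x**2 + y**2
t = x / np.sqrt(u)

# marginal cdf of T: F_T(t0) = 1 - arccos(t0)/pi
t0 = 0.5
print((t <= t0).mean(), 1 - np.arccos(t0) / np.pi)

# the joint cdf should factor into the product of the marginals
u0 = 2.0
print(((u <= u0) & (t <= t0)).mean(), (u <= u0).mean() * (t <= t0).mean())
```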
(b) Show that Θ = arcsin(X/(X^2 + Y^2)^(1/2)) is uniformly distributed
on (−π/2, π/2].

We have, for −π/2 ≤ θ ≤ π/2,

   P(Θ ≤ θ) = P(arcsin(T) ≤ θ)
            = P(T ≤ sin(θ))
            = 1 − arccos(sin(θ))/π
            = 1 − arccos(cos(θ − π/2))/π
            = 1 − arccos(cos(π/2 − θ))/π
            = 1 − (π/2 − θ)/π
            = (θ + π/2)/π,

using arccos(cos(φ)) = φ for 0 ≤ φ ≤ π with φ = π/2 − θ. This is
the cdf of the required distribution.
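A quick simulation check of the uniform cdf (θ + π/2)/π, assuming numpy; the test points are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 200_000
x = rng.standard_normal(n)
y = rng.standard_normal(n)
theta = np.arcsin(x / np.sqrt(x**2 + y**2))

# cdf of the Uniform(-pi/2, pi/2] law: F(t0) = (t0 + pi/2)/pi
for t0 in (-1.0, 0.0, 1.0):
    print(t0, (theta <= t0).mean(), (t0 + np.pi / 2) / np.pi)
```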
(c) Show X/Y is a Cauchy random variable.

Suppose z > 0. The event X/Y ≤ z corresponds to the region
y > 0 and x < zy together with the region y < 0 and x > zy. In
polar co-ordinates this has 0 < r < ∞ and either

   tan^(−1)(1/z) ≤ θ ≤ π

or

   tan^(−1)(1/z) + π ≤ θ ≤ 2π.

We get

   P(X/Y ≤ z) = ∫_0^∞ e^(−r^2/2) r dr [∫_(tan^(−1)(1/z))^π dθ + ∫_(tan^(−1)(1/z)+π)^(2π) dθ]/(2π)

which is just

   2(π − tan^(−1)(1/z))/(2π) = 1 − tan^(−1)(1/z)/π = 1/2 + tan^(−1)(z)/π,

using tan^(−1)(z) + tan^(−1)(1/z) = π/2 for z > 0. A symmetric
argument (or the fact that −X/Y has the same law as X/Y) extends
the formula 1/2 + tan^(−1)(z)/π to z ≤ 0. Differentiate to get the density

   f(z) = 1/[π(1 + z^2)].
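The cdf 1/2 + tan^(−1)(z)/π is easy to confirm by simulating the ratio of two independent standard normals. A sketch, assuming numpy; the test points are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(4)
n = 200_000
ratio = rng.standard_normal(n) / rng.standard_normal(n)

# Cauchy cdf: F(z) = 1/2 + arctan(z)/pi
for z in (-1.0, 0.5, 2.0):
    print(z, (ratio <= z).mean(), 0.5 + np.arctan(z) / np.pi)
```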
4. Suppose X is Uniform on [0,1] and Y = sin(4πX). Find the density of Y.

Solution: This transformation is many to one. For each y ∈ (−1, 1) there
are 4 corresponding x values in [0, 1] which map to y. The density of Y is

   f_Y(y) = Σ_{x: sin(4πx) = y} f_X(x)/|dy/dx|.

We find dy/dx = 4π cos(4πx) and |dy/dx| = 4π √(1 − sin^2(4πx)). Evaluated
at any x for which sin(4πx) = y this is 4π √(1 − y^2). Since f_X ≡ 1 on
[0, 1], this makes

   f_Y(y) = 4 · 1/[4π √(1 − y^2)] 1(−1 < y < 1) = 1/[π √(1 − y^2)] 1(−1 < y < 1).
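This is the arcsine law; integrating the density gives the cdf F(y0) = 1/2 + arcsin(y0)/π, which a simulation reproduces. A sketch, assuming numpy:

```python
import numpy as np

rng = np.random.default_rng(5)
x = rng.uniform(0, 1, 200_000)
y = np.sin(4 * np.pi * x)

# arcsine cdf on (-1, 1): F(y0) = 1/2 + arcsin(y0)/pi
for y0 in (-0.5, 0.0, 0.5):
    print(y0, (y <= y0).mean(), 0.5 + np.arcsin(y0) / np.pi)
```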
5. Suppose X and Y have joint density f(x, y). Prove from the definition
of density that the density of X is g(x) = ∫ f(x, y) dy.

Solution: I defined g to be the density of X provided

   P(X ∈ A) = ∫_A g(x) dx

for all A, but

   ∫_A g(x) dx = ∫_A ∫ f(x, y) dy dx
               = ∫_{A×(−∞,∞)} f(x, y) dx dy
               = P((X, Y) ∈ A × (−∞, ∞))
               = P(X ∈ A).

Notice that I have used the fact that f is the density of (X, Y) in the
middle of this.
6. Suppose X is Poisson(λ). After observing X a coin landing Heads with
probability p is tossed X times. Let Y be the number of Heads and Z
be the number of Tails. Find the joint and marginal distributions of Y
and Z.
Solution: Start with the joint:

   P(Y = y, Z = z) = P(X = y + z, Y = y)
                   = P(Y = y | X = y + z) P(X = y + z)
                   = (y + z choose y) p^y (1 − p)^z exp(−λ) λ^(y+z)/(y + z)!
                   = [(pλ)^y exp(−pλ)/y!] [((1 − p)λ)^z exp(−(1 − p)λ)/z!]

which clearly factors into the product of two Poisson probability mass
functions. Thus Y and Z are independent Poissons with means pλ and
(1 − p)λ.
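This "Poisson thinning" result is easy to see in a simulation: thin a Poisson count with a binomial coin and check the two means and the lack of correlation. A sketch, assuming numpy; λ and p are arbitrary illustrative values:

```python
import numpy as np

rng = np.random.default_rng(6)
lam, p, n = 4.0, 0.3, 200_000
x = rng.poisson(lam, n)         # X ~ Poisson(lam)
y = rng.binomial(x, p)          # Heads among the X tosses
z = x - y                       # Tails

print(y.mean(), p * lam)        # both ≈ 1.2
print(z.mean(), (1 - p) * lam)  # both ≈ 2.8
print(np.corrcoef(y, z)[0, 1])  # ≈ 0
```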
7. Let p_1 be the bivariate normal density with mean 0, unit variances and
correlation ρ and let p_2 be the standard bivariate normal density. Let
p = (p_1 + p_2)/2.

(a) Show that p has normal margins but is not bivariate normal.

The margins of p are the averages of the margins of p_1 and p_2.
In class I told you the margins of normals are normal so you
only need to know the means and variances of the margins to see
which normal. Both p_1 and p_2 have means 0 and variances 1 so the
margins are standard normal. Thus the margins of p are standard
normal. It follows that p has means 0 and variances 1. If p were
bivariate normal it would have to have correlation ρ/2 (to see this
compute corr = ∫∫ x y p(x, y) dx dy). Now check that p(0, 0) is not
equal to the value at (0, 0) of the bivariate normal density with
mean 0, variances 1 and correlation ρ/2. Easier to check is this:

   E(X^2 Y^2) = [E_1(X^2 Y^2) + E_2(X^2 Y^2)]/2.

But if (X, Y) has density p_1 then the conditional distribution of
Y given X = x is N(ρx, 1 − ρ^2) so

   E(Y^2 | X) = Var(Y | X) + E^2(Y | X) = 1 − ρ^2 + ρ^2 X^2

and

   E(X^2 Y^2) = E[X^2 E(Y^2 | X)]
              = E[X^2 (1 − ρ^2 + ρ^2 X^2)]
              = 1 − ρ^2 + ρ^2 E(X^4)
              = 1 + 2ρ^2

since E(X^4) = 3 for a standard normal. Since p_2 is a special case of
p_1 with ρ = 0 we find

   E(X^2 Y^2) = [1 + 2ρ^2 + 1]/2 = 1 + ρ^2

which is not

   E_{ρ/2}(X^2 Y^2) = 1 + 2(ρ/2)^2 = 1 + ρ^2/2

unless ρ = 0.
(b) Generalize the construction to show that there are rv's X and Y such
that X and Y are each standard normal, X and Y are uncorrelated,
but X and Y are not independent.

Define p_1 as in the first part and p_2 like p_1 but with correlation
−ρ. The density p = (p_1 + p_2)/2 has standard normal margins
and correlation 0 but does not factor. In this case

   E(X^2 Y^2) = 1 + 2ρ^2 ≠ 1

unless ρ = 0. Here is a contour plot of the joint density when
ρ = 0.8.
[Contour plot of the joint density p for ρ = 0.8; both axes run from −3 to 3.]
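One can also sample from this mixture directly (flip a fair coin between correlations +ρ and −ρ) and confirm zero correlation alongside E(X^2 Y^2) = 1 + 2ρ^2 ≠ 1. A sketch, assuming numpy:

```python
import numpy as np

rng = np.random.default_rng(7)
rho, n = 0.8, 400_000

# draw from p = (p1 + p2)/2: choose correlation +rho or -rho with prob 1/2,
# then use the conditional representation Y = (sign*rho) X + sqrt(1-rho^2) Z
s = rng.choice([-1.0, 1.0], n)
x = rng.standard_normal(n)
y = s * rho * x + np.sqrt(1 - rho**2) * rng.standard_normal(n)

print(np.corrcoef(x, y)[0, 1])               # ≈ 0: uncorrelated
print((x**2 * y**2).mean(), 1 + 2 * rho**2)  # ≈ 2.28, not 1: dependent
```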
8. Suppose X and Y are independent with X ∼ N(µ, σ^2) and Y ∼ N(γ, τ^2).
Let Z = X + Y. Find the distribution of Z given X and that of X given Z.

The distribution of Z given X is immediate: given X = x we have
Z = x + Y, so Z | X = x ∼ N(x + γ, τ^2). For X given Z, I intended you
to find the joint density of (X, Z) by change of variables, and then
divide the joint by the marginal to get the conditional density.
Here is one way to do part of the problem. Consider first the following
problem: U, V are bivariate normal with mean 0 and variance-covariance
matrix

   [ 1  ρ ]
   [ ρ  1 ]

Then

   f_{U,V}(u, v) = 1/[2π √(1 − ρ^2)] exp{−(u^2 − 2ρuv + v^2)/[2(1 − ρ^2)]}.

Rewrite u^2 − 2ρuv + v^2 as (u − ρv)^2 + (1 − ρ^2)v^2 to see that

   f_V(v) = 1/[2π √(1 − ρ^2)] e^(−v^2/2) ∫_{−∞}^{∞} e^(−(u−ρv)^2/[2(1−ρ^2)]) du.

In the integral substitute y = (u − ρv)/√(1 − ρ^2) and find that

   f_V(v) = [1/√(2π)] e^(−v^2/2).

The conditional density of U given V = v is then

   f_{U|V}(u | v) = f_{U,V}(u, v)/f_V(v)
                  = exp{−(u − ρv)^2/[2(1 − ρ^2)] − v^2/2 + v^2/2} √(2π)/[2π √(1 − ρ^2)]
                  = exp{−(u − ρv)^2/[2(1 − ρ^2)]}/√(2π(1 − ρ^2))

which is the N(ρv, 1 − ρ^2) density.
Now if X = σU + µ and Z = φV + µ + γ then (X, Z) has the joint
distribution of (X, X + Y) in the original problem provided we take
φ^2 = Var(Z) and ρ = Corr(X, Z). For X and Z as in the problem we find
φ^2 = σ^2 + τ^2, the covariance is σ^2, and the correlation is

   ρ = 1/√(1 + τ^2/σ^2) = σ/√(σ^2 + τ^2) = σ/φ.
Thus (X, Z) is a change of variables from (U, V). We get

   f_{X,Z}(x, z) = [1/(σφ)] f_{U,V}((x − µ)/σ, (z − µ − γ)/φ)

and

   f_Z(z) = (1/φ) f_V((z − µ − γ)/φ)

so that

   f_{X|Z}(x | z) = (1/σ) f_{U|V}((x − µ)/σ | (z − µ − γ)/φ).

Plug in to the formula for f_{U|V} to get

   f_{X|Z}(x | z) = 1/[σ √(2π(1 − ρ^2))] exp{−[(x − µ)/σ − ρ(z − µ − γ)/φ]^2/[2(1 − ρ^2)]}.

The exponent is

   −[x − µ − ρσ(z − µ − γ)/φ]^2/[2σ^2(1 − ρ^2)]

which means that the conditional distribution of X given Z = z is
normal with mean

   µ + [ρσ/√(σ^2 + τ^2)](z − µ − γ) = µ + [σ^2/(σ^2 + τ^2)](z − µ − γ)

and variance

   σ^2(1 − ρ^2) = σ^2 τ^2/(σ^2 + τ^2).
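The conditional mean and variance formulas can be checked by simulating (X, Z) and conditioning on Z falling in a thin slice around a fixed z0. A sketch, assuming numpy; the parameter values and z0 are arbitrary illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(8)
mu, sigma, gamma, tau = 1.0, 2.0, -0.5, 1.5   # arbitrary parameters
n = 400_000
x = mu + sigma * rng.standard_normal(n)        # X ~ N(mu, sigma^2)
z = x + gamma + tau * rng.standard_normal(n)   # Z = X + Y, Y ~ N(gamma, tau^2)

# keep draws with Z near z0 and compare X's sample conditional moments
# with the formulas derived above; each pair should agree closely
z0 = 1.0
sel = np.abs(z - z0) < 0.05
mean_formula = mu + sigma**2 / (sigma**2 + tau**2) * (z0 - mu - gamma)
var_formula = sigma**2 * tau**2 / (sigma**2 + tau**2)
print(x[sel].mean(), mean_formula)
print(x[sel].var(), var_formula)
```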