
ISyE 6739 — Test 3 Solutions — Summer 2011


NAME: ________________________

(revised 7/28/12)

This test is 100 minutes long. You are allowed three cheat sheets.

1. Let X be the outcome of a 4-sided die toss. Find Var(e^X + 1).

Solution: By the Law of the Unconscious Statistician,

   E[e^X] = ∑_{i=1}^{4} e^i P(X = i) = (1/4) ∑_{i=1}^{4} e^i = 21.198.

Similarly,

   E[e^{2X}] = ∑_{i=1}^{4} e^{2i} P(X = i) = 861.593.

Then

   Var(e^X + 1) = Var(e^X) = E[e^{2X}] − (E[e^X])² = 412.2.  ♦

2. Suppose that X ∼ 1 + Exp(1/2), where the time units are in years. X could represent the lifetime of a lightbulb that is guaranteed to last at least a year. Find the median of X, that is, the point m such that P(X ≤ m) = P(X > m) = 0.5.

Solution: Let Y ∼ Exp(1/2). Then

   0.5 = P(X ≤ m) = P(1 + Y ≤ m) = P(Y ≤ m − 1) = 1 − e^{−(m−1)/2}.

This is true iff

   −(m − 1)/2 = ln(0.5),

iff m = 1 − 2 ln(0.5) = 1 + 2 ln(2).  ♦
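As a quick numerical sanity check on Questions 1 and 2 (this snippet is not part of the original solutions; it just reproduces the arithmetic in Python):

import math

# Question 1: X is a fair 4-sided die, so P(X = i) = 1/4 for i = 1, ..., 4.
E_eX  = sum(math.exp(i) for i in range(1, 5)) / 4       # E[e^X]    ~ 21.198
E_e2X = sum(math.exp(2 * i) for i in range(1, 5)) / 4   # E[e^{2X}] ~ 861.593
print(E_e2X - E_eX ** 2)                                # Var(e^X + 1) ~ 412.2

# Question 2: median of X = 1 + Exp(1/2) is m = 1 + 2 ln 2.
print(1 + 2 * math.log(2))                              # ~ 2.386 years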


3. Suppose that the lifetime of a transistor is exponential with a mean of 100,000 hours. Further suppose that the transistor has already survived 400,000 hours. Find the probability that it will not fail in the next 200,000 hours.

Solution: By the memoryless property,

   P(X ≥ 600000 | X ≥ 400000) = P(X ≥ 200000) = e^{−λx} = e^{−2} = 0.1353.  ♦

4. Suppose X_1 and X_2 are i.i.d. Bernoulli(p) random variables, which represent the functionality of two network components. Think of a signal passing through a network, where X_i = 1 if the signal can successfully get through component i, for i = 1, 2 (and X_i = 0 if the signal is unsuccessful). Let's consider two set-ups: (A) X_1 and X_2 have p = 0.8 and are hooked up in series, so that a signal getting through the network has to pass through components 1 AND 2. (B) X_1 and X_2 have p = 0.5 and are hooked up in parallel, so that a signal getting through the network only has to pass through component 1 OR 2. Which set-up is more reliable, i.e., more likely to permit a signal to pass through?

Solution: P(A) = P(X_1 = 1 ∩ X_2 = 1) = P(X_1 = 1) P(X_2 = 1) = 0.64.

Meanwhile,

   P(B) = P(X_1 = 1 ∪ X_2 = 1)
        = P(X_1 = 1) + P(X_2 = 1) − P(X_1 = 1 ∩ X_2 = 1)
        = P(X_1 = 1) + P(X_2 = 1) − P(X_1 = 1) P(X_2 = 1) = 0.75.

Therefore, (B) is more reliable.  ♦

5. TRUE or FALSE? If X is any normal distribution, then about 99.7% of all observations from X will fall within three standard deviations of the mean.

Solution: TRUE.  ♦

6. TRUE or FALSE? The normal quantile value Φ^{−1}(0.95) = 1.96.

Solution: FALSE. Φ^{−1}(0.95) = 1.645.  ♦

7. TRUE or FALSE? If X ∼ Nor(µ, σ²), then P(−c ≤ (X − µ)/σ ≤ c) = 2Φ(c) − 1 for any c > 0.

Solution: TRUE.  ♦
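A quick check of Questions 3 and 4 (again, an added sketch rather than part of the original solutions):

import math

# Question 3: memorylessness, so P(X >= 600000 | X >= 400000) = P(X >= 200000).
lam = 1 / 100000
print(math.exp(-lam * 200000))      # e^(-2) ~ 0.1353

# Question 4: series with p = 0.8 vs. parallel with p = 0.5.
p_series   = 0.8 * 0.8              # both components must work: 0.64
p_parallel = 1 - (1 - 0.5) ** 2     # at least one component works: 0.75
print(p_series, p_parallel)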


8. Suppose that X and Y are the scores that a Georgia Tech student will receive, respectively, on the verbal and math portions of the SAT test. Further suppose that X and Y are both Nor(600, 40000) and that Cov(X, Y) = 5000. Find the probability that the total score, X + Y, will exceed 1500. (You can assume that X + Y is normal.)

Solution: Note that E[X + Y] = 1200 and

   Var(X + Y) = Var(X) + Var(Y) + 2 Cov(X, Y) = 90000.

Therefore, X + Y ∼ Nor(1200, 90000). This implies that

   P(X + Y > 1500) = P(Z > (1500 − 1200)/√90000) = P(Z > 1) = 0.1587.  ♦

9. If X_1, . . . , X_400 are i.i.d. from some distribution with mean 0 and variance 400, find the approximate probability that the sample mean X̄ is between −1 and 1.

Solution: By the Central Limit Theorem, and since Var(X̄) = 400/400 = 1, we have X̄ ≈ Nor(0, 1). Thus,

   P(−1 ≤ X̄ ≤ 1) ≈ 2Φ(1) − 1 = 2(0.8413) − 1 = 0.6826.  ♦

10. TRUE or FALSE? Consider any i.i.d. sequence of random variables having finite variance. Then the Central Limit Theorem says that a properly standardized sample mean can be approximated by a standard normal random variable as the sample size becomes large.

Solution: TRUE.  ♦

11. TRUE or FALSE? t_{0.015,10} > z_{0.015}.

Solution: TRUE. The t distribution has fatter tails than the standard normal, so you don't even need tables for this one.  ♦

12. Suppose T ∼ t(223). What's P(T > −1.28)?

Solution: Because of the high d.f., P(T > −1.28) ≈ P(Z > −1.28) = 0.90.  ♦
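The probabilities in Questions 8 and 9 can be reproduced numerically; the sketch below is an addition and assumes scipy is available:

import math
from scipy.stats import norm

# Question 8: X + Y ~ Nor(1200, 90000), so P(X + Y > 1500) = P(Z > 1).
print(1 - norm.cdf((1500 - 1200) / math.sqrt(90000)))   # ~ 0.1587

# Question 9: Var(Xbar) = 400/400 = 1, so Xbar is approximately Nor(0, 1).
print(2 * norm.cdf(1) - 1)                              # ~ 0.6826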


13. TRUE or FALSE? P(F(4, 3) < F_{0.975,4,3}) = P(F(4, 3) < 1/F_{0.025,4,3}).

Solution: FALSE. F_{0.975,4,3} = 1/F_{0.025,3,4} (flip the d.f.).  ♦

14. Suppose X_1, . . . , X_7 ∼ Nor(3, 6), Y_1, . . . , Y_7 ∼ Nor(−3, 2), and everything is independent. Let S²_X and S²_Y denote the sample variances of the X_i's and Y_j's, respectively. Name the distribution (with parameter(s)) of S²_X / S²_Y.

Solution: S²_X ∼ σ²_X χ²(6)/6 and S²_Y ∼ σ²_Y χ²(6)/6, where σ²_X = Var(X_i) = 6 and σ²_Y = Var(Y_j) = 2. Therefore,

   S²_X / S²_Y ∼ [σ²_X χ²(6)/6] / [σ²_Y χ²(6)/6] ∼ 3 F(6, 6).  ♦

15. The t distribution with 1 degree of freedom is also known as the ________ distribution. (Fill in the blank.)

Solution: Cauchy.  ♦

16. Suppose X_1, X_2, X_3 are i.i.d. Exp(λ), and we observe X_1 = 1, X_2 = 7, and X_3 = 4. What is the sample variance of the X_i's?

Solution: X̄ = 4 and n = 3. Thus,

   S² = [∑_{i=1}^{n} X_i² − n X̄²] / (n − 1) = 9.  ♦

17. Suppose X_1, X_2, X_3 are i.i.d. Nor(µ, σ²), and we observe X_1 = 1, X_2 = 7, and X_3 = 4. What is the sample variance of the X_i's?

Solution: Exactly as in Question 16, X̄ = 4 and n = 3, so

   S² = [∑_{i=1}^{n} X_i² − n X̄²] / (n − 1) = 9.  ♦
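The sample variance in Questions 16 and 17 depends only on the observed data, not on the assumed distribution; a short added check:

xs = [1, 7, 4]
n = len(xs)
xbar = sum(xs) / n                                       # 4.0
S2 = (sum(x ** 2 for x in xs) - n * xbar ** 2) / (n - 1) # same shortcut formula as above
print(S2)                                                # 9.0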


18. Suppose X_1, X_2, X_3 are i.i.d. Nor(µ, σ²), and we observe X_1 = 1, X_2 = 7, and X_3 = 4. What is the maximum likelihood estimate of σ²?

Solution: X̄ = 4 and n = 3. Thus,

   σ̂² = [(n − 1)/n] S² = 6.  ♦

19. TRUE or FALSE? Suppose X_1, . . . , X_n are i.i.d. Exp(λ). Then the sample mean X̄ is unbiased for E[X_i].

Solution: TRUE.  ♦

20. TRUE or FALSE? Suppose X_1, . . . , X_n are i.i.d. Exp(λ). Then 1/X̄ is unbiased for λ.

Solution: FALSE.  ♦

21. TRUE or FALSE? Suppose X_1, . . . , X_n are i.i.d. Exp(λ). Then 1/X̄ is the MLE for λ.

Solution: TRUE.  ♦

22. Suppose that X_1, X_2, X_3 are i.i.d. Geom(p). Thus, for all i, we have P(X_i = k) = (1 − p)^{k−1} p, for k = 1, 2, . . .. In particular, suppose that we observe X_1 = 1, X_2 = 7, and X_3 = 4. What is the maximum likelihood estimate of p?

Solution: p̂ = 1/X̄ = 0.25.  ♦
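A check of the MLEs in Questions 18 and 22 (an added sketch, not from the original solutions):

xs = [1, 7, 4]
n = len(xs)
xbar = sum(xs) / n

# Question 18: the normal MLE of sigma^2 uses divisor n, i.e. (n-1)/n * S^2.
sigma2_hat = sum((x - xbar) ** 2 for x in xs) / n
print(sigma2_hat)        # 6.0

# Question 22: the geometric MLE of p is 1/xbar.
print(1 / xbar)          # 0.25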


23. Suppose X_1, X_2, X_3 are i.i.d. Geom(p), and we observe X_1 = 1, X_2 = 7, and X_3 = 4. What is the maximum likelihood estimate of P(X_i ≥ 3)?

Solution: By the Invariance Property of MLEs and the fact that p̂ = 1/X̄ = 0.25, we have

   P̂(X_i ≥ 3) = 1 − P̂(X_i ≤ 2) = 1 − P̂(X_i = 1) − P̂(X_i = 2)
              = 1 − p̂ − p̂(1 − p̂) = 0.5625.  ♦

24. Suppose X_1, X_2, X_3 are i.i.d. Unif(0, θ), and we observe X_1 = 1, X_2 = 7, and X_3 = 4. What is the maximum likelihood estimate of θ?

Solution: θ̂ = max_i X_i = 7.  ♦

25. Suppose X_1, X_2, X_3 are i.i.d. Nor(µ, 16). Define two estimators for µ: T_1 ≡ (X_1 + X_2)/2 and T_2 ≡ (4X_1 + 3X_2 + X_3)/8. Which of T_1 or T_2 has the smaller MSE?

Solution: It is easy to show that E[T_1] = E[T_2] = µ, so that both estimators are unbiased for µ. Thus, in this case, MSE(T_1) = Var(T_1) and MSE(T_2) = Var(T_2).

First, Var(T_1) = Var((X_1 + X_2)/2) = [Var(X_1) + Var(X_2)]/4 = 8.

Similarly, Var(T_2) = Var((4X_1 + 3X_2 + X_3)/8) = [16 Var(X_1) + 9 Var(X_2) + Var(X_3)]/64 = 6.5.

Therefore, T_2 has the lower MSE.  ♦

26. Which family member is actually an estimation method?

(a) DAD
(b) MOM
(c) BRO
(d) SIS

Solution: (b) MOM (method of moments).  ♦
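Questions 23 and 25 can also be verified directly; a minimal added sketch:

# Question 23: invariance of the MLE with p_hat = 0.25.
p_hat = 0.25
print(1 - p_hat - p_hat * (1 - p_hat))      # estimated P(X >= 3) = 0.5625

# Question 25: variances of the two unbiased estimators when Var(X_i) = 16.
var_T1 = (16 + 16) / 4                      # Var((X1 + X2)/2) = 8.0
var_T2 = (16 * 16 + 9 * 16 + 16) / 64       # Var((4X1 + 3X2 + X3)/8) = 6.5
print(var_T1, var_T2)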


27. Suppose X_1, . . . , X_9 are i.i.d. normal with unknown mean and known variance σ² = 49. Further suppose that X̄ = −50 and S² = 80. Find a 95% two-sided confidence interval for µ.

Solution:

   µ ∈ X̄ ± z_{α/2} √(σ²/n) = −50 ± 1.96 √(49/9) = −50 ± 4.57 = [−54.57, −45.43].  ♦

28. Suppose X_1, . . . , X_9 are i.i.d. normal with unknown mean and known variance σ² = 49. Further suppose that X̄ = −50 and S² = 80. Find a 95% two-sided confidence interval for 2µ − 4.

Solution: For any confidence interval for the mean µ with lower and upper bounds L and U, we have

   1 − α = P(L ≤ µ ≤ U) = P(2L − 4 ≤ 2µ − 4 ≤ 2U − 4).

Using the L and U bounds from Question 27, we have

   2µ − 4 ∈ [−113.14, −94.86].  ♦

29. Suppose X_1, . . . , X_9 are i.i.d. normal with unknown mean and unknown variance σ². Further suppose that X̄ = −50 and S² = 80. Find a 95% two-sided confidence interval for µ.

Solution: Since t_{α/2,n−1} = t_{0.025,8} = 2.306, we have

   µ ∈ X̄ ± t_{α/2,n−1} √(S²/n) = −50 ± 2.306 √(80/9) = −50 ± 6.88 = [−56.88, −43.12].  ♦

30. Suppose that X_1, . . . , X_n are i.i.d. Bernoulli with unknown mean p, and that we have carried out a preliminary investigation suggesting p ≈ 0.8. How big would n have to be in order for a two-sided 95% confidence interval to have a half-length of 0.01? (Give the smallest such number.)

Solution: We need z_{α/2} √(p̂(1 − p̂)/n) ≤ ε, i.e., n ≥ z²_{α/2} p̂(1 − p̂)/ε². So,

   n ≥ (1.96)² (0.8)(0.2)/(0.01)² = 6147 (rounded up).  ♦
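The confidence intervals in Questions 27, 29, and 30 can be reproduced as follows (an added sketch assuming scipy; the exam itself uses the attached tables):

import math
from scipy.stats import norm, t

xbar, n = -50, 9

# Question 27: known sigma^2 = 49, 95% z interval.
hw_z = norm.ppf(0.975) * math.sqrt(49 / n)          # half-width ~ 4.57
print(xbar - hw_z, xbar + hw_z)                     # ~ [-54.57, -45.43]

# Question 29: unknown variance, S^2 = 80, 95% t interval with 8 d.f.
hw_t = t.ppf(0.975, n - 1) * math.sqrt(80 / n)      # half-width ~ 6.88
print(xbar - hw_t, xbar + hw_t)                     # ~ [-56.88, -43.12]

# Question 30: smallest n with half-length <= 0.01 when p_hat = 0.8.
print(math.ceil(1.96 ** 2 * 0.8 * 0.2 / 0.01 ** 2)) # 6147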


31. Suppose X_1, . . . , X_5 are i.i.d. normal with unknown mean and unknown variance σ². Further suppose that X̄ = 100 and S² = 25. Find a 95% two-sided confidence interval for σ².

Solution:

   σ² ∈ [(n − 1)S²/χ²_{α/2,n−1}, (n − 1)S²/χ²_{1−α/2,n−1}] = [100/11.14, 100/0.48] = [8.98, 208.33].  ♦

32. Consider i.i.d. normal observations X_1, . . . , X_5 with unknown mean µ and unknown variance σ². What is the expected width of the usual 90% two-sided confidence interval for σ²? You can keep your answer in terms of σ.

Solution: The confidence interval is of the form

   σ² ∈ [(n − 1)S²/χ²_{α/2,n−1}, (n − 1)S²/χ²_{1−α/2,n−1}].

Thus, the length is

   L = (n − 1)S²/χ²_{1−α/2,n−1} − (n − 1)S²/χ²_{α/2,n−1},

and so, since E[S²] = σ², the expected length is

   E[L] = (n − 1)[1/χ²_{1−α/2,n−1} − 1/χ²_{α/2,n−1}] E[S²] = [4/0.71 − 4/9.49] σ² = 5.21 σ².  ♦
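A numerical check of Questions 31 and 32 (an addition; the tiny differences from the exam's answers come from using exact χ² quantiles rather than the two-decimal table values):

from scipy.stats import chi2

n, S2 = 5, 25

# Question 31: 95% CI for sigma^2, i.e. (n-1)S^2 over chi-square quantiles with 4 d.f.
print((n - 1) * S2 / chi2.ppf(0.975, n - 1))   # ~ 8.98
print((n - 1) * S2 / chi2.ppf(0.025, n - 1))   # ~ 206.4 (the table value 0.48 gives 208.33)

# Question 32: expected length of the 90% CI, as a multiple of sigma^2.
print((n - 1) * (1 / chi2.ppf(0.05, n - 1) - 1 / chi2.ppf(0.95, n - 1)))  # ~ 5.21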


33. Suppose we conduct an experiment to test whether people can throw farther right- or left-handed. We get 20 people to do the experiment. Each throws a ball right-handed once and throws a ball left-handed once, and we measure the distances. If we are interested in determining a confidence interval for the mean difference in left- and right-handed throws, which type of c.i. would we likely use?

(a) z (normal) confidence interval for differences
(b) pooled t confidence interval for differences
(c) paired t confidence interval for differences
(d) χ² confidence interval for differences
(e) F confidence interval for differences

Solution: (c)  ♦

34. TRUE or FALSE? We fail to reject the null hypothesis if we are not given statistically significant evidence that it is false.

Solution: TRUE.  ♦


Table 1: Standard normal values

   z       P(Z ≤ z)
   1       0.8413
   1.28    0.9000
   1.5     0.9332
   1.645   0.9500
   1.96    0.9750
   2       0.9773

Table 2: χ²_{α,ν} values

   ν \ α   0.975   0.95    0.90    0.50    0.10    0.05    0.025
   3       0.22    0.35    0.58    2.37    6.25    7.81    9.35
   4       0.48    0.71    1.06    3.36    7.78    9.49    11.14
   5       0.83    1.15    1.61    4.35    9.24    11.07   12.83
   6       1.24    1.64    2.20    5.35    10.65   12.59   14.45

Table 3: t_{α,ν} values

   ν \ α   0.10    0.05    0.025
   7       1.415   1.895   2.365
   8       1.397   1.860   2.306
   9       1.383   1.833   2.262
   10      1.372   1.812   2.228

Table 4: F_{0.025,m,n} values

   n \ m   3       4       5
   3       15.44   15.10   14.88
   4       9.98    9.60    9.36
   5       7.76    7.39    7.15
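For reference, the tabled quantiles above can be regenerated to more decimal places; this added snippet assumes scipy is available:

from scipy.stats import norm, chi2, t, f

print(norm.cdf(1.645))          # ~ 0.9500 (Table 1)
print(chi2.ppf(1 - 0.05, 4))    # chi-square_{0.05,4} ~ 9.49 (Table 2)
print(t.ppf(1 - 0.025, 8))      # t_{0.025,8} ~ 2.306 (Table 3)
print(f.ppf(1 - 0.025, 4, 3))   # F_{0.025,4,3} ~ 15.10 (Table 4)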
