Statistics I Exercises

(b) We first assume that X and Y are samples from a jointly Gaussian distribution with the parameters computed in question 1. Compute the q-percentile, with q = 2%, of the variables X + Y and X − Y (a sketch of this computation in R appears after the exercises).
(c) Fit a generalized Pareto distribution (GPD) to X and Y separately, and fit a copula of the Gumbel family to the empirical copula of the data.
(d) Generate a sample of size N (where N is the number of rows of the data matrix) from the joint distribution estimated in question 3. Use this sample to compute the same statistics as in question 1 (i.e., the means and standard deviations of the columns, as well as their correlation coefficients), and compare the results to the numerical values obtained in question 1. Compute, still for this simulated sample, the two percentiles considered in question 2, compare the results, and comment.

61. Show that if X is a random variable with a distribution symmetric about 0 and a finite second moment, then X and |X| are uncorrelated.

62. This elementary exercise gives an example showing that lack of correlation does not necessarily mean independence. Assume that X ∼ N(0, 1) and define the random variable Y by Y = (|X| − √(2/π)) / √(1 − 2/π).
(a) Compute E(|X|).
(b) Show that Y has mean zero and variance 1, and that it is uncorrelated with X.
(A numerical check in R follows the exercises.)

63. The purpose of this problem is to show that lack of correlation does not imply independence, even when the two random variables are Gaussian. We assume that X, ε₁ and ε₂ are independent random variables, that X ∼ N(0, 1), and that P(εᵢ = −1) = P(εᵢ = +1) = 1/2 for i = 1, 2. We define the random variables X₁ and X₂ by X₁ = ε₁X and X₂ = ε₂X.
(a) Prove that X₁ ∼ N(0, 1), X₂ ∼ N(0, 1), and that ρ(X₁, X₂) = 0.
(b) Show that X₁ and X₂ are not independent.
(c) Determine the copula of X₁ and X₂.
(A simulation sketch follows the exercises.)

64. The goal of this problem is to prove rigorously a couple of useful formulae for normal random variables.
(a) Show that, if Z ∼ N(0, 1), if σ > 0, and if f is ANY function, then
E(f(Z) e^{σZ}) = e^{σ²/2} E(f(Z + σ)),
and use this formula to recover the well-known fact
E(e^X) = e^{μ + σ²/2}
whenever X ∼ N(μ, σ²). (A Monte Carlo check follows the exercises.)
(b) We now assume that X and Y are jointly normal, mean-zero random variables and that h is ANY function. Prove that
E(e^X h(Y)) = E(e^X) E(h(Y + cov(X, Y))).

65. This problem concerns the computation in R of the square root of a symmetric nonnegative-definite square matrix. Write a function, call it msqrt, with argument A, which checks that A is a square matrix and is symmetric, and diagonalizes the matrix by computing its eigenvalues and the matrix of its eigenvectors (hint: eigen). It should return a symmetric matrix of the same size as A, with the same eigenvectors, the eigenvalue corresponding to a given eigenvector being the square root of the corresponding eigenvalue of A. The matrix returned is called the square root of the matrix A and is denoted by A^{1/2}. (A sketch follows the exercises.)
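For part (b) above, a minimal R sketch, assuming the question-1 estimates are held in the hypothetical variables mx, my (sample means), sx, sy (standard deviations) and r (correlation): under the joint Gaussian model, X + Y and X − Y are themselves Gaussian, so the 2% percentiles come straight from qnorm. The part-(c) fit is sketched with the third-party evir and copula packages; those package choices (and the 95% threshold) are ours, not the exercise's, and the course may intend a different library.

```r
## Part (b): 2% percentiles under the fitted joint Gaussian model.
## mx, my, sx, sy, r are hypothetical names for the question-1 estimates.
q <- 0.02
qnorm(q, mean = mx + my, sd = sqrt(sx^2 + sy^2 + 2 * r * sx * sy))  # X + Y
qnorm(q, mean = mx - my, sd = sqrt(sx^2 + sy^2 - 2 * r * sx * sy))  # X - Y

## Part (c): GPD margins and a Gumbel copula (assumes the evir and copula packages).
library(evir)
library(copula)
gpd.X <- gpd(X, threshold = quantile(X, 0.95))    # GPD fit to the upper tail of X
gpd.Y <- gpd(Y, threshold = quantile(Y, 0.95))    # GPD fit to the upper tail of Y
u <- pobs(cbind(X, Y))                            # pseudo-observations (empirical copula)
fit <- fitCopula(gumbelCopula(dim = 2), u, method = "mpl")
```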
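A quick Monte Carlo sanity check of exercise 62 (a sketch, not a proof; the seed and sample size are arbitrary):

```r
set.seed(1)                        # arbitrary seed for reproducibility
X <- rnorm(1e5)
Y <- (abs(X) - sqrt(2/pi)) / sqrt(1 - 2/pi)
mean(abs(X)); sqrt(2/pi)           # part (a): E(|X|) = sqrt(2/pi), about 0.798
c(mean(Y), var(Y), cor(X, Y))      # part (b): should be close to 0, 1, 0
```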
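For exercise 63, a short simulation sketch: the sample correlation of X₁ and X₂ is near 0, while |X₁| = |X₂| = |X| makes the dependence obvious.

```r
set.seed(1)
n  <- 1e5
X  <- rnorm(n)
e1 <- sample(c(-1, 1), n, replace = TRUE)   # the sign ε1
e2 <- sample(c(-1, 1), n, replace = TRUE)   # the sign ε2, independent of ε1 and X
X1 <- e1 * X
X2 <- e2 * X
cor(X1, X2)              # near 0: X1 and X2 are uncorrelated
cor(abs(X1), abs(X2))    # exactly 1, since |X1| = |X2| = |X|: not independent
```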
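A Monte Carlo check of the exercise-64 formulae (a sketch; taking f = cos and the particular values of σ and μ are arbitrary choices of ours):

```r
set.seed(1)
n <- 1e6
Z <- rnorm(n)
sigma <- 0.7
f <- cos                                     # ANY bounded f will do; cos is convenient
mean(f(Z) * exp(sigma * Z))                  # left-hand side of the shift formula
exp(sigma^2 / 2) * mean(f(Z + sigma))        # right-hand side: should agree closely
mu <- 0.3
mean(exp(rnorm(n, mu, sigma)))               # E(e^X) for X ~ N(mu, sigma^2)
exp(mu + sigma^2 / 2)                        # the closed-form value
```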
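A minimal sketch of the msqrt function requested in exercise 65; the tol argument and the clipping of tiny negative eigenvalues are our additions to guard against rounding, not part of the exercise statement.

```r
msqrt <- function(A, tol = 1e-10) {
  # Check that A is a symmetric square matrix.
  if (!is.matrix(A) || nrow(A) != ncol(A)) stop("A must be a square matrix")
  if (!isSymmetric(A)) stop("A must be symmetric")
  # Diagonalize: spectral decomposition A = Q diag(lambda) Q'.
  e <- eigen(A, symmetric = TRUE)
  if (any(e$values < -tol)) stop("A must be nonnegative-definite")
  lam <- pmax(e$values, 0)                 # clip tiny negative eigenvalues from rounding
  # Same eigenvectors, square roots of the eigenvalues.
  e$vectors %*% diag(sqrt(lam), nrow = length(lam)) %*% t(e$vectors)
}

# Usage: the returned matrix B satisfies B %*% B = A up to rounding.
A <- matrix(c(2, 1, 1, 2), 2, 2)
B <- msqrt(A)
max(abs(B %*% B - A))    # near 0
```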
