
5.10 Exercises

Exercise 5.1 The Fundamental Theorem of Markov chains proves that for a connected Markov chain, the long-term average distribution $a_t$ converges to a stationary distribution. Does the $t$-step distribution $p_t$ also converge for every connected Markov chain? Consider the following examples: (i) a two-state chain with $p_{12} = p_{21} = 1$; (ii) a three-state chain with $p_{12} = p_{23} = p_{31} = 1$ and the other $p_{ij} = 0$. Generalize these examples to produce Markov chains with many states.
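Before generalizing, it can help to watch the two distributions evolve numerically. The sketch below is a minimal illustration, not part of the exercise statement (the starting distribution is chosen arbitrarily): it iterates the two-state chain of example (i) and prints both the $t$-step distribution $p_t$ and the running average $a_t$.

    import numpy as np

    # Two-state chain of example (i): p_12 = p_21 = 1, a deterministic swap.
    P = np.array([[0.0, 1.0],
                  [1.0, 0.0]])

    p = np.array([1.0, 0.0])   # start in state 1
    avg = np.zeros(2)          # running average a_t of p_1, ..., p_t

    for t in range(1, 11):
        p = p @ P                 # t-step distribution p_t
        avg += (p - avg) / t      # incremental average
        print(t, p, avg.round(3))

    # p_t alternates between (0, 1) and (1, 0) and never converges,
    # while a_t tends to the stationary distribution (1/2, 1/2).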

Exercise 5.2 Let $p(x)$, where $x = (x_1, x_2, \ldots, x_d)$ with $x_i \in \{0, 1\}$, be a multivariate probability distribution. For $d = 100$, how would you estimate the marginal distribution
$$p(x_1) = \sum_{x_2, \ldots, x_d} p(x_1, x_2, \ldots, x_d)\,?$$
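The sum has $2^{99}$ terms, so it cannot be evaluated directly; the natural route is to draw (approximate) samples from $p$, for instance with the Gibbs sampler discussed in this chapter, and report the empirical frequency of $x_1 = 1$. The fragment below is a minimal sketch of that estimator; `draw_sample` is a hypothetical placeholder for whatever sampler is available, faked here with independent bits so the sketch runs on its own.

    import random

    def draw_sample():
        # Hypothetical placeholder: in practice, one (approximately independent)
        # draw from p, e.g. the state of a Gibbs sampler after enough steps.
        return [random.randint(0, 1) for _ in range(100)]

    n = 10_000
    count = sum(draw_sample()[0] for _ in range(n))
    print("estimate of p(x_1 = 1):", count / n)   # empirical frequency of x_1 = 1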

Exercise 5.3 Prove that $|p - q|_1 = 2\sum_i (p_i - q_i)^+$ for probability distributions $p$ and $q$. (See Proposition 5.4.)

Exercise 5.4 Suppose $S$ is a subset of at most $n^2/2$ points in the $n \times n$ lattice. Show that for
$$T = \{(i, j) \in S \mid \text{all elements in row } i \text{ and all elements in column } j \text{ are in } S\},$$
we have $|T| \le |S|/2$.

Exercise 5.5 Show that the stationary probabilities of the chain described in the Gibbs sampler are given by the target distribution $p$.
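The claim can be checked numerically on a small instance before attempting the proof. The sketch below is an illustration only (the target distribution on $\{0,1\}^2$ is made up): it builds the transition matrix of a random-scan Gibbs sampler, which picks one coordinate uniformly and resamples it from its conditional distribution, and verifies that $p$ is left invariant.

    import itertools
    import numpy as np

    states = list(itertools.product([0, 1], repeat=2))   # the four states of {0,1}^2
    p = np.array([0.1, 0.2, 0.3, 0.4])                   # arbitrary target distribution

    P = np.zeros((4, 4))
    for i, x in enumerate(states):
        for coord in range(2):                            # choose a coordinate w.p. 1/2
            # the two states reachable by resampling this coordinate
            idx = [states.index(tuple(v if k == coord else x[k] for k in range(2)))
                   for v in (0, 1)]
            cond = p[idx] / p[idx].sum()                  # conditional distribution
            for v, j in enumerate(idx):
                P[i, j] += 0.5 * cond[v]

    print(np.allclose(p @ P, p))   # True: p is stationary for the Gibbs chain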

Exercise 5.6 A Markov chain is said to be symmetric if for all $i$ and $j$, $p_{ij} = p_{ji}$. What is the stationary distribution of a connected symmetric chain? Prove your answer.

Exercise 5.7 How would you integrate a high-dimensional multivariate polynomial distribution over some convex region?


Exercise 5.8 Given a time-reversible Markov chain, modify the chain as follows. At the current state, stay put (no move) with probability 1/2; with the remaining probability 1/2, move as in the old chain. Show that the new chain has the same stationary distribution. What happens to the convergence time under this modification?
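A quick numerical check of the first claim can guide the proof. The sketch below is an illustration only (the 3-state chain is arbitrary, chosen symmetric so that it is time-reversible): it forms the lazy chain $P' = \frac{1}{2}(I + P)$ and compares stationary distributions.

    import numpy as np

    def stationary(P):
        # Left eigenvector of P for eigenvalue 1, normalized to a distribution.
        vals, vecs = np.linalg.eig(P.T)
        v = np.real(vecs[:, np.argmax(np.real(vals))])
        return v / v.sum()

    # An arbitrary 3-state chain; it is symmetric, hence time-reversible.
    P = np.array([[0.0, 0.5, 0.5],
                  [0.5, 0.0, 0.5],
                  [0.5, 0.5, 0.0]])

    lazy = 0.5 * (np.eye(3) + P)      # stay put w.p. 1/2, else move as before

    print(stationary(P).round(4))
    print(stationary(lazy).round(4))  # identical stationary distribution

    # The eigenvalues of the lazy chain are (1 + lambda)/2, so they are all
    # nonnegative; mixing slows by at most roughly a factor of two.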

Exercise 5.9 Using the Metropolis-Hastings algorithm, create a Markov chain whose stationary probability is that given in the following table.

    x_1 x_2:   00    01    02    10    11    12    20    21    22
    Prob:     1/16   1/8  1/16   1/8   1/4   1/8  1/16   1/8  1/16
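One way to sanity-check a proposed construction is to simulate it. The sketch below is one natural choice rather than the required answer: the proposal picks one of the four single-coordinate moves on the $3 \times 3$ grid uniformly (a symmetric proposal, with off-grid moves rejected), and a proposed move from $x$ to $y$ is accepted with probability $\min(1, p_y/p_x)$.

    import random
    from collections import Counter

    # Target distribution on states (x_1, x_2) with x_i in {0, 1, 2}.
    target = {(0, 0): 1/16, (0, 1): 1/8, (0, 2): 1/16,
              (1, 0): 1/8,  (1, 1): 1/4, (1, 2): 1/8,
              (2, 0): 1/16, (2, 1): 1/8, (2, 2): 1/16}

    state = (0, 0)
    counts = Counter()
    steps = 200_000
    for _ in range(steps):
        dx, dy = random.choice([(1, 0), (-1, 0), (0, 1), (0, -1)])
        prop = (state[0] + dx, state[1] + dy)
        # Accept with probability min(1, p_prop / p_state); off-grid proposals
        # are rejected, which keeps the proposal symmetric.
        if prop in target and random.random() < min(1.0, target[prop] / target[state]):
            state = prop
        counts[state] += 1

    for s in sorted(target):
        print(s, round(counts[s] / steps, 3), "target", round(target[s], 3))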

