Lecture Notes - Department of Mathematics and Statistics - Queen's ...

CHAPTER 3. CLASSIFICATION OF MARKOV CHAINS

3.2 Stability and Invariant Distributions

Stability is an important concept, but it has different meanings in different contexts. In a stochastic setting, stability means that the state does not grow large too often, and that averages over time (temporal averages) converge to a statistical average.

If a Markov chain starts at a given time, then in the long run the chain may forget its initial condition; that is, the probability distribution at a later time will become less and less dependent on the initial distribution. Given an initial state distribution π_0, the probability distribution of the state at time 1 is given by

π_1 = π_0 P,

and for all t ≥ 0,

π_{t+1} = π_t P = π_0 P^{t+1}.
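The iteration above can be sketched numerically. The 2-state transition matrix below is a hypothetical example (it does not appear in the notes); repeatedly multiplying π_t by P drives the distribution toward a fixed point.

```python
import numpy as np

# Hypothetical 2-state transition matrix (rows sum to 1) -- an
# illustration, not taken from the notes.
P = np.array([[0.9, 0.1],
              [0.4, 0.6]])

pi = np.array([1.0, 0.0])  # pi_0: start in state 0 with probability 1
for t in range(50):
    pi = pi @ P            # pi_{t+1} = pi_t P

print(pi)                  # approximately [0.8, 0.2] for this P
```

For this P the iterates approach [0.8, 0.2] regardless of the initial π_0, anticipating the forgetting-the-initial-condition discussion above.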

One important property of Markov chains is whether the above iteration leads to a fixed point in the set of probability measures. Such a fixed point π is called an invariant distribution. A distribution π on a countable state space is invariant if

π = πP.

This is equivalent to

π(j) = Σ_{i∈X} π(i) P(i, j),  ∀ j ∈ X.
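For a finite state space, the fixed-point condition π = πP together with the normalization Σ_j π(j) = 1 is a linear system that can be solved directly. A minimal sketch, again using a hypothetical 2-state matrix:

```python
import numpy as np

# Hypothetical transition matrix -- an illustration, not from the notes.
P = np.array([[0.9, 0.1],
              [0.4, 0.6]])

# pi = pi P  <=>  (P^T - I) pi^T = 0.  The balance equations are rank-
# deficient, so replace one of them by the normalization sum(pi) = 1.
n = P.shape[0]
A = np.vstack([(P.T - np.eye(n))[:-1], np.ones(n)])
b = np.zeros(n)
b[-1] = 1.0
pi = np.linalg.solve(A, b)
print(pi)                  # approximately [0.8, 0.2]
```

Equivalently, π is the left eigenvector of P for eigenvalue 1, normalized to sum to one.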

We note that if such a π exists, it can be written as π = π_0 lim_{t→∞} P^t for some π_0. Clearly, π_0 can be π itself, but often π_0 can be any initial distribution, under irreducibility conditions which will be discussed further. Invariant distributions are especially important in networking problems and stochastic control, due to the Ergodic Theorem (which shows that temporal averages converge to statistical averages), which we will discuss later in the semester.

3.2.1 Invariant Measures via an Occupational Characterization<br />

Theorem 3.2.1 For a Markov chain, if there exists an element i such that E_i[τ_i] < ∞, then the following is an invariant measure:

μ(j) = E[ Σ_{k=0}^{τ_i − 1} 1_{x_k = j} | x_0 = i ] / E_i[τ_i],  ∀ j ∈ X.

The invariant measure is unique if the chain is irreducible.
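The occupational characterization can be checked by simulation: run many independent excursions from a reference state i until the first return, count visits to each state before the return, and divide by the total elapsed time. A Monte Carlo sketch with the same hypothetical 2-state matrix used above (none of this is from the notes):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical transition matrix -- an illustration, not from the notes.
P = np.array([[0.9, 0.1],
              [0.4, 0.6]])

i = 0                      # reference state; E_i[tau_i] < inf here
n_excursions = 10000

occ = np.zeros(2)          # visits to each state before return to i
total_time = 0             # accumulates tau_i over all excursions
for _ in range(n_excursions):
    x = i
    while True:
        occ[x] += 1        # counts times k = 0, ..., tau_i - 1
        total_time += 1
        x = rng.choice(2, p=P[x])
        if x == i:         # first return to i: excursion ends
            break

mu = occ / total_time      # empirical estimate of mu(j)
print(mu)                  # close to the invariant distribution of P
```

For this irreducible P, the estimate μ should approach the unique invariant distribution [0.8, 0.2], illustrating the uniqueness claim in the theorem.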

Proof: We show that

E[ Σ_{k=0}^{τ_i − 1} 1_{x_k = j} | x_0 = i ] / E[τ_i] = Σ_s P(s, j) E[ Σ_{k=0}^{τ_i − 1} 1_{x_k = s} | x_0 = i ] / E[τ_i].
