
CHAPTER 3. CLASSIFICATION OF MARKOV CHAINS

A set C ⊂ X is said to be communicating if every two elements (states) of C communicate with each other.

If every state of the entire space X can communicate with every other state, such a chain is said to be irreducible.

The period of a state i is defined to be the greatest common divisor of {k > 0 : P^k(i, i) > 0}.

A Markov chain is called aperiodic if the period of all states is 1.
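As a quick illustration, here is a minimal Python sketch that computes the period of a state by scanning the diagonal entries of the matrix powers P^k. The 3-state transition matrix is a hypothetical example chosen purely for illustration, and scanning only up to a finite max_k is a truncation of the defining set; for the deterministic cycle below the gcd already stabilizes at 3.

from math import gcd
import numpy as np

# Hypothetical deterministic cycle 0 -> 1 -> 2 -> 0 (an assumption for
# illustration): every state returns to itself only in multiples of 3 steps.
P = np.array([[0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0],
              [1.0, 0.0, 0.0]])

def period(P, i, max_k=100):
    # gcd of {k > 0 : P^k(i, i) > 0}, scanned up to max_k
    d = 0
    Pk = np.eye(len(P))
    for k in range(1, max_k + 1):
        Pk = Pk @ P
        if Pk[i, i] > 0:
            d = gcd(d, k)
    return d

print(period(P, 0))  # prints 3: the chain is periodic with period 3, not aperiodic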

Throughout these notes, we will assume that the Markov process under consideration is aperiodic unless mentioned otherwise.

Absorbing Set

A set C is called absorbing if P(i, C) = 1 for all i ∈ C. That is, once the state is in C, it cannot leave the set C.

A Markov chain on a finite state space X is irreducible if the smallest absorbing set is the entire space X itself.

A Markov chain in X is indecomposable if X does not contain two disjoint absorbing sets.
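To make the definition concrete, here is a minimal Python sketch that checks whether a given set is absorbing; the 4-state transition matrix and the candidate set {2, 3} are assumptions chosen for illustration only.

import numpy as np

# Hypothetical transition matrix in which {2, 3} turns out to be absorbing:
P = np.array([[0.5, 0.3, 0.2, 0.0],
              [0.1, 0.4, 0.0, 0.5],
              [0.0, 0.0, 0.6, 0.4],
              [0.0, 0.0, 0.5, 0.5]])

def is_absorbing(P, C):
    # C is absorbing iff P(i, C) = sum_{j in C} P(i, j) = 1 for all i in C
    return all(abs(P[i, list(C)].sum() - 1.0) < 1e-12 for i in C)

print(is_absorbing(P, {2, 3}))        # True: no probability mass leaves {2, 3}
print(is_absorbing(P, {0, 1, 2, 3}))  # True: the whole space is always absorbing
print(is_absorbing(P, {0, 1}))        # False: states 0 and 1 leak into {2, 3}

Since {2, 3} is absorbing and a strict subset of the state space, this particular chain is not irreducible.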

Occupation, Hitting and Stopping Times

For any set A ⊂ X, the occupation time η_A is the number of visits of the chain {x_t} to the set A:

η_A = ∑_{t=1}^∞ 1_{x_t ∈ A},

where 1_E denotes the indicator function of an event E; that is, it takes the value 1 when E takes place, and is 0 otherwise.

Define

τ_A = min{k > 0 : x_k ∈ A},

the first time that the chain visits the set A, known as the hitting time.

We also define a return time:

τ_A = min{k > 0 : x_k ∈ A, x_0 ∈ A}.
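The following Python sketch estimates both quantities by simulation. The 3-state transition matrix, the target set A = {2}, and the horizon T = 1000 are assumptions chosen for illustration, and the truncated sum only approximates the infinite-horizon occupation time.

import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 3-state transition matrix (an assumption for illustration):
P = np.array([[0.2, 0.5, 0.3],
              [0.3, 0.3, 0.4],
              [0.5, 0.2, 0.3]])

def sample_path(P, x0, T):
    # simulate x_0, ..., x_T from the transition matrix P
    path = [x0]
    for _ in range(T):
        path.append(rng.choice(len(P), p=P[path[-1]]))
    return path

A = {2}
path = sample_path(P, x0=0, T=1000)

# occupation time over the finite horizon: sum_{t=1}^T 1{x_t in A}
eta_A = sum(1 for x in path[1:] if x in A)

# hitting time: min{k > 0 : x_k in A}, or None if A is never reached
tau_A = next((k for k, x in enumerate(path) if k > 0 and x in A), None)

print(eta_A, tau_A)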

A Brief Discussion on Stopping Times

The variable τ_A defined above is an example of a stopping time:

Definition 3.1.1 A function τ from a measurable space (X^∞, B(X^∞)) to (N_+, B(N_+)) is a stopping time if for all n ∈ N_+, the event {τ = n} ∈ σ(x_0, x_1, x_2, ..., x_n); that is, the event is in the sigma-field generated by the random variables up to time n.

Any realistic decision takes place at a time which is measurable. For example, if an investor wants to stop investing when he stops profiting, he can stop at the time when the investment loses value; this is a stopping time. But if an investor claims to stop investing when the investment is at its peak, this is not a stopping time, because to find out whether the investment is at its peak, the next state value would need to be known, and this is not measurable in a causal fashion.
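The contrast can be expressed in code. In this sketch (the helper names and the sample path are hypothetical, written only to illustrate the measurability requirement), the hitting time is computable from the observed prefix x_0, ..., x_n alone, while the peak time needs the entire path, including future values.

def hitting_time(prefix, A):
    # tau_A is causal: whether tau_A = k is decided by x_0, ..., x_k alone
    for k, x in enumerate(prefix):
        if k > 0 and x in A:
            return k
    return None  # A not yet visited within this prefix

def peak_time(path):
    # argmax over the whole trajectory: deciding {peak = n} requires values
    # after time n, so this is NOT a stopping time
    return max(range(len(path)), key=lambda t: path[t])

path = [0, 2, 1, 3, 0, 2]
print(hitting_time(path, {3}))  # 3: decided as soon as x_3 is observed
print(peak_time(path))          # 3: but this needed the whole path to decide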

One important property of Markov chains is the so-called strong Markov property, which says the following: consider a Markov chain which evolves on a countable set. If we sample this chain according to a stopping time rule, the sampled chain, starting from the sampled instant, is again a Markov chain:

Proposition 3.1.1 For a (time-homogeneous) Markov chain in a countable state space X, the strong Markov property always holds. That is, if τ is a stopping time with P(τ < ∞) = 1, then, conditioned on x_τ, the sampled process (x_{τ+n})_{n ≥ 0} is itself a Markov chain with the same transition probabilities.
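Here is a minimal simulation sketch of this proposition, under an assumed 2-state transition matrix: we run the chain until the stopping time τ = min{k > 0 : x_k = 1}, then record one transition of the post-τ chain; its empirical law should match the original transition probabilities from state 1.

import numpy as np

rng = np.random.default_rng(1)

# Hypothetical 2-state transition matrix (an assumption for illustration):
P = np.array([[0.2, 0.8],
              [0.6, 0.4]])

def step(i):
    return rng.choice(2, p=P[i])

counts = np.zeros((2, 2))
for _ in range(20000):
    x = 0
    # run until the stopping time tau (first visit to state 1)
    while True:
        x = step(x)
        if x == 1:
            break
    # one transition of the sampled chain, starting from x_tau = 1
    y = step(x)
    counts[x, y] += 1

print(counts[1] / counts[1].sum())  # approximately P[1] = [0.6, 0.4]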
