Linear Algebra, Theory And Applications, 2012a

MARKOV CHAINS AND MIGRATION PROCESSES

is in state $i$. The probability that $X_{n+1}$ is in state $j$ given that $X_n$ is in state $i$ is called a one step transition probability. When this probability does not depend on $n$ it is called stationary, and this is the case of interest here. Since this probability does not depend on $n$, it can be denoted by $p_{ij}$. Here is a simple example called a random walk.

Example 11.3.1 Let there be $n$ points, $x_i$, and consider a process of something moving randomly from one point to another. Suppose $X_n$ is a sequence of random variables which has values $\{1, 2, \cdots, n\}$ where $X_n = i$ indicates the process has arrived at the $i^{th}$ point. Let $p_{ij}$ be the probability that $X_{n+1}$ has the value $j$ given that $X_n$ has the value $i$. Since $X_{n+1}$ must have some value, it must be the case that $\sum_j p_{ij} = 1$. Note this says that the sum over a row equals 1, so the situation is a little different from the above, in which the sum was over a column.

As an example, let $x_1, x_2, x_3, x_4$ be four points taken in order on $\mathbb{R}$ and suppose $x_1$ and $x_4$ are absorbing. This means that $p_{4k} = 0$ for all $k \neq 4$ and $p_{1k} = 0$ for all $k \neq 1$. Otherwise, you can move either to the left or to the right with probability $\frac{1}{2}$. The Markov matrix associated with this situation is
$$
\begin{pmatrix}
1 & 0 & 0 & 0 \\
.5 & 0 & .5 & 0 \\
0 & .5 & 0 & .5 \\
0 & 0 & 0 & 1
\end{pmatrix}
$$
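As a quick numerical check of this setup, the matrix above can be written down directly and its rows verified to sum to 1; this is a minimal sketch using NumPy, with states $1,\dots,4$ mapped to indices $0,\dots,3$:

```python
import numpy as np

# Transition matrix of the 4-point random walk with absorbing endpoints,
# as given in the text (states 1..4 correspond to indices 0..3).
P = np.array([
    [1.0, 0.0, 0.0, 0.0],   # state 1 is absorbing: p_{1k} = 0 for k != 1
    [0.5, 0.0, 0.5, 0.0],   # from state 2: move left or right with prob 1/2
    [0.0, 0.5, 0.0, 0.5],   # from state 3: move left or right with prob 1/2
    [0.0, 0.0, 0.0, 1.0],   # state 4 is absorbing: p_{4k} = 0 for k != 4
])

# Each row sums to 1, as required of a stochastic matrix in this convention.
print(P.sum(axis=1))  # -> [1. 1. 1. 1.]
```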

Definition 11.3.2 Let the stationary transition probabilities, $p_{ij}$, be defined above. The resulting matrix having $p_{ij}$ as its $ij^{th}$ entry is called the matrix of transition probabilities. The sequence of random variables for which these $p_{ij}$ are the transition probabilities is called a Markov chain. The matrix of transition probabilities is called a stochastic matrix.

The next proposition is fundamental and shows the significance of the powers of the matrix of transition probabilities.

Proposition 11.3.3 Let $p^n_{ij}$ denote the probability that $X_n$ is in state $j$ given that $X_0$ was in state $i$. Then $p^n_{ij}$ is the $ij^{th}$ entry of the matrix $P^n$ where $P = (p_{ij})$.

Proof: This is clearly true if $n = 1$ and follows from the definition of the $p_{ij}$. Suppose true for $n$. Then the probability that $X_{n+1}$ is at $j$ given that $X_0$ was at $i$ equals $\sum_k p^n_{ik} p_{kj}$ because $X_n$ must have some value, $k$, and so this represents all possible ways to go from $i$ to $j$. You can go from $i$ to 1 in $n$ steps with probability $p^n_{i1}$ and then from 1 to $j$ in one step with probability $p_{1j}$, and so the probability of this is $p^n_{i1} p_{1j}$; but you can also go from $i$ to 2 and then from 2 to $j$, and from $i$ to 3 and then from 3 to $j$, etc. Thus the sum of these is just what is given and represents the probability of $X_{n+1}$ having the value $j$ given $X_0$ has the value $i$. $\blacksquare$
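The proposition can also be checked empirically: simulating many $n$-step paths of the random walk and tabulating where they end should reproduce the corresponding row of $P^n$. Below is a sketch of that check, assuming the four-state absorbing walk from the example (the variable names and path count are illustrative choices, not from the text):

```python
import numpy as np

# Monte Carlo check of Proposition 11.3.3: the probability of being in
# state j after n steps, starting from state i, equals (P^n)_{ij}.
P = np.array([[1.0, 0.0, 0.0, 0.0],
              [0.5, 0.0, 0.5, 0.0],
              [0.0, 0.5, 0.0, 0.5],
              [0.0, 0.0, 0.0, 1.0]])

rng = np.random.default_rng(0)
n_steps, n_paths, start = 5, 200_000, 1   # start in state 2 (index 1)

# Simulate n_paths independent n_steps-step paths of the chain.
states = np.full(n_paths, start)
for _ in range(n_steps):
    # Inverse-CDF sampling: draw each path's next state from the row of P
    # belonging to its current state.
    u = rng.random(n_paths)
    cum = P.cumsum(axis=1)[states]
    states = (u[:, None] >= cum).sum(axis=1)

empirical = np.bincount(states, minlength=4) / n_paths
exact = np.linalg.matrix_power(P, n_steps)[start]
print(np.abs(empirical - exact).max())  # small; shrinks as n_paths grows
```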

In the above random walk example, let's take a power of the transition probability matrix to determine what happens. Rounding off to two decimal places,
$$
\begin{pmatrix}
1 & 0 & 0 & 0 \\
.5 & 0 & .5 & 0 \\
0 & .5 & 0 & .5 \\
0 & 0 & 0 & 1
\end{pmatrix}^{20}
=
\begin{pmatrix}
1 & 0 & 0 & 0 \\
.67 & 9.5 \times 10^{-7} & 0 & .33 \\
.33 & 0 & 9.5 \times 10^{-7} & .67 \\
0 & 0 & 0 & 1
\end{pmatrix}.
$$

Thus $p^{20}_{21}$ is about 2/3 while $p^{20}_{31}$ is about 1/3, and terms like $p^{20}_{22}$ are very small. You see this seems to be converging to the matrix
$$
\begin{pmatrix}
1 & 0 & 0 & 0 \\
\frac{2}{3} & 0 & 0 & \frac{1}{3} \\
\frac{1}{3} & 0 & 0 & \frac{2}{3} \\
0 & 0 & 0 & 1
\end{pmatrix}.
$$
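The computation above is easy to reproduce: raising the matrix to the 20th power and comparing with the conjectured limit shows the difference is already on the order of $10^{-6}$. A minimal sketch with NumPy:

```python
import numpy as np

# Reproduce the text's computation: the 20th power of the random-walk
# transition matrix, compared against the conjectured limiting matrix.
P = np.array([[1.0, 0.0, 0.0, 0.0],
              [0.5, 0.0, 0.5, 0.0],
              [0.0, 0.5, 0.0, 0.5],
              [0.0, 0.0, 0.0, 1.0]])

P20 = np.linalg.matrix_power(P, 20)

limit = np.array([[1,   0, 0, 0  ],
                  [2/3, 0, 0, 1/3],
                  [1/3, 0, 0, 2/3],
                  [0,   0, 0, 1  ]])

print(np.round(P20, 2))
print(np.abs(P20 - limit).max())  # on the order of 1e-6
```

The small surviving entries like $(P^{20})_{22} = 2^{-20} \approx 9.5 \times 10^{-7}$ are exactly the probability of still bouncing between the two interior states after 20 steps.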
