CHAPTER 2: Markov Chains (part 3)
Example. Consider a Markov chain whose transition probability matrix is

          1  2  3
      1 [ 1  0  0 ]
  P = 2 [ α  β  γ ]
      3 [ 0  0  1 ]

where α, β, γ > 0 and α + β + γ = 1.

Let T = min{n ≥ 0 : X_n = 1 or X_n = 3}. Find

  u = P(X_T = 1 | X_0 = 2)   and   v = E(T | X_0 = 2).
Then, by the first step analysis,

  u = P(X_T = 1 | X_0 = 2, X_1 = 1) P(X_1 = 1 | X_0 = 2)
    + P(X_T = 1 | X_0 = 2, X_1 = 2) P(X_1 = 2 | X_0 = 2)
    + P(X_T = 1 | X_0 = 2, X_1 = 3) P(X_1 = 3 | X_0 = 2)
    = 1 · α + u · β + 0 · γ,

and so

  u = α / (1 − β) = α / (α + γ).
Similarly,

  v = E(T | X_0 = 2, X_1 = 1) P(X_1 = 1 | X_0 = 2)
    + E(T | X_0 = 2, X_1 = 2) P(X_1 = 2 | X_0 = 2)
    + E(T | X_0 = 2, X_1 = 3) P(X_1 = 3 | X_0 = 2)
    = [1 + 0] α + [1 + v] β + [1 + 0] γ
    = 1 + βv,

and so

  v = 1 / (1 − β).
In fact, in this example we can also calculate u and v directly: starting from state 2, the number of steps until the chain leaves state 2 is geometric with success probability 1 − β, so v = 1/(1 − β), and on leaving, the chain moves to 1 rather than 3 with probability α/(α + γ).
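The closed forms for u and v are easy to confirm by simulation; the sketch below uses the arbitrary illustrative values α = 0.2, β = 0.5, γ = 0.3 (not part of the example), for which u = 0.4 and v = 2.

```python
import random

def simulate(alpha, beta, gamma, trials=100_000, seed=0):
    """Estimate u = P(X_T = 1 | X_0 = 2) and v = E(T | X_0 = 2) by simulation."""
    rng = random.Random(seed)
    absorbed_at_1 = 0
    total_steps = 0
    for _ in range(trials):
        steps = 0
        while True:
            steps += 1
            r = rng.random()
            if r < alpha:            # jump 2 -> 1 (absorbed)
                absorbed_at_1 += 1
                break
            if r < alpha + beta:     # stay in state 2
                continue
            break                    # jump 2 -> 3 (absorbed)
        total_steps += steps
    return absorbed_at_1 / trials, total_steps / trials

u_hat, v_hat = simulate(0.2, 0.5, 0.3)
# theory: u = alpha/(alpha+gamma) = 0.4, v = 1/(1-beta) = 2.0
```

With 100,000 trials the estimates agree with the closed forms to about two decimal places.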
Example (A maze). A white rat is put into the following maze, a 3 × 3 grid of compartments:

  +---------+---------+---------+
  |    1    |    2    | 3 shock |
  +---------+---------+---------+
  |    4    |    5    |    6    |
  +---------+---------+---------+
  | 7 shock |    8    | 9 food  |
  +---------+---------+---------+
In the absence of learning, one might hypothesize that the rat moves through the maze at random: if there are k ways to leave a compartment, the rat chooses each of them with equal probability 1/k. Assume that the rat moves to some adjacent compartment at each unit of time. Let X_n be the compartment occupied at stage n. Suppose that compartment 9 contains food and compartments 3 and 7 contain electrical shocking mechanisms.
1. What is the transition probability matrix?

            1    2    3    4    5    6    7    8    9
      1 [  0   1/2   0   1/2   0    0    0    0    0  ]
      2 [ 1/3   0   1/3   0   1/3   0    0    0    0  ]
      3 [  0    0    1    0    0    0    0    0    0  ]
      4 [ 1/3   0    0    0   1/3   0   1/3   0    0  ]
  P = 5 [  0   1/4   0   1/4   0   1/4   0   1/4   0  ]
      6 [  0    0   1/3   0   1/3   0    0    0   1/3 ]
      7 [  0    0    0    0    0    0    1    0    0  ]
      8 [  0    0    0    0   1/3   0   1/3   0   1/3 ]
      9 [  0    0    0    0    0    0    0    0    1  ]
2. If the rat starts in compartment 1, what is the probability that the rat encounters the food before being shocked?

Let T = min{n ≥ 0 : X_n = 3, X_n = 7, or X_n = 9} and let u_i = P(X_T = 9 | X_0 = i), i.e. the probability that the rat is absorbed by the food compartment. Note that there are three absorbing states, 3, 7, and 9, and it is easy to see that

  u_3 = 0,  u_7 = 0,  u_9 = 1.
For the other states i = 1, 2, 4, 5, 6, 8, the first step analysis gives

  u_i = p_i1 u_1 + p_i2 u_2 + p_i3 u_3 + p_i4 u_4 + p_i5 u_5 + p_i6 u_6 + p_i7 u_7 + p_i8 u_8 + p_i9 u_9,

in detail (using u_3 = u_7 = 0 and u_9 = 1),

  u_1 = (1/2) u_2 + (1/2) u_4
  u_2 = (1/3) u_1 + (1/3) u_5
  u_4 = (1/3) u_1 + (1/3) u_5
  u_5 = (1/4) u_2 + (1/4) u_4 + (1/4) u_6 + (1/4) u_8
  u_6 = (1/3) u_5 + 1/3
  u_8 = (1/3) u_5 + 1/3

Solving this system, we finally have
  u_1 = 0.1429,  u_2 = 0.1429,  u_3 = 0,  u_4 = 0.1429,
  u_5 = 0.2857,  u_6 = 0.4286,  u_7 = 0,  u_8 = 0.4286,  u_9 = 1.
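As a numerical check (a sketch added here, built only from the matrix P above), the system can be solved with NumPy by extracting the transient block Q of P and the one-step probabilities b into the food compartment 9:

```python
import numpy as np

# Transient states 1, 2, 4, 5, 6, 8 of the maze chain.
states = [1, 2, 4, 5, 6, 8]
Q = np.array([
    [0,   1/2, 1/2, 0,   0,   0  ],   # from 1
    [1/3, 0,   0,   1/3, 0,   0  ],   # from 2
    [1/3, 0,   0,   1/3, 0,   0  ],   # from 4
    [0,   1/4, 1/4, 0,   1/4, 1/4],   # from 5
    [0,   0,   0,   1/3, 0,   0  ],   # from 6
    [0,   0,   0,   1/3, 0,   0  ],   # from 8
])
b = np.array([0, 0, 0, 0, 1/3, 1/3])  # P(i -> 9) for i in states

# u = Q u + b  =>  (I - Q) u = b
u = np.linalg.solve(np.eye(6) - Q, b)
print({s: round(float(x), 4) for s, x in zip(states, u)})
# -> {1: 0.1429, 2: 0.1429, 4: 0.1429, 5: 0.2857, 6: 0.4286, 8: 0.4286}
```

The exact values are u_1 = u_2 = u_4 = 1/7, u_5 = 2/7, and u_6 = u_8 = 3/7.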
3. If the rat starts in compartment 1, how many times on average does it visit room 5 before it is absorbed?

Let w_i5 = E[ Σ_{n=0}^{T−1} 1{X_n = 5} | X_0 = i ], the expected number of visits to room 5 before absorption. For the absorbing states, w_35 = w_75 = w_95 = 0, and for the transient states i = 1, 2, 4, 5, 6, 8 the first step analysis gives

  w_i5 = 1{i = 5} + p_i1 w_15 + p_i2 w_25 + p_i3 w_35 + p_i4 w_45 + p_i5 w_55 + p_i6 w_65 + p_i7 w_75 + p_i8 w_85 + p_i9 w_95

(the extra 1 appears only for i = 5, because the visit at time 0 is counted),
or, explicitly,

  w_15 = (1/2) w_25 + (1/2) w_45
  w_25 = (1/3) w_15 + (1/3) w_35 + (1/3) w_55
  w_35 = 0
  w_45 = (1/3) w_15 + (1/3) w_55 + (1/3) w_75
  w_55 = 1 + (1/4) w_25 + (1/4) w_45 + (1/4) w_65 + (1/4) w_85
  w_65 = (1/3) w_35 + (1/3) w_55 + (1/3) w_95
  w_75 = 0
  w_85 = (1/3) w_55 + (1/3) w_75 + (1/3) w_95
  w_95 = 0
[You can find the answers by solving this set of equations.]
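One way to obtain the answers (a sketch added for illustration) is via the fundamental matrix N = (I − Q)^(−1) of the transient block Q from part 2: its column for state 5 is exactly the vector of expected visit counts w_i5.

```python
import numpy as np

# Transient block of the maze chain over states 1, 2, 4, 5, 6, 8.
states = [1, 2, 4, 5, 6, 8]
Q = np.array([
    [0,   1/2, 1/2, 0,   0,   0  ],   # from 1
    [1/3, 0,   0,   1/3, 0,   0  ],   # from 2
    [1/3, 0,   0,   1/3, 0,   0  ],   # from 4
    [0,   1/4, 1/4, 0,   1/4, 1/4],   # from 5
    [0,   0,   0,   1/3, 0,   0  ],   # from 6
    [0,   0,   0,   1/3, 0,   0  ],   # from 8
])

# Fundamental matrix N = (I - Q)^(-1): N[i, j] is the expected number of
# visits to transient state j before absorption, starting from state i.
N = np.linalg.inv(np.eye(6) - Q)
w = N[:, states.index(5)]             # expected visits to room 5
print({s: round(float(x), 4) for s, x in zip(states, w)})
# -> {1: 0.8571, 2: 0.8571, 4: 0.8571, 5: 1.7143, 6: 0.5714, 8: 0.5714}
```

In particular, starting from compartment 1 the rat visits room 5 on average w_15 = 6/7 ≈ 0.857 times before absorption.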
4. If the rat starts in compartment 1, what is the probability that the rat is in compartment 4 just before it is shocked?

Let

  z_i = P(X_{T−1} = 4 | X_0 = i);

then the first step analysis gives

  z_1 = p_12 z_2 + p_14 z_4
  z_2 = p_21 z_1 + p_23 z_3 + p_25 z_5
  z_3 = 0
  z_4 = p_41 z_1 + p_45 z_5 + p_47 · 1
  z_5 = p_52 z_2 + p_54 z_4 + p_56 z_6 + p_58 z_8
  z_6 = p_63 z_3 + p_65 z_5 + p_69 z_9
  z_7 = 0
  z_8 = p_85 z_5 + p_87 · 0 + p_89 · 0
  z_9 = 0

Note the p_47 · 1 term: if the rat steps from 4 directly into the shock compartment 7, then X_{T−1} = 4.
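This system can also be solved numerically; the sketch below (added for illustration) writes the equations above as z = Cz + b over the unknowns z_1, z_2, z_4, z_5, z_6, z_8, with z_3 = z_7 = z_9 = 0.

```python
import numpy as np

# Row i encodes z_i = sum_j C[i, j] z_j + b_i from the first step equations.
# Unknown order: z1, z2, z4, z5, z6, z8.
C = np.array([
    [0,   1/2, 1/2, 0,   0,   0  ],   # z1
    [1/3, 0,   0,   1/3, 0,   0  ],   # z2
    [1/3, 0,   0,   1/3, 0,   0  ],   # z4
    [0,   1/4, 1/4, 0,   1/4, 1/4],   # z5
    [0,   0,   0,   1/3, 0,   0  ],   # z6
    [0,   0,   0,   1/3, 0,   0  ],   # z8
])
b = np.array([0, 0, 1/3, 0, 0, 0])    # the p_47 * 1 term in z4's equation

z = np.linalg.solve(np.eye(6) - C, b)
print(round(float(z[0]), 4))          # z1 = P(X_{T-1} = 4 | X_0 = 1) -> 0.3571
```

The answer for a start in compartment 1 is z_1 = 5/14 ≈ 0.357.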
Example (A model of fecundity). Changes in sociological patterns, such as increase in age at marriage, more remarriages after widowhood, and increased divorce rates, have profound effects on overall population growth rates. Here we attempt to model the life span of a female in a population in order to provide a framework for analyzing the effect of social changes on average fecundity.

A typical woman may be categorized as being in one of the following states:

  E_0: Prepuberty    E_1: Single    E_2: Married
  E_3: Divorced      E_4: Widowed   E_5: Died or emigrated from the population
Suppose the transition probability matrix is

            E_0  E_1  E_2  E_3  E_4  E_5
      E_0 [  0   0.9   0    0    0   0.1 ]
      E_1 [  0   0.5  0.4   0    0   0.1 ]
  P = E_2 [  0    0   0.6  0.2  0.1  0.1 ]
      E_3 [  0    0   0.4  0.5   0   0.1 ]
      E_4 [  0    0   0.4   0   0.5  0.1 ]
      E_5 [  0    0    0    0    0    1  ]
We are interested in the mean duration spent in state E_2, Married, since this corresponds to the state of maximum fecundity. Let w_i2 be the mean duration in state E_2 given that the initial state is E_i.
From the first step analysis, we have

  w_22 = 1 + p_21 w_12 + p_22 w_22 + p_23 w_32 + p_24 w_42 + p_25 w_52.

For the absorbing state E_5, w_52 = 0, and if i is neither state 2 nor the absorbing state 5,

  w_i2 = p_i1 w_12 + p_i2 w_22 + p_i3 w_32 + p_i4 w_42 + p_i5 w_52.

All together, we have
  w_02 = 0.9 w_12 + 0.1 w_52
  w_12 = 0.5 w_12 + 0.4 w_22 + 0.1 w_52
  w_22 = 1 + 0.6 w_22 + 0.2 w_32 + 0.1 w_42 + 0.1 w_52
  w_32 = 0.4 w_22 + 0.5 w_32 + 0.1 w_52
  w_42 = 0.4 w_22 + 0.5 w_42 + 0.1 w_52
  w_52 = 0
The solution is

  w_02 = 4.5,  w_12 = 5,  w_22 = 6.25,  w_32 = 5,  w_42 = 5,  w_52 = 0.

Each female, on average, spends w_02 = 4.5 periods in the childbearing state E_2 during her lifetime.
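A short numerical check of this solution (added for illustration): the expected durations w_i2 form the E_2-column of the fundamental matrix of the transient block of P.

```python
import numpy as np

# Transition matrix of the fecundity chain (states E0..E5).
P = np.array([
    [0, 0.9, 0,   0,   0,   0.1],
    [0, 0.5, 0.4, 0,   0,   0.1],
    [0, 0,   0.6, 0.2, 0.1, 0.1],
    [0, 0,   0.4, 0.5, 0,   0.1],
    [0, 0,   0.4, 0,   0.5, 0.1],
    [0, 0,   0,   0,   0,   1.0],
])

Q = P[:5, :5]                        # transient block (E0..E4)
N = np.linalg.inv(np.eye(5) - Q)     # fundamental matrix
w = N[:, 2]                          # expected periods spent in E2 (Married)
print([round(float(x), 4) for x in w])   # w_02, w_12, w_22, w_32, w_42
```

This reproduces w_02 = 4.5, w_12 = 5, w_22 = 6.25, w_32 = 5, w_42 = 5.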
Next, let z_i be the probability that a woman is single (in state E_1) just before she reaches E_5, given that the initial state is E_i; that is, z_i = P(X_{T−1} = 1 | X_0 = i), where T is the time of absorption into E_5. The first step analysis gives

  z_0 = p_01 z_1 + p_05 · 0
  z_1 = p_11 z_1 + p_12 z_2 + p_15 · 1
  z_2 = p_22 z_2 + p_23 z_3 + p_24 z_4 + p_25 · 0
  z_3 = p_32 z_2 + p_33 z_3 + p_35 · 0
  z_4 = p_42 z_2 + p_44 z_4 + p_45 · 0
  z_5 = 0

Solving (the third, fourth, and fifth equations force z_2 = z_3 = z_4 = 0), we have

  z_0 = 0.18,  z_1 = 0.2,  z_2 = 0,  z_3 = 0,  z_4 = 0,  z_5 = 0.
Example (A process with short-term memory, e.g. weather that depends on the past m days). We restrict the weather to two states, s: sunny and c: cloudy, and observe the daily sequence

  · · ·  s  c  s   s   c   s  · · ·
            X_{n−1} X_n X_{n+1}

Suppose that, given the weather in the previous two days, we can predict the weather on the following day as follows:

  sunny (yesterday) + sunny (today)   ⇒ sunny (tomorrow) with probability 0.8,
                                        cloudy (tomorrow) with probability 0.2;
  cloudy (yesterday) + sunny (today)  ⇒ sunny (tomorrow) with probability 0.6,
                                        cloudy (tomorrow) with probability 0.4;
  sunny (yesterday) + cloudy (today)  ⇒ sunny (tomorrow) with probability 0.4,
                                        cloudy (tomorrow) with probability 0.6;
  cloudy (yesterday) + cloudy (today) ⇒ sunny (tomorrow) with probability 0.1,
                                        cloudy (tomorrow) with probability 0.9.

Let X_n be the weather on the n-th day. Then the state space is

  S = {s, c}.
We have

  P(X_{n+1} = s | X_{n−1} = s, X_n = s) = 0.8,
  P(X_{n+1} = s | X_{n−1} = c, X_n = s) = 0.6.

Since these two probabilities differ, tomorrow's weather depends on more than today's state alone, and therefore {X_n} is not a Markov chain.
Let Y_n = (X_{n−1}, X_n) record the weather on the previous day and the current day. Then the state space for {Y_n} is

  S = {(s, s), (s, c), (c, s), (c, c)},

and {Y_n} is a Markov chain with transition probability matrix

              (s,s) (s,c) (c,s) (c,c)
      (s,s) [  0.8   0.2    0     0  ]
  P = (s,c) [   0     0    0.4   0.6 ]
      (c,s) [  0.6   0.4    0     0  ]
      (c,c) [   0     0    0.1   0.9 ]
Suppose that in the past two days the weather was (c, c). In how many days, on average, can we expect to have two successive sunny days?

To answer this, we define a new Markov chain {Z_n} by recording the weather on successive days and stopping as soon as there are two successive sunny days; that is, we make (s, s) absorbing. The transition probability matrix is then

              (s,s) (s,c) (c,s) (c,c)
      (s,s) [   1     0     0     0  ]
  P = (s,c) [   0     0    0.4   0.6 ]
      (c,s) [  0.6   0.4    0     0  ]
      (c,c) [   0     0    0.1   0.9 ]

Denote the states (s,s), (s,c), (c,s), and (c,c) by 1, 2, 3, and 4, respectively, and let v_i denote the expected number of days until we first have two successive sunny days, given Z_0 = i.
By the first step analysis, we have the following equations:

  v_1 = 0
  v_2 = 1 + p_21 v_1 + p_22 v_2 + p_23 v_3 + p_24 v_4
  v_3 = 1 + p_31 v_1 + p_32 v_2 + p_33 v_3 + p_34 v_4
  v_4 = 1 + p_41 v_1 + p_42 v_2 + p_43 v_3 + p_44 v_4

i.e.

  v_1 = 0
  v_2 = 1 + 0.4 v_3 + 0.6 v_4
  v_3 = 1 + 0.6 v_1 + 0.4 v_2
  v_4 = 1 + 0.1 v_3 + 0.9 v_4
Solving, we have

  v_2 = 13.3333,  v_3 = 6.3333,  v_4 = 16.3333.

Starting from (c, c), we therefore expect to wait v_4 ≈ 16.33 days for two successive sunny days.
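As a quick numerical check (an added sketch): with state 1 = (s, s) absorbing, the vector (v_2, v_3, v_4) solves (I − Q)v = 1 over the transient states.

```python
import numpy as np

# Transient block of the absorbing weather chain over the states
# 2 = (s,c), 3 = (c,s), 4 = (c,c); state 1 = (s,s) is absorbing.
Q = np.array([
    [0,   0.4, 0.6],   # from (s,c)
    [0.4, 0,   0  ],   # from (c,s)
    [0,   0.1, 0.9],   # from (c,c)
])

# v = 1 + Q v  =>  (I - Q) v = 1
v = np.linalg.solve(np.eye(3) - Q, np.ones(3))
print([round(float(x), 4) for x in v])   # v_2, v_3, v_4
```

The exact values are v_2 = 40/3, v_3 = 19/3, and v_4 = 49/3.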
Example (Understanding T). Consider a Markov chain whose transition probability matrix is

          1  2  3
      1 [ a  b  c ]
  P = 2 [ d  e  f ]
      3 [ 0  0  1 ]

The MC starts at time 0 with X_0 = 1. Let T = min{n ≥ 0 : X_n = 3}. Find P(X_3 = 1 | X_0 = 1, T > 3).
Example (Understanding one-step analysis). Consider a Markov chain whose transition probability matrix is

           1    2    3    4
      1 [  1    0    0    0  ]
  P = 2 [ 0.2  0.2  0.2  0.4 ]
      3 [ 0.2  0.3  0.4  0.1 ]
      4 [  0    0    0    1  ]

The MC starts at time 0 with X_0 = 2.

1. What is the probability that when the process is absorbed, it does so from state 2?

2. What is the probability that when the process is absorbed by state 4, it does so from state 2?