Linear Algebra

Topic: Markov Chains

Here is a simple game: a player bets on coin tosses, a dollar each time, and the game ends either when the player has no money or is up to five dollars. If the player starts with three dollars, what is the chance that the game takes at least five flips? Twenty-five flips?

At any point, this player has either $0, or $1, \ldots, or $5. We say that the player is in the state $s_0$, $s_1$, \ldots, or $s_5$. In the game the player moves from state to state. For instance, a player now in state $s_3$ has on the next flip a $0.5$ chance of moving to state $s_2$ and a $0.5$ chance of moving to $s_4$. The boundary states are a bit different; a player never leaves state $s_0$ or state $s_5$.

Let $p_i(n)$ be the probability that the player is in state $s_i$ after $n$ flips. Then, for instance, we have that the probability of being in state $s_0$ after flip $n+1$ is $p_0(n+1) = p_0(n) + 0.5 \cdot p_1(n)$. This matrix equation summarizes.
\[
\begin{pmatrix}
1.0 & 0.5 & 0.0 & 0.0 & 0.0 & 0.0 \\
0.0 & 0.0 & 0.5 & 0.0 & 0.0 & 0.0 \\
0.0 & 0.5 & 0.0 & 0.5 & 0.0 & 0.0 \\
0.0 & 0.0 & 0.5 & 0.0 & 0.5 & 0.0 \\
0.0 & 0.0 & 0.0 & 0.5 & 0.0 & 0.0 \\
0.0 & 0.0 & 0.0 & 0.0 & 0.5 & 1.0
\end{pmatrix}
\begin{pmatrix}
p_0(n) \\ p_1(n) \\ p_2(n) \\ p_3(n) \\ p_4(n) \\ p_5(n)
\end{pmatrix}
=
\begin{pmatrix}
p_0(n+1) \\ p_1(n+1) \\ p_2(n+1) \\ p_3(n+1) \\ p_4(n+1) \\ p_5(n+1)
\end{pmatrix}
\]
With the initial condition that the player starts with three dollars, these are the components of the resulting vectors.

    n = 0   n = 1   n = 2   n = 3    n = 4    ...  n = 24
    0       0       0       0.125    0.125         0.39600
    0       0       0.25    0        0.1875        0.00276
    0       0.5     0       0.375    0             0
    1       0       0.5     0        0.3125        0.00447
    0       0.5     0       0.25     0             0
    0       0       0.25    0.25     0.375         0.59676

This exploration suggests that the game is not likely to go on for long, with the player quickly moving to an ending state. For instance, after the fourth flip there is a $0.50$ probability that the game is already over, since $p_0(4) + p_5(4) = 0.125 + 0.375 = 0.50$.
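The vectors in the table can be produced by repeatedly applying the matrix to the starting distribution. Here is a minimal sketch of that iteration in Python with NumPy (the variable names `M` and `p` are our own choices, not from the text); it advances the start-with-three-dollars vector four flips and reads off the probability that the game has already ended.

```python
import numpy as np

# Transition matrix for the coin-toss game.  Column j gives the
# probabilities of where a player holding j dollars moves next;
# states s_0 and s_5 are absorbing (the game is over).
M = np.array([
    [1.0, 0.5, 0.0, 0.0, 0.0, 0.0],
    [0.0, 0.0, 0.5, 0.0, 0.0, 0.0],
    [0.0, 0.5, 0.0, 0.5, 0.0, 0.0],
    [0.0, 0.0, 0.5, 0.0, 0.5, 0.0],
    [0.0, 0.0, 0.0, 0.5, 0.0, 0.0],
    [0.0, 0.0, 0.0, 0.0, 0.5, 1.0],
])

p = np.zeros(6)
p[3] = 1.0            # the player starts with three dollars

for n in range(4):    # four flips: p(n+1) = M p(n)
    p = M @ p

print(p)              # [0.125, 0.1875, 0, 0.3125, 0, 0.375]
print(p[0] + p[5])    # 0.5: probability the game is over by flip 4
```

Since the chance the game has ended within four flips is $0.50$, the chance it takes at least five flips is $1 - 0.50 = 0.50$; iterating further answers the twenty-five-flip question the same way.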
