Lecture Notes - Department of Mathematics and Statistics - Queen's ...
CHAPTER 4. MARTINGALES AND FOSTER-LYAPUNOV CRITERIA FOR STABILIZATION OF MARKOV CHAINS
Corollary 4.2.1 In particular, under the conditions of Theorem 4.2.3,
\[
\int_S \pi(dx) f(x) \;=\; \int_A \pi(dx)\, E_x\Big[ \sum_{t=0}^{\tau_A - 1} f(x_t) \Big] \;\le\; \int_A \pi(dx)\, E_x[ V(x) + b ],
\]
which lets one obtain a bound on the expectation of f(x) under the invariant measure.
See Chapter 14 of [23] for further details.
We need to ensure, however, that an invariant measure exists; this is why we require that f : X → [1, ∞), where 1 can be replaced with any positive number.
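The bound in Corollary 4.2.1 can be checked numerically on a small chain. The following sketch is my own illustration (not from the notes): it assumes the drift condition of Theorem 4.2.3 takes the usual form P V ≤ V − f + b 1_A on a set A, verifies that condition pointwise for a 5-state birth-death chain, computes the invariant distribution π, and confirms π(f) ≤ ∫_A π(dx)(V(x) + b).

```python
import numpy as np

# A 5-state birth-death chain with downward drift: from x > 0 move down
# w.p. 0.7 and up w.p. 0.3 (reflecting at both boundaries).
n = 5
P = np.zeros((n, n))
P[0, 0], P[0, 1] = 0.7, 0.3
for x in range(1, n - 1):
    P[x, x - 1], P[x, x + 1] = 0.7, 0.3
P[n - 1, n - 2], P[n - 1, n - 1] = 0.7, 0.3

# Illustrative choices (not from the notes): Lyapunov function, cost f, set A.
V = 3.5 * np.arange(n)           # V(x) = 3.5 x
f = 1.0 + np.arange(n) / 10.0    # f : X -> [1, infinity)
b = 2.5
A = np.array([True] + [False] * (n - 1))  # A = {0}

# Assumed drift condition of Theorem 4.2.3: P V <= V - f + b 1_A, pointwise.
assert np.all(P @ V <= V - f + b * A + 1e-12)

# Invariant distribution: left eigenvector of P for eigenvalue 1.
w, vecs = np.linalg.eig(P.T)
pi = np.real(vecs[:, np.argmax(np.real(w))])
pi /= pi.sum()

# Corollary 4.2.1: pi(f) is bounded by the integral of V + b over A.
print(pi @ f, pi[A] @ (V[A] + b))  # left value should not exceed the right
assert pi @ f <= pi[A] @ (V[A] + b)
```

The drift pushes the chain toward 0, so π concentrates near A = {0} and the bound has slack; tightening f or b shows how the two sides move together.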
4.2.3 Criterion for Recurrence
Theorem 4.2.4 (Foster-Lyapunov for Recurrence) Let S be a compact set, b < ∞, and V(·) be an inf-compact functional on X, that is, for all α ∈ R_+ the set {x : V(x) ≤ α} is compact. Let {x_n} be an irreducible Markov chain on X. If the following is satisfied:
\[
\int_X P(x, dy) V(y) \;\le\; V(x) + b 1_{\{x \in S\}}, \qquad \forall x \in X, \tag{4.6}
\]
then, with τ_S = min{t > 0 : x_t ∈ S}, we have P_x(τ_S < ∞) = 1 for all x ∈ X; that is, the Markov chain is Harris recurrent.
Proof: Let τ_S = min{t > 0 : x_t ∈ S}. Define two stopping times: τ_S and τ_{B_N}, where B_N = {x : V(x) ≥ N}. Note that the sequence defined by M_t = V(x_t) (which behaves as a supermartingale for t = 0, 1, ..., min(τ_S, τ_{B_N}) − 1) is uniformly integrable until τ_{B_N}, and a variation of the optional sampling theorem applies for stopping times which are not necessarily bounded by a given finite number with probability one. Note that, due to irreducibility, min(τ_S, τ_{B_N}) < ∞ with probability 1. Now, for x ∉ S ∪ B_N, since upon exiting into B_N the minimum value of the Lyapunov function is N,
\[
V(x) \;=\; E_x\big[ V(x_{\min(\tau_S, \tau_{B_N})}) \big] \;\ge\; P_x(\tau_{B_N} < \tau_S)\, N + P_x(\tau_{B_N} \ge \tau_S)\, M
\]
for some finite positive M. Hence,
\[
P_x(\tau_{B_N} < \tau_S) \;\le\; V(x)/N.
\]
We also have that P_x(min(τ_S, τ_{B_N}) = ∞) = 0, since the chain is irreducible and will escape any compact set in finite time. As a consequence,
\[
P_x(\tau_S = \infty) \;\le\; P_x(\tau_{B_N} < \tau_S) \;\le\; V(x)/N,
\]
and taking the limit as N → ∞, P_x(τ_S = ∞) = 0. ⊓⊔
If S is furthermore petite, then once the petite set is visited, any other set with positive measure (under the irreducibility measure) is visited infinitely often with probability 1. ⋄
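The drift condition (4.6) and the key bound in the proof can be illustrated concretely. The sketch below is my own example (not from the notes): a reflected random walk on the nonnegative integers with negative mean increment satisfies (4.6) with V(x) = x, S = {0}, and b = 0.3, and a Monte Carlo run checks that the empirical probability of reaching B_N = {y : y ≥ N} before S stays below V(x)/N.

```python
import random

random.seed(0)

# Reflected random walk on {0, 1, 2, ...}: x_{t+1} = max(x_t + w_t, 0), with
# P(w = -1) = 0.5, P(w = 0) = 0.2, P(w = +1) = 0.3, so E[w] = -0.2 < 0.
def step(x):
    u = random.random()
    w = -1 if u < 0.5 else (0 if u < 0.7 else 1)
    return max(x + w, 0)

# With V(x) = x and S = {0}, condition (4.6) holds with b = 0.3:
# for x >= 1, E[V(x_{t+1})] = x - 0.2 <= x; for x = 0, E[V(x_1)] = 0.3 <= b.

# Monte Carlo check of the proof's bound P_x(tau_{B_N} < tau_S) <= V(x)/N,
# with starting point x0 = 5, N = 20, B_N = {y : y >= N}.
x0, N, trials, hits = 5, 20, 5000, 0
for _ in range(trials):
    x = x0
    while 0 < x < N:
        x = step(x)
    hits += (x >= N)
print(hits / trials, x0 / N)  # empirical probability vs. the bound V(x0)/N
```

Because the downward drift is strong, the empirical probability is far below the crude bound V(x0)/N = 0.25; the bound is only tight in the limit of vanishing drift.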
4.2.4 On small and petite sets
Establishing petiteness directly may be difficult. In the following, we present two conditions that may be used to establish petiteness.
By [23], p. 131: For a Markov chain with transition kernel P and K a probability measure on the natural numbers, if for every E ∈ B(X) there exists a lower semi-continuous function N(·, E) such that
\[
\sum_{n=0}^{\infty} P^n(x, E) K(n) \;\ge\; N(x, E),
\]
for a sub-stochastic kernel N(·, ·), then the chain is called a T-chain.
Theorem 4.2.5 [23] For an irreducible T-chain, every compact set is petite.
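A standard way to verify the T-chain property (a common textbook example, not taken from [23] verbatim) is through a continuous transition density:

```latex
\paragraph{Example (T-chain via a continuous density).}
Suppose $P(x, dy) = p(x, y)\, dy$ on $\mathsf{X} = \mathbb{R}^d$, with
$x \mapsto p(x, y)$ continuous for each fixed $y$. Take $K = \delta_1$
(all mass on $n = 1$) and
\[
  N(x, E) := \int_E p(x, y)\, dy, \qquad E \in \mathcal{B}(\mathsf{X}).
\]
Then $\sum_{n \ge 0} P^n(x, E) K(n) = P(x, E) = N(x, E)$, and by Fatou's lemma,
for any sequence $x_k \to x$,
\[
  \liminf_{k \to \infty} N(x_k, E)
  \;\ge\; \int_E \liminf_{k \to \infty} p(x_k, y)\, dy \;=\; N(x, E),
\]
so $N(\cdot, E)$ is lower semi-continuous and the chain is a $T$-chain.
If the chain is also irreducible, Theorem 4.2.5 then gives that every
compact set is petite.
```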