Distributing labels on infinite trees

called $A^{n+1}_i$. Now, each sub-tree $A^{n+1}_i$ is composed of a root and $d$ factors of size $n$, in the set $\{A^n_1, \ldots, A^n_p\}$. In turn, they are all prolonged into trees of size $n$ in a unique way. Therefore, $P(n+2) = p$. By a direct induction, $P(k) = p$ for all $k \geq n$.

4 implies 1: If the number of factors of size $n$ is smaller than $B$ for all $n$, then the number of equivalence classes for $\equiv_n$ is smaller than $B$ for all $n$, which means that $G(T)$ has fewer than $B$ nodes.

3.1 Density of rational trees

Let $T$ be a rational tree and let $G(T)$ be its minimal multi-graph. The nodes of $G(T)$ are numbered $v_1, \ldots, v_K$, with $v_1$ corresponding to the root of $T$.

$G(T)$ can be seen as the transition kernel of a Markov chain by considering each arc of $G(T)$ as a transition with probability $1/d$.

If $G(T)$ is irreducible, then the Markov chain admits a unique stationary measure $\pi$ on its nodes. The density of $T$ and the stationary measure $\pi$ are related by the following theorem.

Theorem 3.2. Let $T$ be an irreducible rational tree with a minimal multigraph $G(T)$ with $K$ nodes. Let $l = (l_1, \ldots, l_K)$ be the labels of the nodes of $G(T)$ and let $\pi = (\pi_1, \ldots, \pi_K)$ be the stationary measure over the nodes of $G(T)$.

If $T$ is aperiodic, then $T$ admits a density $\alpha = \pi l^t$.

If $T$ is periodic with period $p$, then $T$ admits an average density $\alpha = \pi l^t$.

Proof. Let $V_n$ be a Markov chain corresponding to $G(T)$. Since $G(T)$ is irreducible, $V_n$ admits a unique stationary measure, say $\pi = (\pi_1, \ldots, \pi_K)$.
Let us call $P$ the kernel of this Markov chain: $P_{i,j} = a/d$ if there are $a$ arcs in $G(T)$ from $v_i$ to $v_j$.

Now, let us consider all the paths of length $n$ in $T$, starting from an arbitrary node $v_i$. By construction of $G(T)$, the number of paths that end up in the nodes $v_1, \ldots, v_K$ of $G(T)$, respectively, is given by the vector $e_i d^n P^n$, where $e_i$ is the vector with all its coordinates equal to $0$ except the $i$-th coordinate, equal to $1$.

Now, the number of ones in the tree of height $n$ starting in $v_i$ is $h_n(v_i) = e_i \sum_{k=0}^{n-1} d^k P^k l^t$.

Let us first consider the case where $P$ is aperiodic. We denote by $\Pi$ the matrix with all its lines equal to the stationary measure $\pi$, and by $D_k$ the matrix $P^k - \Pi$. When $P$ is aperiodic, $\lim_{k\to\infty} \|D_k\|_1 = 0$. Therefore, for all $k > n$, $\|D_k\|_1 < \epsilon_n$ with $\epsilon_n \to 0$.

Then the density of ones $d_{2n}(v_i) = \frac{d-1}{d^{2n}-1} h_{2n}(v_i)$ can be estimated by splitting the factors of size $2n$ into a factor of size $n$ at the root and $d$ factors of size $n$. One gets

$$d_{2n}(v_i) = \frac{d-1}{d^{2n}-1}\, e_i \sum_{k=0}^{n} d^k P^k l^t + \frac{d-1}{d^{2n}-1}\, e_i \sum_{k=n+1}^{2n-1} d^k P^k l^t$$

$$= \frac{d-1}{d^{2n}-1}\, e_i \left( \sum_{k=0}^{n} d^k P^k + \sum_{k=n+1}^{2n-1} d^k D_k + \sum_{k=n+1}^{2n-1} d^k \Pi \right) l^t.$$

When $n$ goes to infinity, the first term goes to $0$ because $e_i \sum_{k=0}^{n} d^k P^k l^t \leq d^{n+1}$. As for the second term, $\frac{d-1}{d^{2n}-1}\, e_i \sum_{k=n+1}^{2n-1} d^k D_k l^t \leq \frac{1}{d^{2n}-1} d^{2n} \epsilon_n$; this goes to $0$ when $n$ goes to infinity. As for the last term, $\frac{d-1}{d^{2n}-1}\, e_i \sum_{k=n+1}^{2n-1} d^k \Pi l^t = \frac{1}{d^{2n}-1}(d^{2n} - d^{n+1})(e_i \Pi) l^t$; this goes to $\pi l^t$ when $n$ goes to infinity, since $e_i \Pi = \pi$.

The same holds when computing the density of trees of size $2n+1$, by splitting them into the first $n+1$ levels and the last $n$ levels.

This shows that the rooted density of all the trees in $T$ is the same, equal to $\pi l^t$.
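As a sanity check, the construction in the proof can be simulated numerically. The sketch below is a minimal illustration, not taken from the paper: the multigraph `arcs`, the 0/1 label vector `l`, and the arity `d = 2` are made-up assumptions chosen to give an irreducible aperiodic example. It builds the kernel $P$ (with $P_{i,j} = a/d$ for $a$ arcs from $v_i$ to $v_j$), computes the stationary measure $\pi$, and checks that the rooted density $\frac{d-1}{d^n-1} h_n(v_i)$ approaches $\pi l^t$ from every starting node.

```python
import numpy as np

d = 2  # arity: every node of the tree has d children (assumed example)

# Example multigraph G(T): arcs[i] lists the d children of node v_i
# (multi-arcs allowed).  This graph is irreducible and aperiodic
# (self-loop at v_2).
arcs = {0: [1, 1], 1: [0, 2], 2: [2, 0]}
l = np.array([1.0, 0.0, 1.0])  # assumed 0/1 labels of the nodes of G(T)
K = len(arcs)

# Kernel: P[i, j] = a/d where a is the number of arcs from v_i to v_j.
P = np.zeros((K, K))
for i, children in arcs.items():
    for j in children:
        P[i, j] += 1.0 / d

# Stationary measure pi: left eigenvector of P for eigenvalue 1, normalized.
w, V = np.linalg.eig(P.T)
pi = np.real(V[:, np.argmin(np.abs(w - 1.0))])
pi /= pi.sum()
alpha = float(pi @ l)  # predicted density  pi l^t

# h_n(v_i) = e_i sum_{k=0}^{n-1} d^k P^k l^t, computed iteratively for all i.
n = 40
h = np.zeros(K)
term = np.eye(K)  # holds d^k P^k
for _ in range(n):
    h += term @ l
    term = d * (term @ P)

density = (d - 1) / (d**n - 1) * h  # rooted density of the height-n trees
print(density, alpha)
```

For this example $\pi = (1/3, 1/3, 1/3)$ and $\pi l^t = 2/3$; all coordinates of `density` agree with `alpha` to within roughly $n\,\epsilon_n \approx 40/2^{40}$, matching the error terms in the proof.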
