Pairwise Markov Random Fields and its Application in Textured ...


P[X = x] = \lambda \exp\Big[ -\sum_{(s,t)\ \mathrm{neighbors}} \varphi_1(x_s, x_t) - \sum_{s} \varphi_2(x_s) \Big]   (2.1)

where "(s,t) neighbors" means that the pixels s and t are neighbors and lie either on a common row or on a common column. The random field Y = (Y_s)_{s \in S} is the field of observations, and we assume that each Y_s takes its values in R. The distribution of (X,Y) is then defined by (2.1) and the distributions of Y conditional on X = x. Assuming that the random variables (Y_s) are independent conditionally on X, and that the distribution of each Y_s conditional on X = x is equal to its distribution conditional on X_s = x_s, we have:

P[Y = y \mid X = x] = \prod_{s} f_{x_s}(y_s)   (2.2)

where f_{x_s} is the density of the distribution of Y_s conditional on X_s = x_s. Thus:

P[X = x, Y = y] = \lambda \exp\Big[ -\sum_{(s,t)\ \mathrm{neighbors}} \varphi_1(x_s, x_t) - \sum_{s} \big[ \varphi_2(x_s) + \log f_{x_s}(y_s) \big] \Big]   (2.3)

So the distribution of the pairwise field (X,Y) is Markovian, and the distribution of X conditional on Y = y is still Markovian. It is then possible to simulate realizations of X according to its distribution conditional on Y = y, which allows the use of Bayesian segmentation techniques like MPM or MAP.

In practice, the random variables (Y_s) are not, in general, independent conditionally on X. In particular, (2.2) is too simple to allow one to take texture into account.
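The conditional-independence assumption (2.2) makes sampling Y given a class image x immediate: each y_s is drawn independently from the class-conditional density f_{x_s}. A minimal sketch, assuming (purely for illustration) two classes with Gaussian class densities; the means and variances below are hypothetical, since the model only requires some density f_{x_s} per class:

```python
import numpy as np

# Hypothetical class-conditional Gaussians f_k = N(mu_k, sigma_k^2).
MU = {0: 0.0, 1: 2.0}
SIGMA = {0: 1.0, 1: 0.5}

def sample_y_given_x(x, rng):
    """Sample Y | X = x under (2.2): each Y_s independent with density f_{x_s}."""
    y = np.empty(x.shape)
    for k in MU:
        mask = (x == k)                      # sites carrying class k
        y[mask] = rng.normal(MU[k], SIGMA[k], size=mask.sum())
    return y

x = np.array([[0, 0, 1], [1, 1, 0]])
y = sample_y_given_x(x, np.random.default_rng(0))
```

Because (2.2) factorizes over sites, no neighborhood information enters this draw; that is exactly the limitation the texture models below address.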
For instance, if we consider that the texture is a realization of a Gaussian Markov random field [CrJ83], (2.2) should be replaced with:

P[Y = y \mid X = x] = \lambda(x) \exp\Big[ -\sum_{(s,t)\ \mathrm{neighbors}} a_{x_s x_t} y_s y_t - \frac{1}{2} \sum_{s} \big[ a_{x_s x_s} y_s^2 + b_{x_s} y_s \big] \Big]   (2.4)

The field Y is then Markovian conditionally on X, which models textures. The drawback is that the product of (2.1) with (2.4) is not, in general, a Markov distribution. In fact, denoting by \Gamma(x) the covariance matrix of the Gaussian distribution of Y = (Y_s)_{s \in S} conditional on X = x, we have:

\lambda(x) = \big[ (2\pi)^N \det(\Gamma(x)) \big]^{-1/2}   (2.5)

which is not, in general, a Markov distribution with respect to x. Finally, X is Markovian and Y is Markovian conditionally on X, but neither (X,Y) nor X conditionally on Y is Markovian in general. This lack of posterior Markovianity invalidates the rigorous application of MPM or MAP.

2.2 Simple case of Pairwise Markov Field

To circumvent the difficulties above, we propose to consider the Markovianity of (X,Y). Specifically, we put:

P[X = x, Y = y] = \lambda \exp\Big[ -\sum_{(s,t)\ \mathrm{neighbors}} \varphi[(x_s, y_s), (x_t, y_t)] - \sum_{s} \varphi^{*}[(x_s, y_s)] \Big]
= \lambda \exp\Big[ -\sum_{(s,t)\ \mathrm{neighbors}} \big[ \varphi_1(x_s, x_t) + a_{x_s x_t} y_s y_t + b_{x_s x_t} y_s + c_{x_s x_t} y_t \big] - \sum_{s} \big[ \varphi_2(x_s) + a_{x_s x_s} y_s^2 + b_{x_s} y_s \big] \Big]   (2.6)

The Markovianity of the pairwise field (X,Y) implies the Markovianity of Y conditionally on X, and the Markovianity of X conditionally on Y. The first property allows one to model textures, as in (2.4), and the second one makes it possible to simulate X according to its posterior distribution, which allows us to use Bayesian segmentation methods like MPM or MAP.

Let us briefly specify how to simulate realizations of the pair (X,Y). The pair (X,Y) being Markovian, we can specify the distribution of each (X_s, Y_s) conditionally on its neighbors. Let us consider the calculation of the distribution of (X_s, Y_s) conditional on the four nearest neighbors:

[(X_{t_1}, Y_{t_1}), (X_{t_2}, Y_{t_2}), (X_{t_3}, Y_{t_3}), (X_{t_4}, Y_{t_4})] = [(x_{t_1}, y_{t_1}), (x_{t_2}, y_{t_2}), (x_{t_3}, y_{t_3}), (x_{t_4}, y_{t_4})]   (2.7)

This distribution can be written as:

h(x_s, y_s) = p(x_s) f_{x_s}(y_s)   (2.8)

where p is a probability on the set of classes and, for each class x_s, f_{x_s} is the density of the distribution of Y_s conditional on X_s = x_s (p and f_{x_s} also depend on (x_{t_1}, y_{t_1}), ..., (x_{t_4}, y_{t_4}), which are fixed in the following and so will be omitted). (2.8) makes the sampling of (X_s, Y_s) quite easy: one samples x_s according to p, and then y_s according to f_{x_s}. We thus have:

P\{ (X_s, Y_s) = (x_s, y_s) \mid [(X_{t_1}, Y_{t_1}), ..., (X_{t_4}, Y_{t_4})] = [(x_{t_1}, y_{t_1}), ..., (x_{t_4}, y_{t_4})] \}
\propto \exp\Big[ -\sum_{i=1,...,4} \varphi[(x_s, y_s), (x_{t_i}, y_{t_i})] - \varphi^{*}[(x_s, y_s)] \Big]
= \exp\Big[ -\sum_{i=1,...,4} \big[ \varphi_1(x_s, x_{t_i}) + a_{x_s x_{t_i}} y_s y_{t_i} + b_{x_s x_{t_i}} y_s + c_{x_s x_{t_i}} y_{t_i} \big] - \big[ \varphi_2(x_s) + a_{x_s x_s} y_s^2 + b_{x_s} y_s \big] \Big]   (2.9)
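The sampling scheme behind (2.8)–(2.9) can be sketched as a Gibbs sweep. Since the site energy in (2.9) is quadratic in y_s given x_s (with leading coefficient a_{x_s x_s} > 0), integrating y_s out analytically yields the class weight p(x_s), and the remaining factor in y_s is the Gaussian f_{x_s}. The sketch below assumes two classes and entirely hypothetical parameter values (a Potts-like \varphi_1, zero \varphi_2, and illustrative a_{kl}, b_{kl}, c_{kl}, b_k); the paper leaves these free:

```python
import numpy as np

K = 2                                           # number of classes (assumed)
PHI1 = lambda k, l: 0.0 if k == l else 1.0      # phi_1(x_s, x_t): Potts-like, assumed
PHI2 = lambda k: 0.0                            # phi_2(x_s), assumed
A = np.array([[0.5, 0.1], [0.1, 0.5]])          # a_{kl}; a_{kk} > 0 so y_s is integrable
B = np.zeros((K, K))                            # b_{kl}, assumed
C = np.zeros((K, K))                            # c_{kl}, assumed
b = np.array([0.0, -1.0])                       # b_k, assumed

def gibbs_sweep(x, y, rng):
    """One in-place sweep: resample (x_s, y_s) at each site given its 4 neighbors."""
    H, W = x.shape
    for i in range(H):
        for j in range(W):
            nbrs = [(i2, j2) for i2, j2 in ((i-1, j), (i+1, j), (i, j-1), (i, j+1))
                    if 0 <= i2 < H and 0 <= j2 < W]
            log_p, means, variances = np.empty(K), np.empty(K), np.empty(K)
            for k in range(K):
                # Energy in y_s is alpha*y^2 + beta*y + gamma, read off from (2.9).
                alpha = A[k, k]
                beta = b[k] + sum(A[k, x[t]] * y[t] + B[k, x[t]] for t in nbrs)
                gamma = PHI2(k) + sum(PHI1(k, x[t]) + C[k, x[t]] * y[t] for t in nbrs)
                # Integrating exp(-alpha*y^2 - beta*y - gamma) over y gives the class
                # weight p(k) of (2.8); the leftover factor in y is the Gaussian f_k.
                log_p[k] = -gamma + beta**2 / (4 * alpha) - 0.5 * np.log(alpha)
                means[k], variances[k] = -beta / (2 * alpha), 1.0 / (2 * alpha)
            p = np.exp(log_p - log_p.max())
            p /= p.sum()
            k = rng.choice(K, p=p)                                   # x_s ~ p
            x[i, j] = k
            y[i, j] = rng.normal(means[k], np.sqrt(variances[k]))    # y_s ~ f_{x_s}

rng = np.random.default_rng(0)
x = rng.integers(0, K, size=(8, 8))
y = rng.normal(size=(8, 8))
for _ in range(5):
    gibbs_sweep(x, y, rng)
```

The factorization in the inner loop is exactly (2.8): the class is drawn first from the marginalized weights, then the gray level from the class-conditional Gaussian, so each site update samples (X_s, Y_s) jointly from its local conditional (2.9).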
