
Pairwise Markov Random Fields and its Application in Textured Images Segmentation

Wojciech Pieczynski and Abdel-Nasser Tebbache
Département Signal et Image
Institut National des Télécommunications
9, rue Charles Fourier, 91000 Evry, France
E-mail: Wojciech.Pieczynski@int-evry.fr

Abstract

The use of random fields, which allows one to take into account the spatial interaction among random variables in complex systems, is a frequent tool in numerous problems of statistical image processing, like segmentation or edge detection. In statistical image segmentation, the model is generally defined by the probability distribution of the class field, which is assumed to be a Markov field, and the probability distributions of the observations field conditional to the class field. In such models the segmentation of textured images is difficult to perform and one has to resort to some model approximations. The originality of our contribution is to consider the Markovianity of the pair (class field, observations field). We obtain a different model; in particular, the class field is not necessarily a Markov field. The proposed model makes possible the use of Bayesian methods like MPM or MAP to segment textured images with no model approximations. In addition, the textured images can be corrupted with correlated noise. Some first simulations to validate the proposed model are also presented.

1. Introduction

We propose in this paper a new Pairwise Markov Random Field model (PMRF), which is original with respect to the classical Hidden Markov Random Field model (HMRF) [ChJ93]. The main difference is that in a PMRF the class field is not necessarily a Markov field. One advantage, which will be developed in the following, is that textured images can be segmented without any model approximation.

To be more precise, let $S$ be the set of pixels, $X = (X_s)_{s \in S}$ the random field of classes, and $Y = (Y_s)_{s \in S}$ the random field of observations. The problem of segmentation is then the problem of estimating the realization of the field $X$ from the observed realization of the field $Y$. Prior to considering any statistical segmentation method, one must define the probability distribution $P_{(X,Y)}$ of the random field $(X,Y)$.

In the classical HMRF model this distribution is given by the distribution of $X$, which is a Markov field, and the set $P_Y^{X=x}$ of the distributions of $Y$ conditional to $X = x$. Under some assumptions on $P_Y^{X=x}$, the posterior distribution $P_X^{Y=y}$ of $X$ is a Markov distribution and different Bayesian segmentation techniques like MPM [MMP87], MAP [GeG84], or ICM [Bes86] can be applied.
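For reference, since these two estimators are used throughout the paper, we recall their standard definitions (this reminder is our addition, not part of the original text): MPM maximizes the posterior marginal at each pixel, while MAP maximizes the global posterior:

$$\hat{x}_s^{\mathrm{MPM}} = \arg\max_{\omega \in \Omega} P[X_s = \omega \mid Y = y] \ \text{ for each } s \in S, \qquad \hat{x}^{\mathrm{MAP}} = \arg\max_{x} P[X = x \mid Y = y].$$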
These assumptions can turn out to be difficult to justify when dealing with textured images, as detailed in [DeE87], [KDH88], [WoD92].

The idea of the PMRF model we propose is to consider directly the Markovianity of the pairwise random field $Z = (X,Y)$. The distribution of $X$ is then the marginal distribution of $P_{(X,Y)}$ and thus it is not necessarily a Markovian distribution. What counts is that $P_X^{Y=y}$ remains Markovian, so that the different Bayesian segmentation methods specified above can be used. When the particular problem of segmenting textured images corrupted with correlated noise is considered, the PMRF model is quite suitable because these Bayesian segmentation methods can be applied without any model approximation.

The organization of the paper is as follows. In the next section we specify the difference between HMRF and PMRF in the case of simple Gaussian noise fields. Section 3 is devoted to some simulations, and some concluding remarks are presented in Section 4.

2. Pairwise Markov Random Field and textured image segmentation

2.1 Simple case of Hidden Markov Field

Let us consider the set of pixels $S$ and a random field $X = (X_s)_{s \in S}$, each random variable $X_s$ taking its values in the class set $\Omega = \{\omega_1, \omega_2\}$. The field $X$ is Markovian with respect to the four nearest neighbors if its distribution is written


$$P[X = x] = \lambda \exp\Big[-\sum_{(s,t)\ \text{neighbors}} \varphi_1(x_s, x_t) - \sum_{s} \varphi_2(x_s)\Big] \qquad (2.1)$$

where "(s,t) neighbors" means that the pixels $s$ and $t$ are neighbors and lie either on a common row or on a common column. The random field $Y = (Y_s)_{s \in S}$ is the field of observations and we assume that each $Y_s$ takes its values in $\mathbb{R}$. The distribution of $(X,Y)$ is then defined by (2.1) and the distributions of $Y$ conditional on $X = x$. Assuming that the random variables $(Y_s)$ are independent conditionally on $X$, and that the distribution of each $Y_s$ conditional on $X = x$ is equal to its distribution conditional on $X_s = x_s$, we have:

$$P[Y = y \mid X = x] = \prod_{s} f_{x_s}(y_s) \qquad (2.2)$$

where $f_{x_s}$ is the density of the distribution of $Y_s$ conditional on $X_s = x_s$. Thus:

$$P[X = x, Y = y] = \lambda \exp\Big[-\sum_{(s,t)\ \text{neighbors}} \varphi_1(x_s, x_t) - \sum_{s} \big(\varphi_2(x_s) - \log f_{x_s}(y_s)\big)\Big] \qquad (2.3)$$

So the distribution of the pairwise field $(X,Y)$ is Markovian, and the distribution of $X$ conditional on $Y = y$ is still Markovian. It is then possible to simulate realizations of $X$ according to its distribution conditional on $Y = y$, which affords the use of Bayesian segmentation techniques like MPM or MAP.

In practice, the random variables $(Y_s)$ are not, in general, independent conditionally on $X$. In particular, (2.2) is too simple to allow one to take texture into account. For instance, if we consider that a texture is the realization of a Gaussian Markov random field [CrJ83], (2.2) should be replaced with:

$$P[Y = y \mid X = x] = \lambda(x) \exp\Big[-\sum_{(s,t)\ \text{neighbors}} a_{x_s x_t} y_s y_t - \frac{1}{2}\sum_{s} \big(a_{x_s x_s} y_s^2 + b_{x_s} y_s\big)\Big] \qquad (2.4)$$

The field $Y$ is then Markovian conditionally on $X$, which models textures. The drawback is that the product of (2.1) with (2.4) is not, in general, a Markov distribution. In fact, denoting by $\Gamma(x)$ the covariance matrix of the Gaussian distribution of $Y = (Y_s)_{s \in S}$ conditional on $X = x$, and with $N = \mathrm{card}(S)$, the normalizing constant in (2.4) is:

$$\lambda(x) = \big[(2\pi)^N \det \Gamma(x)\big]^{-1/2} \qquad (2.5)$$

which is not, in general, of Markovian form with respect to $x$. Finally, $X$ is Markovian and $Y$ is Markovian conditionally on $X$, but neither $(X,Y)$ nor $X$ conditionally on $Y$ is Markovian in general. This lack of posterior Markovianity invalidates the rigorous application of MPM or MAP.
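Returning to the simple non-textured case: a minimal illustrative sketch (ours, not the authors' code) of one Gibbs update of a site of $X$ from its posterior conditional distribution under (2.3), assuming two classes, $\varphi_2 \equiv 0$, the $\pm 1$ potential $\varphi_1$ used in Section 3, and Gaussian densities $f_{\omega_k}$ with per-class means and standard deviations:

```python
import numpy as np

def gibbs_update_site_hmrf(x, y, s, means, sigmas, rng):
    """One Gibbs update of the class x[s] under the simple HMRF (2.3),
    i.e. a draw from P[X_s = . | x on the other sites, Y = y].

    Assumptions of this sketch: two classes, phi_2 = 0, and
    phi_1(x_s, x_t) = -1 if x_s = x_t, +1 otherwise.
    """
    i, j = s
    h, w = x.shape
    nbs = [(i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1)]
    nbs = [(u, v) for u, v in nbs if 0 <= u < h and 0 <= v < w]

    log_p = np.empty(2)
    for k in range(2):
        sum_phi1 = sum(-1.0 if x[t] == k else 1.0 for t in nbs)
        # Gaussian log-density of y[s] under class k (up to a constant).
        log_f = -0.5 * ((y[i, j] - means[k]) / sigmas[k]) ** 2 - np.log(sigmas[k])
        log_p[k] = -sum_phi1 + log_f        # exponent of (2.3) with phi_2 = 0

    p = np.exp(log_p - log_p.max())         # normalize in a stable way
    p /= p.sum()
    x[i, j] = rng.choice(2, p=p)
```

Sweeping this update over all pixels and iterating yields approximate samples from the posterior distribution of $X$, from which MPM or MAP decisions can be derived.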
2.2 Simple case of Pairwise Markov Field

To circumvent the difficulties above, we propose to consider the Markovianity of $(X,Y)$ directly. Specifically, we put

$$P[X = x, Y = y] = \lambda \exp\Big[-\sum_{(s,t)\ \text{neighbors}} \varphi[(x_s,y_s),(x_t,y_t)] - \sum_{s} \varphi^*[(x_s,y_s)]\Big]$$
$$= \lambda \exp\Big[-\sum_{(s,t)\ \text{neighbors}} \big(\varphi_1(x_s,x_t) + a_{x_s x_t} y_s y_t + b_{x_s x_t} y_s + c_{x_s x_t} y_t\big) - \sum_{s} \big(\varphi_2(x_s) + a_{x_s x_s} y_s^2 + b_{x_s} y_s\big)\Big] \qquad (2.6)$$

The Markovianity of the pairwise field $(X,Y)$ implies the Markovianity of $Y$ conditionally on $X$, and the Markovianity of $X$ conditionally on $Y$. The first property allows one to model textures, as in (2.4), and the second makes it possible to simulate $X$ according to its posterior distribution, which allows us to use Bayesian segmentation methods like MPM or MAP.

Let us briefly specify how to simulate realizations of the pair $(X,Y)$. The pair $(X,Y)$ being Markovian, we can specify the distribution of each $(X_s,Y_s)$ conditionally on its neighbors. Let us consider the calculation of the distribution of $(X_s,Y_s)$ conditional on the four nearest neighbors:

$$[(X_{t_1},Y_{t_1}), (X_{t_2},Y_{t_2}), (X_{t_3},Y_{t_3}), (X_{t_4},Y_{t_4})] = [(x_{t_1},y_{t_1}), (x_{t_2},y_{t_2}), (x_{t_3},y_{t_3}), (x_{t_4},y_{t_4})] \qquad (2.7)$$

This distribution can be written as

$$h(x_s, y_s) = p(x_s) f_{x_s}(y_s) \qquad (2.8)$$

where $p$ is a probability on the set of classes and, for each class $x_s$, $f_{x_s}$ is the density of the distribution of $Y_s$ conditional on $X_s = x_s$ ($p$ and $f_{x_s}$ also depend on $(x_{t_1},y_{t_1}),\dots,(x_{t_4},y_{t_4})$, which are fixed in the following and so will be omitted). (2.8) makes the sampling of $(X_s,Y_s)$ quite easy: one samples $x_s$ according to $p$, and then $y_s$ according to $f_{x_s}$. We have:

$$P\{(X_s,Y_s) = (x_s,y_s) \mid [(X_{t_1},Y_{t_1}),\dots,(X_{t_4},Y_{t_4})] = [(x_{t_1},y_{t_1}),\dots,(x_{t_4},y_{t_4})]\}$$
$$\propto \exp\Big[-\sum_{i=1,\dots,4} \varphi[(x_s,y_s),(x_{t_i},y_{t_i})] - \varphi^*[(x_s,y_s)]\Big]$$
$$= \exp\Big[-\sum_{i=1,\dots,4} \big(\varphi_1(x_s,x_{t_i}) + a_{x_s x_{t_i}} y_s y_{t_i} + b_{x_s x_{t_i}} y_s + c_{x_s x_{t_i}} y_{t_i}\big) - \big(\varphi_2(x_s) + a_{x_s x_s} y_s^2 + b_{x_s} y_s\big)\Big] \qquad (2.9)$$

This can indeed be written as $h(x_s,y_s) = p(x_s) f_{x_s}(y_s)$, where $f_{x_s}$ is a Gaussian density with mean $M_{x_s}$ and variance $\sigma_{x_s}^2$ given by

$$M_{x_s} = -\frac{b_{x_s} + \sum_{i=1,\dots,4} \big(a_{x_s x_{t_i}} y_{t_i} + b_{x_s x_{t_i}}\big)}{2\, a_{x_s x_s}}, \qquad \sigma_{x_s}^2 = \frac{1}{2\, a_{x_s x_s}} \qquad (2.10)$$

and $p$ is the probability on the set of classes given by

$$p(x_s) \propto (a_{x_s x_s})^{-1/2} \exp\Bigg[\frac{\Big(b_{x_s} + \sum_{i=1,\dots,4} \big(a_{x_s x_{t_i}} y_{t_i} + b_{x_s x_{t_i}}\big)\Big)^2}{4\, a_{x_s x_s}} - \varphi_2(x_s) - \sum_{i=1,\dots,4} \big(\varphi_1(x_s,x_{t_i}) + c_{x_s x_{t_i}} y_{t_i}\big)\Bigg] \qquad (2.11)$$
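A minimal sketch (our addition, not the authors' code) of this site-wise sampling step, for two classes; the storage layout, with the singleton coefficients $a_{x_s x_s}$ and $b_{x_s}$ kept in arrays separate from the pairwise ones, is an assumption of the sketch:

```python
import numpy as np

def sample_pair_site(xs_nb, ys_nb, a_pair, b_pair, c_pair,
                     a_single, b_single, phi1, phi2, rng):
    """Sample (x_s, y_s) from the conditional distribution (2.8)-(2.11),
    given the classes xs_nb and observations ys_nb of the four neighbors.

    a_pair[k, l], b_pair[k, l], c_pair[k, l] : a_{x_s x_t}, b_{x_s x_t}, c_{x_s x_t}
    a_single[k], b_single[k] : the paper's a_{x_s x_s} and b_{x_s};
                               a_single[k] must be positive so that
                               f_{x_s} is a proper Gaussian density.
    phi1[k, l], phi2[k]      : the potentials phi_1 and phi_2.
    """
    log_p = np.empty(2)
    B = np.empty(2)
    for k in range(2):
        # B and C collect the terms of (2.9) that are linear in y_s and
        # independent of y_s, respectively.
        B[k] = b_single[k] + sum(a_pair[k, l] * y + b_pair[k, l]
                                 for l, y in zip(xs_nb, ys_nb))
        C = phi2[k] + sum(phi1[k, l] + c_pair[k, l] * y
                          for l, y in zip(xs_nb, ys_nb))
        log_p[k] = -0.5 * np.log(a_single[k]) + B[k] ** 2 / (4 * a_single[k]) - C  # (2.11)

    p = np.exp(log_p - log_p.max())             # stable normalization
    p /= p.sum()
    x_s = int(rng.choice(2, p=p))               # sample x_s from p
    mean = -B[x_s] / (2 * a_single[x_s])        # M_{x_s} of (2.10)
    std = np.sqrt(1.0 / (2 * a_single[x_s]))    # sigma_{x_s} of (2.10)
    y_s = rng.normal(mean, std)                 # then y_s from f_{x_s}
    return x_s, y_s
```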


Finally, the main differences between the classical HMRF model and the new PMRF we propose are:

(i) the distribution of $X$ (its prior distribution) is Markovian in HMRF and is not necessarily Markovian in PMRF;
(ii) the posterior distribution of $X$ is not necessarily Markovian in HMRF and is Markovian in PMRF;
(iii) in the case of images which are textured, and possibly corrupted with correlated noise, the PMRF allows one to apply the Bayesian MPM or MAP methods without any model approximation.

Remarks

1. The classical HMRF can also be applied in the case of multisensor images [YaG95]. For $m$ sensors the observations on each pixel $s \in S$ are then assumed to be a realization of a random vector $Y_s = [Y_s^1,\dots,Y_s^m]$. It is possible to consider a multisensor PMRF. For example, a multisensor PMRF would be obtained by replacing in (2.6) $y_s y_t$ and $y_s^2$ by $\phi^1[(y_s^1,\dots,y_s^m),(y_t^1,\dots,y_t^m)]$ and $\phi^2[(y_s^1,\dots,y_s^m)]$.

2. In some particular cases the classical model allows one to take correlated noise into account [Guy93], [Lee98]. The difficulties arise when one wishes to consider cases where the correlations vary with the classes.

2.3 General case of PMRF

Generalizing the distribution given by (2.6) does not pose any problem. Let us consider $k$ classes $\Omega = \{\omega_1,\dots,\omega_k\}$, $m$ sensors (each $Y_s = (Y_s^1,\dots,Y_s^m)$ takes its values in $\mathbb{R}^m$), and $C$ a set of cliques defined by some neighborhood system. The random field $Z = (Z_s)_{s \in S}$, with $Z_s = (X_s,Y_s)$, is a Pairwise Markov Random Field if its distribution may be written as

$$P[Z = z] = \lambda \exp\Big[-\sum_{c \in C} \varphi_c(z_c)\Big] \qquad (2.12)$$

Let us note that the existence of the distribution (2.12) is not ensured in the general case. In particular, a three-sensor PMRF can be used to segment colour images.

Remark

As mentioned above, $X$ is not necessarily Markovian in a PMRF. This could be felt as a drawback, because the distribution of $X$ models the "prior" knowledge we have about the class image, i.e., the knowledge available without any observation. Of course, this "prior" distribution of $X$ also exists in a PMRF (it is the marginal distribution of $P_{(X,Y)}$), but it is not Markovian. The gravity of this "drawback" is undoubtedly difficult to discuss in the general case. However, supposing that it really is a drawback, let us mention that it also exists in the classical HMRF model: even in the very simple case defined by (2.3), the observation field $Y$ is not a Markov field. So, one could consider that the non-Markovianity of $X$ in the PMRF model is no stranger than the non-Markovianity of $Y$ in the classical HMRF model.

3. Visual examples

We present in this section two simulations and two segmentations by MPM. We consider rather noisy cases: one can hardly distinguish the class image in the noisy one. One can notice that although the class images are not realizations of Markov fields, they look like such.

The two realizations of PMRF presented in Fig. 1 are Markovian with respect to the four nearest neighbors; the distribution of $(X,Y)$ is written:

$$P[X = x, Y = y] = \lambda \exp\Big[-\sum_{(s,t)\ \text{neighbors}} \varphi[(x_s,y_s),(x_t,y_t)] - \sum_{s} \varphi^*[(x_s,y_s)]\Big] \qquad (3.1)$$

with

$$\varphi[(x_s,y_s),(x_t,y_t)] = \frac{1}{2}\big(a_{x_s x_t} y_s y_t + b_{x_s x_t} y_s + c_{x_s x_t} y_t + d_{x_s x_t}\big), \qquad \varphi^*[(x_s,y_s)] = \frac{1}{2}\big(\alpha_{x_s} y_s^2 + \beta_{x_s} y_s + \gamma_{x_s}\big) \qquad (3.2)$$

The different coefficients in (3.2) are given in Tab. 1.
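For completeness, here is a minimal sketch (our addition; gibbs_posterior_sampler is an assumed helper) of the MPM decision rule used for the segmentations of Fig. 1, following the scheme $Nb = n_1 n_2$ described with Tab. 1: the posterior marginals are estimated by counting class occurrences over $n_1$ sampled realizations of $X$ given $Y = y$, each obtained after $n_2$ Gibbs iterations, and each pixel receives the class of highest estimated marginal.

```python
import numpy as np

def mpm_segment(y, gibbs_posterior_sampler, n1=30, n2=30, rng=None):
    """MPM segmentation: estimate the posterior marginals P[X_s = k | Y = y]
    from n1 realizations of X (each obtained after n2 Gibbs iterations),
    then keep, at every pixel, the class of highest estimated marginal.

    gibbs_posterior_sampler(y, n_iter, rng) -> 2-D array of class indices
    is an assumed helper running the Gibbs sampler on the posterior of X.
    """
    rng = rng or np.random.default_rng()
    counts = np.zeros((2,) + y.shape)            # two classes
    for _ in range(n1):
        x = gibbs_posterior_sampler(y, n_iter=n2, rng=rng)
        for k in range(2):
            counts[k] += (x == k)
    return counts.argmax(axis=0)                 # MPM decision pixel by pixel
```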


Let us note that, in order to have an idea of the noise level, it is useful to have some information about the distributions of $Y$ conditional on $X = x$. These are in fact Gaussian distributions, and knowing parameters such as the means and variances provides some information about the noise level. Of course, the noise level also depends on the different correlations and on the prior distribution of $X$.

We may note that some of the coefficients in (3.2) are simply linked with the means and variances of the distributions of $Y$ conditional on $X = x$. In fact, denoting by $\Sigma_x$ the covariance matrix of the Gaussian distribution of $Y$ conditional on $X = x$ and putting $Q_x = [q_{st}^x]_{s,t \in S} = \Sigma_x^{-1}$, we have:

$$P[Y = y \mid X = x] \propto \exp\Big[-\frac{(y - m_x)^{t}\, Q_x\, (y - m_x)}{2}\Big] \qquad (3.3)$$

Developing (3.3) and identifying with (3.2), we obtain

$$m_{x_s} = -\frac{\beta_{x_s}}{2\alpha_{x_s}}, \qquad \sigma_{x_s}^2 = \frac{1}{\alpha_{x_s}} \qquad (3.4)$$

So, all other parameters being fixed, one can use (3.4) to vary the noise level. For instance, keeping the same variances, the noise level increases when one makes the means approach each other. As a check, with $\alpha_{x_s} = 1$ and $\beta_{x_s} = -2m_{x_s}$ as in Tab. 1, (3.4) gives back $m_{x_s}$ and $\sigma_{x_s}^2 = 1$, in accordance with the values of $\sigma_1^2$ and $\sigma_2^2$ reported there. Otherwise, there are no simple links between the correlations of the random variables $(Y_s)$ (conditionally on $X = x$) and the coefficients in (3.2). The correlations in Tab. 1, whose variations produce different textures, are estimated values. The values of the means show that the noise level is rather strong, which is confirmed visually.

Fig. 1. Two realizations of Pairwise Markov Fields (Image 1, Image 2) and (Image 4, Image 5), and the MPM segmentations of Image 2 (giving Image 3) and of Image 5 (giving Image 6), respectively. [Images not reproduced here.]

                     Images 1, 2, 3                        Images 4, 5, 6
a_{x_s}              1                                     1
b_{x_s}              -2 m_{x_s}                            -2 m_{x_s}
g_{x_s}              2 m_{x_s}                             2 m_{x_s}
a_{x_s x_t}          -0.4                                  -0.1
b_{x_s x_t}          -0.4 m_{x_t}                          -0.1 m_{x_t}
c_{x_s x_t}          -0.4 m_{x_s}                          -0.1 m_{x_s}
d_{x_s x_t}          0.4 m_{x_s} m_{x_t} + phi(x_s, x_t)   -0.1 m_{x_s} m_{x_t} + phi(x_s, x_t)
m_1                  -0.3                                  1
m_2                  0.3                                   1.5
sigma_1^2            1                                     1
sigma_2^2            1                                     1
rho_11               0.26                                  0.05
rho_22               0.26                                  0.07
tau                  3.1%                                  7.9%
Nb                   30 x 30                               30 x 30

Tab. 1. First rows ($\alpha_{x_s}, \dots, d_{x_s x_t}$): the functions in (3.2), the function $\varphi$ being defined by $\varphi(x_s,x_t) = -1$ if $x_s = x_t$ and $\varphi(x_s,x_t) = 1$ if $x_s \neq x_t$. $m_1, m_2, \sigma_1^2, \sigma_2^2$: the means and variances in (3.3). $\rho_{11}, \rho_{22}$: the estimated covariances between neighboring pixels within classes $\omega_1$ and $\omega_2$. $\tau$: the error rate of wrongly classified pixels with MPM. $Nb = n_1 n_2$: the number of iterations in MPM (the posterior marginals estimated from $n_1$ realizations, each realization obtained after $n_2$ iterations of the Gibbs sampler).

4. Conclusions

We proposed in this paper a novel model called Pairwise Markov Random Field (PMRF). A random field of classes $X$ and a random field of observations $Y$ form a PMRF when the pairwise random field $Z = (X,Y)$ is a Markov field.
