The uncertainty in these parameter values is reflected in the choice of their prior distributions. For simplicity, one may choose conjugate prior distributions, although non-conjugate priors can also be accommodated (Müller (1991)). Since the Gamma distributions are conjugate priors for a Poisson random variable, and the Dirichlet distributions form a conjugate family for the parameters of a multinomial random variable (see, for example, DeGroot (1970), Chapter 9), the priors in the case that the \(X_{ij}\) follow Poisson distributions could be given as

\[
f(\lambda_{ik}) = \frac{1}{\Gamma(a_{ik})\, b_{ik}^{a_{ik}}}\, \lambda_{ik}^{a_{ik}-1} \exp\!\left(-\frac{\lambda_{ik}}{b_{ik}}\right), \qquad i = 1,\ldots,M, \; k = 1,2, \tag{4}
\]

and

\[
f(\pi_1,\ldots,\pi_N) = \frac{\Gamma(\alpha_0)}{\prod_{l=1}^{N} \Gamma(\alpha_l)} \prod_{i=1}^{N} \pi_i^{\alpha_i - 1}, \tag{5}
\]

where \(\alpha_0 = \sum_{i=1}^{N} \alpha_i\), \(\alpha_i > 0\), \(i = 1,\ldots,N\), and where the \(a_{ik}\)'s, \(b_{ik}\)'s, and \(\alpha_i\)'s are chosen according to the available prior information.

Implementation of the Gibbs sampler to find the marginal posterior distributions requires the specification of the full conditional distribution of the parameters, i.e., the conditional distribution of each parameter given the values of all of the other parameters. These are specified below, following standard procedures for conjugate analyses (DeGroot (1970)). Note that the full conditional distribution of each parameter does not always depend on all of the other parameters, which leads to some further simplifications:

\[
f(\lambda_{i1} \mid X, \tau_i) \sim \mathrm{Gamma}\!\left( a_{i1} + \sum_{j=1}^{\tau_i} x_{ij}, \; \Big( \tau_i + \frac{1}{b_{i1}} \Big)^{-1} \right), \tag{6}
\]

\[
f(\lambda_{i2} \mid X, \tau_i) \sim \mathrm{Gamma}\!\left( a_{i2} + \sum_{j=\tau_i+1}^{N} x_{ij}, \; \Big( N - \tau_i + \frac{1}{b_{i2}} \Big)^{-1} \right), \tag{7}
\]

\[
\Pr\{\tau_i = t \mid \tilde{\lambda}_1, \tilde{\lambda}_2, \tilde{\pi}, x\} =
\frac{\left\{ \prod_{j=1}^{t} \dfrac{(\lambda_{i1})^{x_{ij}} \exp(-\lambda_{i1})}{x_{ij}!} \right\}
\left\{ \prod_{j=t+1}^{N} \dfrac{(\lambda_{i2})^{x_{ij}} \exp(-\lambda_{i2})}{x_{ij}!} \right\} \pi_t}
{\sum_{k=1}^{N} \left\{ \prod_{j=1}^{k} \dfrac{(\lambda_{i1})^{x_{ij}} \exp(-\lambda_{i1})}{x_{ij}!} \right\}
\left\{ \prod_{j=k+1}^{N} \dfrac{(\lambda_{i2})^{x_{ij}} \exp(-\lambda_{i2})}{x_{ij}!} \right\} \pi_k}, \tag{8}
\]

\[
f(\tilde{\pi} \mid \tilde{\tau}) \sim \mathrm{Dirichlet}(\tilde{\alpha}'), \tag{9}
\]

where \(\alpha'_k\), the \(k\)th element of \(\tilde{\alpha}'\), is given by \(\alpha_k + \sum_{i=1}^{M} I_{\{\tau_i = k\}}\), and where \(I_{\{y\}}\) is the indicator function for the set \(\{y\}\).

The Gibbs sampler algorithm proceeds as follows: Starting from arbitrary initial values, a random sample is drawn from each full conditional distribution
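As a concrete illustration of one sweep of this sampler, the sketch below draws in turn from the full conditionals (6)-(9) for all \(M\) sequences. It is a minimal sketch and not the authors' code: the array names (`x`, `tau`, `pi`, `a`, `b`, `alpha`), the \(M \times N\) layout of the counts, and the use of NumPy are assumptions, and the factorial terms in (8) are omitted because they cancel between numerator and denominator.

```python
# One Gibbs sweep for the Poisson change-point model in (4)-(9).
# Assumed (hypothetical) inputs: x is an M x N array of counts, tau an integer
# M-vector of current change-points (values in 1..N), pi the current mixing
# weights (length N), a and b the M x 2 Gamma hyperparameters, alpha the
# length-N Dirichlet hyperparameters.
import numpy as np


def gibbs_sweep(x, tau, pi, a, b, alpha, rng):
    M, N = x.shape
    lam = np.empty((M, 2))

    # (6)-(7): lambda_i1, lambda_i2 | X, tau_i are Gamma draws; the scale
    # parametrisation matches the prior in (4).
    for i in range(M):
        t = tau[i]
        lam[i, 0] = rng.gamma(a[i, 0] + x[i, :t].sum(), 1.0 / (t + 1.0 / b[i, 0]))
        lam[i, 1] = rng.gamma(a[i, 1] + x[i, t:].sum(), 1.0 / (N - t + 1.0 / b[i, 1]))

    # (8): tau_i | lambda, pi, x is discrete on {1, ..., N}; the x_ij! factors
    # cancel in the ratio, so they are dropped here.
    t_grid = np.arange(1, N + 1)
    for i in range(M):
        cum = np.cumsum(x[i])                       # sum_{j<=t} x_ij for t = 1..N
        logp = (cum * np.log(lam[i, 0]) - t_grid * lam[i, 0]
                + (cum[-1] - cum) * np.log(lam[i, 1]) - (N - t_grid) * lam[i, 1]
                + np.log(pi))
        p = np.exp(logp - logp.max())               # stabilise, then normalise
        tau[i] = rng.choice(t_grid, p=p / p.sum())

    # (9): pi | tau is Dirichlet with alpha'_k = alpha_k + #{i : tau_i = k}.
    pi = rng.dirichlet(alpha + np.bincount(tau - 1, minlength=N))

    return lam, tau, pi


# Example usage with simulated data (hypothetical values).
rng = np.random.default_rng(0)
M, N = 5, 20
x = rng.poisson(3.0, size=(M, N))
tau = np.full(M, N // 2)
pi = np.full(N, 1.0 / N)
a, b, alpha = np.ones((M, 2)), np.ones((M, 2)), np.ones(N)
for _ in range(1000):
    lam, tau, pi = gibbs_sweep(x, tau, pi, a, b, alpha, rng)
```

Repeating the sweep many times and discarding an initial burn-in yields draws whose empirical distributions approximate the marginal posteriors of the \(\lambda_{ik}\), \(\tau_i\) and \(\pi\).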
