View - Statistics - University of Washington

\hat{\sigma}^2_\epsilon = \frac{1}{N_0 - 1} \sum_{i \in \bar{B}} \left( Y_i - \left( \hat{C} + \hat{\beta}_{W+1} Y_{i-(W+1)} + \hat{\beta}_W Y_{i-W} + \hat{\beta}_{W-1} Y_{i-(W-1)} + \hat{\beta}_1 Y_{i-1} \right) \right)^2 \quad (4.58)

Substituting this into equation 4.57, we obtain

L(Y_{-B} \mid M, B) = -\frac{N_0}{2} \log(2\pi) - \frac{N_0}{2} \log(\hat{\sigma}^2_\epsilon) - \frac{N_0 - 1}{2} \quad (4.59)

Let L_IND denote the loglikelihood as it would be computed under the assumption that all pixels are independent. As shown in equation 4.60, the independence loglikelihood is quite similar to the dependence loglikelihood, except for the second term.

L_{IND}(Y_{-B} \mid M, B) = -\frac{N_0}{2} \log(2\pi) - \frac{N_0}{2} \log(\hat{\sigma}^2_Y) - \frac{N_0 - 1}{2} \quad (4.60)

We now need to examine the relation between \sigma^2_\epsilon and \sigma^2_Y so that an adjustment term relating equations 4.59 and 4.60 can be found (analogous to equation 4.21). We saw in equations 4.9 and 4.16 that a simple relation exists between these two quantities for the AR(1) case. As mentioned in section 4.1.1, equation 4.16 is valid for any AR(P) model; in particular, it is valid for the RSA model. This means that with the RSA model we can correct the loglikelihood in the same way that it can be corrected with the AR(1) model. I restate equation 4.16 as equation 4.61.

\hat{\sigma}^2_Y = \hat{\sigma}^2_\epsilon / (1 - R^2) \quad (4.61)

The R^2 value in equation 4.61 is from the RSA autoregression model, that is, the least squares regression with Y_{-B} as the response vector and lagged values of Y (corresponding to the four adjacent neighbors preceding each Y_i in raster scan order) as the predictors. Combining equations 4.59, 4.60, and 4.61, we arrive at equation 4.62.
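The relation in equation 4.61 is an algebraic identity of least squares (since R^2 = 1 - SSE/SST), so the adjustment between the two loglikelihoods can be checked numerically. The following sketch illustrates this on a simple lag-1 autoregression standing in for the four-neighbor RSA regression; the series and its coefficient are hypothetical illustration data, not from the original.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in data: a 1-D AR(1) series plays the role of the
# raster-scanned pixel vector Y. The RSA model in the text instead uses
# the four preceding raster-scan neighbors (lags 1, W-1, W, W+1) as
# predictors; the variance identity below holds either way.
n = 2000
y = np.zeros(n)
for i in range(1, n):
    y[i] = 0.6 * y[i - 1] + rng.normal()

# Least squares autoregression: response y[1:], intercept plus lag-1 term.
X = np.column_stack([np.ones(n - 1), y[:-1]])
resp = y[1:]
beta, *_ = np.linalg.lstsq(X, resp, rcond=None)
resid = resp - X @ beta

sse = np.sum(resid ** 2)                     # residual sum of squares
sst = np.sum((resp - resp.mean()) ** 2)      # total sum of squares
r2 = 1.0 - sse / sst

# Equation 4.61: sigma2_Y = sigma2_eps / (1 - R^2), provided both
# variance estimates are normalized by the same divisor (N0 - 1 here).
N0 = n - 1
sigma2_eps = sse / (N0 - 1)
sigma2_Y = sst / (N0 - 1)

# Loglikelihoods as in equations 4.59 and 4.60; their difference is the
# adjustment term implied by combining 4.59-4.61: -(N0/2) * log(1 - R^2).
L_dep = -N0 / 2.0 * np.log(2 * np.pi) - N0 / 2.0 * np.log(sigma2_eps) - (N0 - 1) / 2.0
L_ind = -N0 / 2.0 * np.log(2 * np.pi) - N0 / 2.0 * np.log(sigma2_Y) - (N0 - 1) / 2.0
```

Because log(1 - R^2) is negative, the dependence loglikelihood always exceeds the independence version, with the gap growing as the autoregression explains more variance.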
