
"Frontmatter". In: Analysis of Financial Time Series


MISSING VALUES AND OUTLIERS

Consider, for instance, the case that x_h and x_{h+1} are missing. These missing values are related to {x_{h−p}, ..., x_{h−1}; x_{h+2}, ..., x_{h+p+1}}. We can define a dependent variable y_{h+j} in a similar manner as before to set up a multiple linear regression with parameters x_h and x_{h+1}. The least squares method is then used to obtain estimates of x_h and x_{h+1}. Combining with the specified prior distributions, we have a bivariate normal posterior distribution for (x_h, x_{h+1})′. In Gibbs sampling, this approach draws the consecutive missing values jointly. Second, we can apply the result for a single missing value in Eq. (10.17) multiple times within a Gibbs iteration. Again consider the case of missing x_h and x_{h+1}. We can employ the conditional posterior distributions f(x_h | X, x_{h+1}, φ, σ²) and f(x_{h+1} | X, x_h, φ, σ²) separately. In Gibbs sampling, this means that we draw the missing values one at a time.

Because x_h and x_{h+1} are correlated in a time series, drawing them jointly is preferred in Gibbs sampling. This is particularly so if the number of consecutive missing values is large. Drawing one missing value at a time works well if the number of missing values is small.

Remark: In the previous discussion, we assume h − p ≥ 1 and h + p ≤ n. If h is close to the end points of the sample period, the number of data points available in the linear regression model must be adjusted.

10.6.2 Outlier Detection

Detection of additive outliers in Eq. (10.14) becomes straightforward under the MCMC framework. Except for the case of a patch of additive outliers with similar magnitudes, the simple Gibbs sampler of McCulloch and Tsay (1994) seems to work well; see Justel, Peña, and Tsay (2001).
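The one-at-a-time scheme can be sketched for the simplest case, a Gaussian AR(1) with known φ and σ. Under a flat prior, the conditional posterior of an interior value x_t given its two neighbors is normal with mean φ(x_{t−1} + x_{t+1})/(1 + φ²) and variance σ²/(1 + φ²); this stands in for Eq. (10.17), which is not reproduced here. The series, the parameter values, and the missing positions below are all illustrative assumptions, not taken from the text.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate an AR(1) series x_t = phi * x_{t-1} + a_t (illustrative values)
phi, sigma, n = 0.6, 1.0, 500
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi * x[t - 1] + sigma * rng.standard_normal()

h = 250                       # treat positions h and h+1 as missing
x_true = x[[h, h + 1]].copy()

def draw_single(left, right):
    # Conditional posterior of one interior value in a Gaussian AR(1),
    # flat prior: N(phi*(left+right)/(1+phi^2), sigma^2/(1+phi^2)).
    mean = phi * (left + right) / (1 + phi**2)
    var = sigma**2 / (1 + phi**2)
    return rng.normal(mean, np.sqrt(var))

# Gibbs sampling: draw each missing value one at a time, conditioning on
# the current draw of the other one (the second scheme in the text).
x[h], x[h + 1] = 0.0, 0.0     # crude initialization
draws = []
for it in range(2000):
    x[h] = draw_single(x[h - 1], x[h + 1])
    x[h + 1] = draw_single(x[h], x[h + 2])
    if it >= 500:             # discard burn-in
        draws.append((x[h], x[h + 1]))

post_mean = np.mean(draws, axis=0)
print("true values:", x_true, "posterior means:", post_mean)
```

Drawing (x_h, x_{h+1}) jointly from their bivariate normal posterior would replace the two `draw_single` calls with a single two-dimensional draw, which mixes better when the missing values are consecutive and strongly correlated.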
Again we use an AR model to illustrate the problem. The method applies equally well to other time series models when the Metropolis–Hastings algorithm or the Griddy Gibbs is used to draw values of nonlinear parameters.

Assume that the observed time series is y_t, which may contain some additive outliers whose locations and magnitudes are unknown. We write the model for y_t as

    y_t = δ_t β_t + x_t,   t = 1, ..., n,   (10.18)

where {δ_t} is a sequence of independent Bernoulli random variables such that P(δ_t = 1) = ε and P(δ_t = 0) = 1 − ε, ε is a constant between 0 and 1, {β_t} is a sequence of independent random variables from a given distribution, and x_t is an outlier-free AR(p) time series,

    x_t = φ_0 + φ_1 x_{t−1} + ··· + φ_p x_{t−p} + a_t,

where {a_t} is a Gaussian white noise with mean zero and variance σ². This model seems complicated, but it allows additive outliers to occur at every time point. The chance of being an outlier for each observation is ε.
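The model of Eq. (10.18) is easy to simulate from, which helps make its structure concrete: every observation carries an outlier with probability ε, and the outlier magnitude β_t is drawn from a prescribed distribution. The sketch below uses an AR(1) for x_t; the parameter values and the N(0, 10²) choice for β_t are illustrative assumptions, not values from the text.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulate from the additive-outlier model y_t = delta_t * beta_t + x_t
# of Eq. (10.18), with an AR(1) outlier-free component (p = 1).
n, eps = 300, 0.05                  # epsilon: outlier probability per point
phi0, phi1, sigma = 0.0, 0.7, 1.0   # AR(1) parameters (illustrative)

# Outlier-free series x_t = phi0 + phi1 * x_{t-1} + a_t
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi0 + phi1 * x[t - 1] + sigma * rng.standard_normal()

delta = rng.random(n) < eps          # independent Bernoulli(eps) indicators
beta = rng.normal(0.0, 10.0, n)      # outlier magnitudes (assumed N(0, 10^2))
y = delta * beta + x                 # observed series with additive outliers

print("outliers placed:", int(delta.sum()), "out of", n)
```

In the Gibbs sampler of McCulloch and Tsay (1994), the indicators δ_t and magnitudes β_t become latent variables to be drawn alongside the AR parameters, so the simulation above mirrors the data-generating assumptions the sampler inverts.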
