
Differential & Difference Equations and Applications


BAYESIAN FORECASTING IN UNIVARIATE AUTOREGRESSIVE MODELS WITH NORMAL-GAMMA PRIOR DISTRIBUTION OF UNKNOWN PARAMETERS

IGOR VLADIMIROV AND BEVAN THOMPSON

We consider the problem of computing the mean-square optimal Bayesian predictor in univariate autoregressive models with Gaussian innovations. The unknown coefficients of the model are ascribed a normal-gamma prior distribution, providing a family of conjugate priors. The problem is reduced to calculating the state-space realization matrices of an iterated linear discrete time-invariant system. The system-theoretic solution employs a scalarization technique for computing the power moments of Gaussian random matrices developed recently by the authors.

Copyright © 2006 I. Vladimirov and B. Thompson. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

1. Introduction

We consider the problem of computing the mean-square optimal Bayesian predictor for a univariate time series whose dynamics is described by an autoregressive model with Gaussian innovations [3]. The coefficients of the model are assumed unknown and ascribed a normal-gamma prior distribution [1, page 140], providing a family of conjugate priors.
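To make the conjugacy concrete, the following is a minimal numerical sketch, not taken from the paper: it casts an AR(1) model as a Bayesian linear regression on lagged values and applies the standard normal-gamma conjugate update. The true coefficient 0.6, the sample size, and all hyperparameter choices are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate an AR(1) series y_t = phi*y_{t-1} + e_t with standard normal innovations.
phi, n = 0.6, 2000
y = np.zeros(n)
for t in range(1, n):
    y[t] = phi * y[t - 1] + rng.standard_normal()

# Lagged design matrix for an AR(p) regression (p = 1 here).
p = 1
X = np.column_stack([y[p - k - 1 : n - k - 1] for k in range(p)])
z = y[p:]

# Normal-gamma prior: theta | tau ~ N(m0, (tau*L0)^{-1}), tau ~ Gamma(a0, b0).
m0 = np.zeros(p)
L0 = np.eye(p)
a0, b0 = 1.0, 1.0

# Conjugate posterior update (standard Bayesian linear-regression formulas);
# the posterior is again normal-gamma with the parameters below.
L_n = L0 + X.T @ X
m_n = np.linalg.solve(L_n, L0 @ m0 + X.T @ z)
a_n = a0 + len(z) / 2
b_n = b0 + 0.5 * (z @ z + m0 @ L0 @ m0 - m_n @ L_n @ m_n)
```

The posterior mean `m_n` shrinks the least-squares AR coefficient toward the prior mean `m0`; with this much data it lies close to the true coefficient 0.6, and `b_n / a_n` estimates the innovation variance.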

The problem reduces to computing the power moments of a square random matrix which is expressed affinely in terms of a random vector with a multivariate Student distribution [1, page 139]. The latter is a randomized mixture of Gaussian distributions, thereby allowing us to employ a matrix product scalarization technique developed in [7] for computing the power moments EX^s of Gaussian random matrices X = A + BζC, where ζ is a standard normal random vector and A, B, C are appropriately dimensioned constant matrices.

Note that developing an exact algorithm for the power moment problem, as an alternative to an approximate solution via Monte Carlo simulation, is complicated by the noncommutativity of the matrix algebra, which is surmountable only for special classes of random matrices; a more general class is treated by sophisticated graph-theoretic methods in [8].
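The approximate Monte Carlo baseline mentioned above can be sketched as follows; this is not the paper's exact state-space algorithm, and the matrices A, B, C and the dimensions are arbitrary illustrative choices. With ζ an m-by-1 standard normal column, B of size n-by-m, and C of size 1-by-n, the product BζC is n-by-n, so X = A + BζC is a square random matrix whose power moment E[X^s] can be estimated by averaging matrix powers over samples.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative dimensions and constant matrices (not from the paper).
n, m, s = 3, 2, 4
A = 0.1 * np.eye(n)
B = rng.standard_normal((n, m))
C = rng.standard_normal((1, n))

# Monte Carlo estimate of E[X^s] for X = A + B @ zeta @ C.
samples = 20000
acc_s = np.zeros((n, n))  # accumulator for E[X^s]
acc_1 = np.zeros((n, n))  # accumulator for E[X], which equals A since E[zeta] = 0
for _ in range(samples):
    zeta = rng.standard_normal((m, 1))
    X = A + B @ zeta @ C
    acc_s += np.linalg.matrix_power(X, s)
    acc_1 += X
moment_s = acc_s / samples
moment_1 = acc_1 / samples
```

The estimator converges at the usual O(samples^{-1/2}) Monte Carlo rate, which is precisely what an exact algorithm avoids; the first-moment accumulator serves as a sanity check, since E[X] = A.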

Hindawi Publishing Corporation
Proceedings of the Conference on Differential & Difference Equations and Applications, pp. 1099–1108
