"Frontmatter". In: Analysis of Financial Time Series

… random variables with mean zero and variance σ_v^2, and {u_t} is independent of {v_t}. Monte Carlo techniques are employed to handle the nonlinear evolution of the state transition equation because the whole conditional distribution function of S_t given S_{t-1} is needed for a nonlinear system. Other numerical smoothing methods for nonlinear time series analysis have been considered by Kitagawa (1998) and the references therein. MCMC methods (or computing-intensive numerical methods) are powerful tools for nonlinear time series analysis. Their potential has not been fully explored. However, the assumption of knowing f_t(.) and g_t(.) in model (4.27) may hinder practical use of the proposed method. A possible solution to overcome this limitation is to use nonparametric methods, such as the analyses considered in FAR and NAAR models, to specify f_t(.) and g_t(.) before using nonlinear state-space models.

4.1.9 Neural Networks

A popular topic in modern data analysis is the neural network, which can be classified as a semiparametric method. The literature on neural networks is enormous, and their application spreads over many scientific areas with varying degrees of success; see Section 2 of Ripley (1993) for a list of applications and Section 10 for remarks concerning their application in finance. Cheng and Titterington (1994) provide information on neural networks from a statistical viewpoint. In this subsection, we focus solely on feed-forward neural networks, in which inputs are connected to one or more neurons, or nodes, in the input layer, and these nodes are connected forward to further layers until they reach the output layer. Figure 4.5 shows an example of a simple feed-forward network for univariate time series analysis with one hidden layer. The input layer has two nodes, and the hidden layer has three. The input nodes are connected forward to each and every node in the hidden layer, and these hidden nodes are connected to the single node in the output layer. We call the network a 2-3-1 feed-forward network. More complicated neural networks, including those with feedback connections, have been proposed in the literature, but the feed-forward networks are most relevant to our study.

Figure 4.5. A feed-forward neural network with one hidden layer for univariate time series analysis.
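As a concrete illustration of the 2-3-1 network in Figure 4.5, the following minimal sketch (in Python/NumPy, not taken from the book) computes the network output from two lagged values of the series. The logistic activation for the hidden nodes, the linear output node, and all weight values are illustrative assumptions; in practice the weights would be estimated from the data.

import numpy as np

def logistic(z):
    """Logistic (sigmoid) activation, a common choice for hidden nodes."""
    return 1.0 / (1.0 + np.exp(-z))

def feedforward_2_3_1(x_lags, W_hidden, b_hidden, w_out, b_out):
    """One forward pass of a 2-3-1 feed-forward network.

    x_lags   : shape (2,), the two input nodes, e.g. (x_{t-1}, x_{t-2})
    W_hidden : shape (3, 2), weights from the 2 inputs to the 3 hidden nodes
    b_hidden : shape (3,), hidden-node biases
    w_out    : shape (3,), weights from the hidden nodes to the output node
    b_out    : scalar, output-node bias
    """
    hidden = logistic(W_hidden @ x_lags + b_hidden)  # three hidden activations
    return w_out @ hidden + b_out                    # single linear output node

# Illustrative (made-up) weights; real weights are estimated from the series.
rng = np.random.default_rng(0)
W_hidden = rng.normal(size=(3, 2))
b_hidden = rng.normal(size=3)
w_out = rng.normal(size=3)
b_out = 0.0

x_lags = np.array([0.5, -0.2])  # hypothetical values of x_{t-1} and x_{t-2}
print(feedforward_2_3_1(x_lags, W_hidden, b_hidden, w_out, b_out))

The "2-3-1" label simply counts the nodes per layer: two input nodes, three hidden nodes, and one output node, matching Figure 4.5.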
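The Monte Carlo treatment of the nonlinear state transition equation discussed at the start of this page can also be sketched in code. The bootstrap particle filter below is one standard sequential Monte Carlo scheme; because model (4.27) is not reproduced in this excerpt, the additive-noise form S_t = f(S_{t-1}) + v_t, x_t = g(S_t) + u_t, the Gaussian noise, and the particular f and g used in the demonstration are all assumptions made for illustration only.

import numpy as np

def bootstrap_particle_filter(obs, f, g, sigma_v, sigma_u, n_particles=1000, seed=0):
    """Bootstrap particle filter for an (assumed) additive-noise state-space model:
        S_t = f(S_{t-1}) + v_t,  v_t ~ N(0, sigma_v^2)   (state transition)
        x_t = g(S_t) + u_t,      u_t ~ N(0, sigma_u^2)   (observation)
    Returns the filtered state means E(S_t | x_1, ..., x_t).
    """
    rng = np.random.default_rng(seed)
    particles = rng.normal(0.0, 1.0, n_particles)  # crude prior draw for S_0
    means = []
    for x in obs:
        # Propagate each particle through the (nonlinear) state transition.
        particles = f(particles) + rng.normal(0.0, sigma_v, n_particles)
        # Weight by the Gaussian observation likelihood.
        w = np.exp(-0.5 * ((x - g(particles)) / sigma_u) ** 2)
        w += 1e-300  # guard against an all-zero weight vector
        w /= w.sum()
        # Resample to avoid weight degeneracy (multinomial resampling).
        particles = rng.choice(particles, size=n_particles, p=w)
        means.append(particles.mean())
    return np.array(means)

# Illustrative nonlinear functions and simulated data (not from the book).
f = lambda s: 0.5 * s + 2.0 * np.cos(s)
g = lambda s: 0.3 * s ** 2
rng = np.random.default_rng(1)
states = np.zeros(50)
for t in range(1, 50):
    states[t] = f(states[t - 1]) + rng.normal(0.0, 0.5)
obs = g(states) + rng.normal(0.0, 1.0, 50)
print(bootstrap_particle_filter(obs, f, g, sigma_v=0.5, sigma_u=1.0)[:5])

Propagating the particles through f and reweighting them by the observation likelihood approximates the whole conditional distribution of S_t given the data, which is exactly what the nonlinear case requires.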
