The Development of Neural Network Based System Identification ...

4.2 THE ARTIFICIAL NEURAL NETWORKS

memories of the past activation of the hidden units. The Elman network offers benefits over the NNARX network in that the regression structure and memory dynamics are determined by the network itself [Samarasinghe, 2007]. Thus, it is not necessary to provide time-lagged inputs to the network as in the NNARX network.

Figure 4.6(a) shows the basic Elman network, which resembles a typical feedforward MLP network in terms of its input, hidden and output layers. The external inputs X_j from the measurements are fed to the input layer and propagated forward to the hidden neurons. The outputs of the hidden neurons, v_h, are then sent forward to the output layer to produce the network output predictions, y_i, at the next time step t + 1. The Elman network also contains extra processing units called context units. The context units receive and store the output signals from the hidden neurons with a one-step delay [Pham and Liu, 1993]. At the next time step, the outputs of the context units, x_k, send the delayed hidden neuron outputs back to the hidden neurons. Note that solid lines indicate weights that are allowed to adapt according to an optimisation routine, while dashed lines denote the recurrent connections from the hidden neurons to the context units; these connections are not modifiable, and their strengths are usually fixed at 1.
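The forward dynamics described above can be sketched in a few lines. This is a minimal illustration, not the thesis implementation: the layer sizes are arbitrary and the weights are randomly initialised rather than trained.

```python
import numpy as np

# Minimal sketch of the basic Elman network forward pass.
# Layer sizes and weight names are illustrative only.
rng = np.random.default_rng(0)
n_in, n_hid, n_out = 3, 5, 2

W_in = rng.standard_normal((n_hid, n_in))    # input -> hidden (trainable)
W_ctx = rng.standard_normal((n_hid, n_hid))  # context -> hidden (trainable)
W_out = rng.standard_normal((n_out, n_hid))  # hidden -> output (trainable)

def step(X_j, x_k):
    """One time step: external inputs X_j, context state x_k."""
    v_h = np.tanh(W_in @ X_j + W_ctx @ x_k)  # hidden neuron outputs
    y_i = W_out @ v_h                        # output prediction at t + 1
    # The context units copy the hidden outputs with a one-step delay;
    # this recurrent copy has fixed strength 1 and is not trained.
    return y_i, v_h.copy()

x_k = np.zeros(n_hid)                        # context starts empty
for X_j in rng.standard_normal((4, n_in)):   # a short input sequence
    y_i, x_k = step(X_j, x_k)
```

Only the solid-line weights (W_in, W_ctx, W_out) would be adapted by an optimisation routine; the copy into x_k corresponds to the fixed dashed-line connections.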

Findings from Pham and Liu [1993] suggested that the basic Elman network is unable to identify higher-order linear or non-linear dynamic systems due to insufficient memory in the network. Pham and Liu [1996] proposed a modified version of the Elman network that increases the memory capability of the basic Elman network through the use of self-connections in the context units. Figure 4.6(b) shows the modified version of the basic Elman network, with the self-connections in the context units indicated by the scalar value α. The gradient calculation of the modified Elman network is similar in structure to that of the dynamic back-propagation (DBP) or back-propagation through time (BPTT) algorithms [Pham and Liu, 1996]. This algorithm provides a dynamic trace of gradients in parameter space and enables the network to model dynamic systems of higher order.
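The effect of the self-connections can be seen in the context-unit update rule x_k(t) = α·x_k(t-1) + v_h(t-1): with α = 0 the unit reduces to the basic Elman copy, while 0 < α < 1 accumulates a geometrically decaying trace of all past hidden outputs. A short sketch, with α chosen purely for illustration:

```python
import numpy as np

# Sketch of the modified-Elman context update: self-connections of
# strength alpha give each context unit a decaying memory of all past
# hidden activations, extending the network's memory capability.
alpha = 0.5  # illustrative self-connection strength

def update_context(x_k, v_h, alpha):
    # x_k(t) = alpha * x_k(t-1) + v_h(t-1); alpha = 0 recovers the
    # basic Elman one-step copy.
    return alpha * x_k + v_h

x_k = np.zeros(3)
for v_h in [np.ones(3), np.ones(3), np.ones(3)]:
    x_k = update_context(x_k, v_h, alpha)
# after three steps with constant v_h = 1: 1 + 0.5 + 0.25 = 1.75
```

The geometric trace is what supplies the extra memory the basic network lacks for higher-order dynamics.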

In this work, the modified version of the basic Elman network is used for system identification of the helicopter dynamic system. A further modification has been made to the modified Elman network proposed in Pham and Liu [1993] by reducing the number of
