
ARUP; ISBN: 978-0-9562121-5-3 - CMBBE 2012 - Cardiff University


Fig. 2. Hand characteristic parameters (left and middle) and experiment example (right).

Table 3. Experiments carried out for each subject.

Experiment ID  E1  E2  E3  E4  E5  E6  E7  E8  E9  E10  E11  E12  E13  E14  E15  E16
Bottle         B1  B1  B1  B1  B2  B2  B2  B2  B3  B3   B3   B3   B4   B4   B4   B4
Level          L2  L2  L1  L1  L2  L2  L1  L1  L2  L2   L1   L1   L2   L2   L1   L1
Task           T2  T1  T2  T1  T2  T1  T2  T1  T2  T1   T2   T1   T2   T1   T2   T1

3.2 Design of the Artificial Neural Network

An artificial neural network (ANN) is composed of a large number of simple processing elements, neurons, running in parallel. The system function depends on the network architecture (number of layers and neurons), the number of synaptic connections, and the processing function of each neuron (activation function). The experimental knowledge is stored as the strength of the neuron connections (synaptic weights) by means of a learning process, similarly to what happens in the human brain. The method used to iteratively compute these weights as the network receives new information (inputs and desired outputs, or targets) is called the training algorithm; it is applied to train the network to perform a particular task. The optimum weights are obtained by an optimization technique, usually the minimization of the mean squared error between the network outputs and the targets. The performance of the trained network is then assessed on data not used in the training phase; this is called the test phase.
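As an illustrative sketch (not from the paper), the mean squared error used to score the network in both training and testing can be computed as follows, with made-up output and target values:

```python
import numpy as np

# Hypothetical network outputs and target values (illustrative data only).
outputs = np.array([0.9, 0.2, 0.7])
targets = np.array([1.0, 0.0, 0.5])

# Mean squared error between network outputs and targets,
# the performance measure used in training and in the test phase.
mse = np.mean((outputs - targets) ** 2)  # here: (0.01 + 0.04 + 0.04) / 3 = 0.03
```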

There are two problems usually tackled by ANNs: pattern classification and function approximation. Multiple-layer networks are quite powerful for function approximation, which is the subject of this work. The connectivity also plays an important role in successfully solving the problem (see [18] for detailed information). Given the complex nature of our problem, we decided to use a multi-layer, fully-connected feed-forward network, able to learn complex relationships. The selected training algorithm was the backpropagation technique [18], which is the generalization of the Least Mean Squares algorithm to multiple-layer networks. It minimizes the mean squared error and performs supervised learning, in which a set of examples of proper network behavior (inputs and targets) is provided.
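A minimal sketch of this kind of supervised, backpropagation-based training of a feed-forward network with one hidden layer, using NumPy gradient descent on a toy target function (the task, architecture sizes and learning rate here are illustrative assumptions, not the paper's setup):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy function-approximation task (illustrative only):
# learn y = sin(x) on [-pi, pi].
X = np.linspace(-np.pi, np.pi, 64).reshape(-1, 1)
Y = np.sin(X)

n_hidden = 20
W1 = rng.normal(0, 0.5, (1, n_hidden)); b1 = np.zeros(n_hidden)
W2 = rng.normal(0, 0.5, (n_hidden, 1)); b2 = np.zeros(1)

def forward(X):
    h = np.tanh(X @ W1 + b1)   # hidden layer with tanh activation
    return h, h @ W2 + b2      # linear output layer

mse0 = float(np.mean((forward(X)[1] - Y) ** 2))  # error before training

lr = 0.05
for _ in range(2000):
    h, out = forward(X)
    err = out - Y                           # gradient of the squared error w.r.t. outputs
    # Backpropagation: chain rule from the output layer back to the hidden layer.
    gW2 = h.T @ err / len(X); gb2 = err.mean(0)
    dh = (err @ W2.T) * (1 - h ** 2)        # tanh derivative: 1 - tanh^2
    gW1 = X.T @ dh / len(X); gb1 = dh.mean(0)
    # Gradient-descent weight updates minimizing the mean squared error.
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

mse = float(np.mean((forward(X)[1] - Y) ** 2))  # error after training
```

The training set here plays the role of the "examples of proper network behavior": each update nudges the synaptic weights so the network outputs move toward the targets.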

We developed, trained and tested two different ANNs: 1) a 2-layer feed-forward network with one hidden layer (100 neurons); 2) a 3-layer feed-forward network with two hidden layers (75 and 50 neurons). For each ANN, three different tests were performed:

Test B2: The training set was composed of data from all the subjects and tasks, but only from bottles B1, B3 and B4. The test set was composed of data from all subjects and tasks but only from bottle B2, which was not used for training.

Test S4: The training set was composed of data from all the bottles and tasks, but
