Handbook of air conditioning and refrigeration / Shan K


ENERGY MANAGEMENT AND CONTROL SYSTEMS 5.51

Artificial Neural Networks

● Diagnostics—assistance in identifying solutions to complex technical problems

● Design—assistance in the selection of HVAC&R systems and subsystems

Basics. An artificial neural network (ANN) is a massively interconnected, parallel-processing, dynamic system of interacting processing elements that is in some respects similar to the human brain. The fundamental processing element is called the neuron, which is analogous to the nerve cell in the human brain. The neurons are arranged in layers, and thus a network is formed, as shown in Fig. 5.25. Inputs representing the variables that affect the output of the network are fed forward to each of the neurons in the following layers, with an activation that depends on their weighted sum. Finally, an output can be calculated as a function of the weighted sum of the inputs and an additional factor, the bias.
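The per-neuron computation just described (a weighted sum of the inputs plus a bias, passed through an activation function) can be sketched in a few lines of Python. The sigmoid activation and the sample numbers here are common illustrative choices, not taken from the handbook:

```python
import math

def neuron_output(inputs, weights, bias):
    """One neuron: weighted sum of inputs plus bias, squashed by a sigmoid."""
    weighted_sum = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1.0 / (1.0 + math.exp(-weighted_sum))

# With zero net input the sigmoid returns its midpoint, 0.5
print(neuron_output([0.0, 0.0], [1.0, -1.0], 0.0))
```

Each neuron in a hidden or output layer performs this same computation on the activations arriving from the previous layer.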

The ability to learn is one of the outstanding characteristics of an ANN. The weights of the inputs are adjusted to produce a predicted output within specified errors. ANNs have been increasingly used in recent years to predict or to improve nonlinear system performance in HVAC&R. An ANN system is characterized by its net topology, neuron activation transfer, and learning method.
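The weight adjustment mentioned above can be illustrated with a minimal delta-rule update, a common textbook learning step rather than the handbook's specific method; the learning rate of 0.1 is an arbitrary assumption:

```python
def update_weights(weights, inputs, target, predicted, learning_rate=0.1):
    """Delta-rule sketch: nudge each weight in proportion to the prediction error."""
    error = target - predicted
    return [w + learning_rate * error * x for w, x in zip(weights, inputs)]

# An error of 0.4 on unit inputs raises each weight by 0.1 * 0.4 = 0.04
print(update_weights([0.5, 0.5], [1.0, 1.0], target=1.0, predicted=0.6))
```

Repeating such updates over many training samples drives the predicted output toward the target within the specified error.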

Net Topology. The structure of the network of an ANN, or net topology, depends on the data flow mode, the number of layers, and the number of hidden neurons.

● According to Miller and Seem (1991), there are two types of data flow modes: state and feed-forward models. In state models, all the neurons are connected to all the other neurons. In feed-forward models, the neurons are connected between layers, as shown in Fig. 5.25a, and the information flows from one layer to the next. Feed-forward models are the most popular and the most often analyzed.

● In an ANN, there is always an input layer with the number of inputs equal to the number of parameters (variables) that affect the output.

● There may be one or more hidden layers of neurons next to the input layer. The selection of the number of hidden layers and the number of neurons in each hidden layer remains an art. Curtiss et al. (1996) noted that too many hidden layers and hidden neurons cause the network to memorize the data rather than learn from them. The hidden layers and hidden neurons must be sufficient to meet the requirements of the learning process for more complex nonlinear systems; at the same time, more hidden layers and hidden units require more calculation and become a computational burden.

Among the 10 papers on developed ANNs published from 1993 to 1996 in ASHRAE Transactions, most use only one hidden layer, some use two, and only one uses three hidden layers; none exceeds three. Kawashima (1994) recommended that a single hidden layer is sufficient for load prediction.

● If the relationship between the inputs and output is more complex, i.e., nonlinear, and more inputs are involved, then more neurons are needed in each hidden layer. Kawashima (1994) also suggested that the number of neurons in each hidden layer exceed 2m + 1, where m is the number of inputs.

● There is always an output layer next to the hidden layer(s). For simplicity, it is preferable to have one neuron (a single output) in the output layer; there may be two or more neurons for multiple outputs.
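Taken together, the topology guidelines above (m inputs, one hidden layer sized by Kawashima's 2m + 1 rule of thumb, a single-neuron output layer) can be sketched as a small feed-forward pass. The random weight initialization and sigmoid activation are illustrative assumptions, not prescriptions from the handbook:

```python
import math
import random

def sigmoid(s):
    return 1.0 / (1.0 + math.exp(-s))

def make_layer(n_inputs, n_neurons, rng):
    """Each neuron holds one weight per incoming connection, plus a bias."""
    return [([rng.uniform(-1.0, 1.0) for _ in range(n_inputs)],
             rng.uniform(-1.0, 1.0))
            for _ in range(n_neurons)]

def feed_forward(layers, inputs):
    """Propagate activations layer by layer from input to output."""
    activations = inputs
    for layer in layers:
        activations = [sigmoid(sum(a * w for a, w in zip(activations, weights)) + bias)
                       for weights, bias in layer]
    return activations

m = 3                                        # number of input variables
rng = random.Random(0)
network = [make_layer(m, 2 * m + 1, rng),    # one hidden layer with 2m + 1 neurons
           make_layer(2 * m + 1, 1, rng)]    # single-neuron output layer
print(feed_forward(network, [0.2, 0.5, 0.8]))
```

Before training, the single output is an arbitrary value between 0 and 1; the learning method then adjusts the weights and biases toward the desired input-output relationship.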

Neuron Activation Transfer. In Miller and Seem (1991) and Curtiss et al. (1996), for each neuron in the hidden and output layers:

1. The input activations to a neuron in the first hidden layer h, denoted by i_1n, can be
