
Delta-Bar-Delta
and
Extended Delta-Bar-Delta

Michael Young
Terrence Knox

1


Brief Overview of Neural Networks

- Inspired by biological neural networks
- Neurons are grouped into layers
- Layers are connected through a series of weights to create a network

2


Back-Propagation Review

- We want to modify the weights so that each input is affected enough to produce the desired output
- The neuron's activation function must be differentiable; for the sigmoid s:
  s'(x) = s(x)(1 - s(x))
- Propagate the error backward through the network, adjusting the weights accordingly

3
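The sigmoid derivative identity above is easy to check numerically; a minimal sketch (function names are illustrative):

```python
import math

def sigmoid(x):
    # Logistic activation: s(x) = 1 / (1 + e^(-x))
    return 1.0 / (1.0 + math.exp(-x))

def sigmoid_derivative(x):
    # The identity s'(x) = s(x) * (1 - s(x)) makes back-propagation cheap:
    # the derivative is computed from the already-available activation value.
    s = sigmoid(x)
    return s * (1.0 - s)
```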


Back-Prop Algorithm

4
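The algorithm itself is shown as a figure on this slide; as a rough sketch, back-propagation for a one-hidden-layer sigmoid network can be written as below (the network size, variable names, and squared-error loss are illustrative assumptions, not taken from the slides):

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def forward(w_h, w_o, x):
    # 2-input, 2-hidden, 1-output network; the last entry of each
    # weight row is a bias fed by a constant 1.0 input.
    xi = list(x) + [1.0]
    h = [sigmoid(sum(w * v for w, v in zip(row, xi))) for row in w_h]
    hi = h + [1.0]
    y = sigmoid(sum(w * v for w, v in zip(w_o, hi)))
    return xi, h, hi, y

def backprop(w_h, w_o, x, t):
    # Return dE/dw for E = 0.5 * (t - y)^2 by propagating the error
    # backward: output delta first, then hidden deltas through w_o.
    xi, h, hi, y = forward(w_h, w_o, x)
    d_o = -(t - y) * y * (1.0 - y)                        # output-layer delta
    g_o = [d_o * v for v in hi]                           # gradients for w_o
    d_h = [d_o * w_o[j] * h[j] * (1.0 - h[j]) for j in range(len(h))]
    g_h = [[d_h[j] * v for v in xi] for j in range(len(h))]
    return g_h, g_o
```

A gradient-descent trainer would then subtract each gradient, scaled by a learning rate, from the corresponding weight.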


Back-Prop Cont'd

- May converge to a local minimum
- May take a long time to converge
- Might not converge at all

5


Introducing Delta-Bar-Delta

- Improves the convergence rate of the weights
- Each weight has its own learning rate
- For each weight, the current gradient is computed and compared to the previous gradient
- The learning rate adapts dynamically

6


DBD Algorithm

- Weight update:
  w(t + 1) = w(t) + α(t)δ(t)
- α(t): learning rate applied to each weight
- δ(t): output error
- γ(t): error gradient
- κ: incrementing factor
- φ: decrementing factor

7


DBD Learning Rate Update

- Learning rates are incremented linearly, which prevents them from growing explosively
- Learning rates are decremented exponentially, so they stay positive and large rates shrink rapidly
- Weights are then updated normally using the new learning rates

8
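The increment/decrement rules above can be sketched per weight. This is a hedged reading of Jacobs' rule: the default values of κ, φ, and the averaging factor θ, and the exact ordering of the rate update and the weight update, are assumptions for illustration.

```python
def dbd_step(w, alpha, bar_delta, grad, kappa=0.001, phi=0.1, theta=0.7):
    # One Delta-Bar-Delta update for a single weight.
    # grad:      current error gradient dE/dw (the "delta")
    # bar_delta: exponential average of past gradients (the "bar-delta")
    if bar_delta * grad > 0:
        alpha += kappa              # same sign as recent history: linear increment
    elif bar_delta * grad < 0:
        alpha -= phi * alpha        # sign flip: exponential decrement, stays positive
    w -= alpha * grad               # gradient-descent step with the updated rate
    bar_delta = (1.0 - theta) * grad + theta * bar_delta
    return w, alpha, bar_delta
```

Comparing the current gradient against the averaged past gradients, rather than just the single previous one, makes the rate adaptation less sensitive to noise.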


How does DBD compare?

- R. Jacobs' empirical study
- Compared steepest (gradient) descent, momentum, Delta-Bar-Delta, and a hybrid (momentum/DBD)
- 25 simulations were run on each of 4 tasks
- Weights were initialized in (-0.5, 0.5)

9


Comparison of Algorithms

10


Simulation Results

11


Why does DBD perform best?

- Where the weight space has high curvature, the DBD step along the gradient is shorter
- The new point therefore remains near the minimum instead of overshooting it
- A similar argument applies to flat regions of the weight space

12


Extended Delta-Bar-Delta

- Similar to standard Delta-Bar-Delta, but adds a momentum coefficient
- Momentum allows previous weight changes to influence future weight changes
- Can help the network escape a local minimum instead of being trapped, allowing a better end result

13
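The momentum idea can be sketched as a single weight update. This is a minimal illustration only: in full Extended Delta-Bar-Delta, the learning rate α and the momentum coefficient μ are both adapted per weight over time, which is omitted here.

```python
def edbd_weight_step(w, prev_dw, alpha, mu, grad):
    # Gradient-descent step plus a momentum term: the previous weight
    # change (prev_dw) carries over, scaled by the momentum coefficient mu.
    dw = -alpha * grad + mu * prev_dw
    return w + dw, dw
```

Because `dw` retains a fraction of the previous change, a sequence of updates can carry the weight through a shallow local minimum rather than stopping in it.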


Delta-Bar-Delta Application
Capacitor Banks Switching Overvoltages

- Installing a shunt capacitor bank is the most practical and efficient way to supply the demanded reactive power
- Switching creates an overvoltage whose phase-to-earth values can reach on the order of 2-3 per unit
- An ANN is used to predict the maximum peak overvoltage of capacitor bank (CB) switching in minimal computational time

14


Capacitor Banks Switching
Neural Network Configuration

- Single hidden layer
- Single output value (overvoltage peak)
- Delta-Bar-Delta, Extended Delta-Bar-Delta, and directed random search used for training
- Hyperbolic tangent used as the activation function

Above: Neural network structure

15
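The configuration above can be sketched as a forward pass. The layer sizes, bias handling, and the choice of output activation here are assumptions for illustration; the paper's exact setup may differ.

```python
import math

def forward_pass(x, w_hidden, b_hidden, w_out, b_out):
    # One tanh hidden layer, then a single tanh output unit
    # (the predicted overvoltage peak).
    h = [math.tanh(sum(wi * xi for wi, xi in zip(row, x)) + b)
         for row, b in zip(w_hidden, b_hidden)]
    return math.tanh(sum(w * hj for w, hj in zip(w_out, h)) + b_out)
```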


Delta-Bar-Delta Application
Input Parameters

- Parameters:
  ○ Voltage at capacitor bus before switching
  ○ Equivalent resistance of the circuit
  ○ Equivalent inductance of the circuit
  ○ Equivalent capacitance of the circuit
  ○ Line length
  ○ Closing time of the circuit breaker poles
  ○ Capacitor bank capacity

16


Capacitor Banks Switching
Training Circuit

Above: Sample system for CB study
Above: Voltage at bus 2 after CB switching

- The system shown is the only circuit used to train the neural network
- The NN applies to all circuits by converting a new system to an equivalent system

17


Capacitor Banks Switching
Case 1

Above: Training system
Above: System used for case 1

18


Capacitor Banks Switching
Case 2

Above: Training system
Above: System used for case 2

19


References

[1] I. Sadeghkhani, A. Ketabi, R. Feuillet, "Delta-Bar-Delta and Directed Random Search Algorithms to Study Capacitor Banks Switching Overvoltages," Serbian Journal of Electrical Engineering, Vol. 9, No. 2, June 2012.

[2] R. A. Jacobs, "Increased Rates of Convergence Through Learning Rate Adaptation," Technical Report COINS TR 87-117, University of Massachusetts at Amherst, Dept. of Computer and Information Science, Amherst, MA, 1987.

[3] S. Russell, P. Norvig, Artificial Intelligence: A Modern Approach, Third Edition.
20
