JOURNAL OF COMPUTERS, VOL. 8, NO. 6, JUNE 2013
$$
\begin{bmatrix}
t_{11} & t_{12} & \cdots & t_{1k} \\
t_{21} & t_{22} & \cdots & t_{2k} \\
\vdots & \vdots & \ddots & \vdots \\
t_{m1} & t_{m2} & \cdots & t_{mk}
\end{bmatrix}
=
\begin{bmatrix}
x_{11} & x_{12} & \cdots & x_{1n} \\
x_{21} & x_{22} & \cdots & x_{2n} \\
\vdots & \vdots & \ddots & \vdots \\
x_{m1} & x_{m2} & \cdots & x_{mn}
\end{bmatrix}
\cdot
\begin{bmatrix}
p_{11} & p_{12} & \cdots & p_{1k} \\
p_{21} & p_{22} & \cdots & p_{2k} \\
\vdots & \vdots & \ddots & \vdots \\
p_{n1} & p_{n2} & \cdots & p_{nk}
\end{bmatrix}
\quad (2)
$$
Selecting a reduced subset of the PC space results in a reduced-dimension structure that retains the important information, as shown in the following expression:
$$
\begin{bmatrix} t_{1} & t_{2} & \cdots & t_{k} \end{bmatrix}
=
\begin{bmatrix} x_{1} & x_{2} & \cdots & x_{n} \end{bmatrix}
\cdot
\begin{bmatrix}
p_{11} & p_{12} & \cdots & p_{1k} \\
p_{21} & p_{22} & \cdots & p_{2k} \\
\vdots & \vdots & \ddots & \vdots \\
p_{n1} & p_{n2} & \cdots & p_{nk}
\end{bmatrix}
\quad (3)
$$
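As a concrete illustration of Equations (2) and (3), the projection onto a reduced set of principal components can be sketched in a few lines of NumPy. The data, sizes, and choice of k below are illustrative, not taken from the paper:

```python
import numpy as np

# Illustrative sketch: compute the score matrix T = X . P for the first
# k principal components, as in Equations (2) and (3).
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 8))   # m samples x n variables (made up)
Xc = X - X.mean(axis=0)             # PCA assumes mean-centered data

# Loadings P: right singular vectors of the centered data, one column per PC
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
k = 3                               # keep k < n components
P = Vt[:k].T                        # n x k loading matrix

T = Xc @ P                          # m x k score matrix in the reduced space
print(T.shape)                      # (100, 3)
```

A single sample row projects the same way, `t = x_centered @ P`, which is exactly the row form of Equation (3).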
D. Artificial Neural Network
An ANN is a computer model whose architecture essentially mimics the knowledge-acquisition and organizational skills of the human brain. Although there are a variety of ways to construct these models, the back-propagation (BP) neural network has become one of the most widely used ANNs in practice. A BP neural network with a single hidden layer is selected in this paper, since it has been demonstrated to be sufficient to approximate any continuous function to the desired accuracy [20]. Figure 10 shows a diagram of a neural network with a single hidden layer.
The goal of training the ANN is to minimize the error between the predicted and target values by adjusting the connection weights and biases. The error is given by Equation (6):

$$
E = \sum_{p=1}^{P} \sum_{q=1}^{Q} \left( a_{pq} - o_{pq} \right)^{2} \quad (6)
$$

where Q is the number of logic units in the output layer, P is the number of training samples, and $a_{pq}$ and $o_{pq}$ are the predicted and target values, respectively.
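A minimal sketch of Equation (6), using small illustrative arrays rather than the paper's data:

```python
import numpy as np

# Sum-of-squared-errors training criterion of Equation (6): sum over all
# training samples p and output units q of (a_pq - o_pq)^2.
a = np.array([[0.9, 0.1],
              [0.2, 0.8]])   # predicted values, samples x output units (made up)
o = np.array([[1.0, 0.0],
              [0.0, 1.0]])   # target values
E = np.sum((a - o) ** 2)     # Equation (6)
print(E)
```

Back-propagation adjusts the weights and biases along the negative gradient of this E.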
E. Multi-layer Neural Network<br />
A new method named as multi-layer neural network is<br />
proposed to diagnose all open-circuit fault modes under<br />
consideration for the NPC <strong>in</strong>verter, as shown <strong>in</strong> Figure 11.<br />
Figure 11. Multi-layer neural network: the Main Feature feeds the main ANN; Feature A feeds auxiliary ANN A, whose output distinguishes $S_{a2}$ from $\{S_{a1}, S_{a2}\}$; Feature B feeds auxiliary ANN B, whose output distinguishes $S_{a3}$ from $\{S_{a3}, S_{a4}\}$.
Figure 10. Neural network with a single hidden layer (inputs $x_1, x_2, \dots, x_n$; $n$, $h$, and $q$ logic units in the input, hidden, and output layers; outputs $y_1, y_2, \dots, y_q$).
The three layers are called the input layer, hidden layer and output layer, respectively. Each layer consists of logic units, or neurons, which are the basic information-processing units in an ANN. The relationship between the input value of unit $i$ in the input layer and that of unit $j$ in the hidden layer is:
$$
u_{j} = \sum_{i=1}^{n} \omega_{ji} x_{i} + b_{j} \quad (4)
$$
where $x_i$ is an input value of logic unit $i$ in the input layer, $u_j$ an initial output value of logic unit $j$ in the hidden layer, $\omega_{ji}$ the connection weight between units $j$ and $i$, $b_j$ the input bias of unit $j$, and $n$ the number of logic units in the input layer.
The initial output value $u_j$ is further transformed with the common transfer function in sigmoid form:

$$
O_{j} = \frac{1}{1 + e^{-u_{j}}} \quad (5)
$$

where $O_j$ is the final output value of logic unit $j$.
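Equations (4) and (5) amount to one affine map followed by an element-wise sigmoid. A minimal sketch with illustrative (untrained) weights, not the paper's values:

```python
import numpy as np

# Forward pass of one hidden layer per Equations (4) and (5):
# u_j = sum_i(w_ji * x_i) + b_j, then O_j = 1 / (1 + exp(-u_j)).
n, h = 4, 3                        # input and hidden layer sizes (made up)
rng = np.random.default_rng(1)
w = rng.standard_normal((h, n))    # w[j, i]: weight from input i to hidden j
b = rng.standard_normal(h)         # b[j]: input bias of hidden unit j
x = rng.standard_normal(n)         # one input sample

u = w @ x + b                      # Equation (4), initial output values
O = 1.0 / (1.0 + np.exp(-u))       # Equation (5), sigmoid transfer
print(O)                           # each value lies in (0, 1)
```

The same affine-plus-sigmoid step is repeated from the hidden layer to the output layer to produce the network's predictions.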
TABLE I.
FAULT MODES AND OUTPUT OF MAIN ANN

Fault modes (open-circuit)        Target output
Fault free                        000000
S_a1                              100000
S_a2 or {S_a1, S_a2}              010000
S_a3 or {S_a3, S_a4}              001000
S_a4                              000100
D_a5                              000010
D_a6                              000001
{S_a1, S_a3}                      101000
{S_a1, S_a4}                      100100
{S_a2, S_a3}                      011000
{S_a2, S_a4}                      010100
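A small lookup, sketched here as an illustration rather than the paper's code, shows how a thresholded six-bit main-ANN output maps back to the fault modes of Table I:

```python
# Map the main ANN's six-bit output code to the fault modes of Table I.
# The 0.5 threshold is an assumed decision rule, not stated in the paper.
FAULT_MODES = {
    "000000": "fault free",
    "100000": "Sa1",
    "010000": "Sa2 or {Sa1, Sa2}",
    "001000": "Sa3 or {Sa3, Sa4}",
    "000100": "Sa4",
    "000010": "Da5",
    "000001": "Da6",
    "101000": "{Sa1, Sa3}",
    "100100": "{Sa1, Sa4}",
    "011000": "{Sa2, Sa3}",
    "010100": "{Sa2, Sa4}",
}

def decode_main_ann(outputs):
    """Threshold the six real-valued outputs at 0.5 and look up the code."""
    code = "".join("1" if y > 0.5 else "0" for y in outputs)
    return FAULT_MODES.get(code, "unknown code")

print(decode_main_ann([0.05, 0.97, 0.02, 0.01, 0.03, 0.10]))
# prints: Sa2 or {Sa1, Sa2}
```

The codes 010000 and 001000 are ambiguous by design; the auxiliary ANNs then separate the single-switch fault from the double-switch fault, as in Table II.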
TABLE II.
FAULT MODES AND OUTPUT OF AUXILIARY ANN A

Fault modes (open-circuit)        Target output
S_a2                              0
{S_a1, S_a2}                      1
The Main Feature, extracted from the bridge voltage $V_{ao}$, is used as the input data for the main ANN, which diagnoses the eleven fault modes listed in Table I (including the fault-free mode), while Feature A and Feature B, extracted from the upper bridge voltage $V_{auo}$ and the lower bridge voltage $V_{ado}$, are used as the input data for auxiliary ANNs A and B, respectively. Table II and Table
© 2013 ACADEMY PUBLISHER