

If node $k$ is not in the output layer (that is, $k$ is a hidden node whose connection weights affect the nodes in the layer above it), then $\delta_k$ can be computed by the following formula:

$$\delta_k = \frac{\partial e}{\partial net_k} = \frac{\partial e}{\partial O_k}\frac{\partial O_k}{\partial net_k} = f'(net_k)\frac{\partial e}{\partial O_k} \qquad (9)$$

That is,

$$\frac{\partial e}{\partial O_k} = \sum_m \delta_m w_{km} \qquad (10)$$

Thus

$$\delta_k = f'(net_k)\sum_m \delta_m w_{km} \qquad (11)$$

The formula shows that the $\delta$ of a lower layer can be computed from the $\delta$ of the layer above it.
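As a minimal sketch of this recurrence, assuming a sigmoid activation (the paper does not fix one) and illustrative variable names:

```python
import numpy as np

def sigmoid_prime(net):
    """Derivative f'(net) of the sigmoid activation f(net) = 1 / (1 + exp(-net))."""
    s = 1.0 / (1.0 + np.exp(-net))
    return s * (1.0 - s)

def hidden_deltas(net_hidden, w_to_upper, delta_upper):
    """Formula (11): delta_k = f'(net_k) * sum_m delta_m * w_km.

    net_hidden:  net inputs of the hidden layer, shape (K,)
    w_to_upper:  weights w_km from hidden unit k to upper unit m, shape (K, M)
    delta_upper: deltas already computed for the upper layer, shape (M,)
    """
    return sigmoid_prime(net_hidden) * (w_to_upper @ delta_upper)
```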

The learning process of a BP network begins from a set of random weights and thresholds; any selected sample can be used as input, and the corresponding output is computed by forward propagation. At first this error is usually large, so new weights and thresholds for the pattern must be computed by back propagation. For all of the samples, this process is repeated again and again until the appointed accuracy is reached. During network operation, the system error and the error of each single pattern can be tracked. If the network learns successfully, the system error decreases as the number of iterations increases, and finally converges to a set of steady weights and thresholds. [7]

C. The Learning Algorithm of the Back Propagation Network

In the BP network model, the learning algorithm of the BP network can be described by the following rules.

Step 1 Initialize the learning parameters and the BP network parameters. That is, set random numbers in $[-1, 1]$ for the neuron thresholds and the connection weights of the hidden layers and the output layer.

Step 2 Present a training pattern to the BP network. That is, select a training pattern from the training set, and supply its input pattern and expected output pattern to the BP network.

Step 3 Forward propagation. That is, compute the output pattern of the network, starting from the first hidden layer, for the given input. If the error exceeds the allowed tolerance, execute Step 4; otherwise return to Step 2 and provide the next training pattern to the algorithm.

Step 4 Back propagation. That is, correct the connection weight of every unit in each layer, from the output layer down to the first hidden layer, following these rules:

1) Compute the error $\delta_k$ of each unit in the same layer.

2) Correct the connection weights and thresholds. For the connection weights, the correcting formula is:

$$w_{jk}(t+1) = w_{jk}(t) + \eta\,\delta_k O_j \qquad (12)$$

For the thresholds, the correction method is the same as the learning method for the connection weights.

3) Repeat the above-mentioned correcting process until the expected output pattern is obtained.

Step 5 Return to Step 2, and perform Steps 2 and 3 for every training pattern in the training set, until every training pattern meets its expected output, as sketched below.
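A minimal end-to-end sketch of Steps 1 through 5, assuming a single hidden layer, a sigmoid activation, and an illustrative XOR training set (none of which are specified in the paper; all names are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Step 1: initialize weights and thresholds with random numbers in [-1, 1].
n_in, n_hidden, n_out = 2, 4, 1
W1 = rng.uniform(-1, 1, (n_in, n_hidden));  b1 = rng.uniform(-1, 1, n_hidden)
W2 = rng.uniform(-1, 1, (n_hidden, n_out)); b2 = rng.uniform(-1, 1, n_out)

# Illustrative training set (XOR); eta is the learning rate of (12).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)
eta, tol = 0.5, 1e-3

for epoch in range(20000):            # Step 5: repeat over the training set
    total_error = 0.0
    for x, t in zip(X, T):            # Step 2: present one training pattern
        # Step 3: forward propagation from the first hidden layer.
        net1 = x @ W1 + b1;  o1 = sigmoid(net1)
        net2 = o1 @ W2 + b2; o2 = sigmoid(net2)
        total_error += 0.5 * np.sum((t - o2) ** 2)
        # Step 4: back propagation -- deltas via (9) and (11),
        # then the weight correction of (12): w(t+1) = w(t) + eta*delta*O.
        delta2 = (t - o2) * o2 * (1 - o2)        # output-layer delta, (9)
        delta1 = o1 * (1 - o1) * (W2 @ delta2)   # hidden-layer delta, (11)
        W2 += eta * np.outer(o1, delta2); b2 += eta * delta2
        W1 += eta * np.outer(x, delta1);  b1 += eta * delta1
    if total_error < tol:             # stop at the appointed accuracy
        break
```

The thresholds `b1` and `b2` are corrected by the same rule as the weights, as Step 4 prescribes.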

III. THE PRINCIPLE OF BACK PROPAGATION NEURAL NETWORK MODEL

A. The Basic Principle of Fuzzy Mathematics

Assume that $X$ represents a set of objects, called the universe of discourse. A subset $A$ of $X$ can be expressed by its characteristic function:

$$\mu_A(x) = \begin{cases} 1, & x \in A \\ 0, & x \notin A \end{cases} \qquad (13)$$

Here $\mu_A$ is a function defined on $X$ whose values belong to $\{0, 1\}$; it is called the characteristic function of $A$. For $x \in X$, if $\mu_A(x) = 1$, then $x$ is an element of $A$; but if $\mu_A(x) = 0$, then $x$ is not an element of $A$. On this basis we can define fuzzy sets:

In the universe of discourse $X$, if for every element $x \in X$ there is a corresponding real function $\mu_A(x)$:

$$\mu_A: X \to [0,1], \quad x \mapsto \mu_A(x) \qquad (14)$$

then all elements $x$ satisfying this mapping form a fuzzy set $A$ in $X$. For $x \in X$, $\mu_A$ is the membership function of $A$, and $\mu_A(x)$ is called the membership degree of $x$ in $A$. [6]
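To make the contrast between (13) and (14) concrete, here is a small sketch; the triangular membership function is an invented example, not one used in the paper:

```python
def mu_crisp(x, A):
    """Characteristic function (13): 1 if x is in the ordinary set A, else 0."""
    return 1.0 if x in A else 0.0

def mu_triangular(x, a=0.0, b=1.0, c=2.0):
    """An illustrative triangular membership function (14): X -> [0, 1].

    Membership rises linearly from a to the peak at b, then falls back to 0 at c."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

print(mu_crisp(3, {1, 2, 3}))    # 1.0 -- x is an element of A
print(mu_triangular(0.5))        # 0.5 -- a partial membership degree
```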

A relationship that expresses an uncertain relation using fuzzy sets is defined as a fuzzy relation. [8] A fuzzy relation $R$ between sets $X$ and $Y$ is a fuzzy subset defined on $X \times Y$, whose membership function is:

$$\mu_R: X \times Y \to [0,1] \qquad (15)$$

If $X$ is the same as $Y$, then $R$ is called a fuzzy relation on $X$.

If the universe of discourse is the product $X_1 \times X_2 \times \cdots \times X_n$ of $n$ sets $X_i$ $(i = 1, 2, \ldots, n)$, the corresponding fuzzy relation $R$ is called an $n$-dimensional fuzzy relation.

If $X$ and $Y$ are both finite sets, $X = \{x_1, x_2, \ldots, x_m\}$ and $Y = \{y_1, y_2, \ldots, y_n\}$, then the fuzzy relation $R$ on $X \times Y$ can be expressed by:

$$R = \begin{bmatrix} \mu_R(x_1, y_1) & \mu_R(x_1, y_2) & \cdots & \mu_R(x_1, y_n) \\ \mu_R(x_2, y_1) & \mu_R(x_2, y_2) & \cdots & \mu_R(x_2, y_n) \\ \vdots & \vdots & & \vdots \\ \mu_R(x_m, y_1) & \mu_R(x_m, y_2) & \cdots & \mu_R(x_m, y_n) \end{bmatrix} \qquad (16)$$

The above matrix is called a fuzzy matrix; each of its elements $\mu_R(x_i, y_j)$ lies between 0 and 1.
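As a small sketch of (16), the following builds the fuzzy matrix for an invented relation "x is close to y"; the particular choice of $\mu_R$ is an assumption for illustration only:

```python
import numpy as np

def fuzzy_relation_matrix(X, Y, mu_R):
    """Build the m x n fuzzy matrix of (16): entry (i, j) is mu_R(x_i, y_j)."""
    return np.array([[mu_R(x, y) for y in Y] for x in X])

# An illustrative membership function for "x is close to y".
mu_close = lambda x, y: 1.0 / (1.0 + abs(x - y))

X = [1.0, 2.0, 3.0]
Y = [1.0, 3.0]
R = fuzzy_relation_matrix(X, Y, mu_close)
print(R)   # every entry lies between 0 and 1
```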

