

B. Traffic Flow Volterra Neural Network Model

Comparing the traffic flow Volterra series model in equation (6) with the three-layer BP neural network in equation (12), if the input vector of equation (12) is taken to be the traffic flow chaotic time series, the two models show an inherently close connection and similarity in function, structure, and solution method.

1) From a functional point of view, both the Volterra series model and the ANN model of the traffic flow chaotic time series can be fitted to measured traffic flow data to simulate and predict the traffic flow process. The truncation order of the traffic flow chaotic time series Volterra model can be determined through a characteristic analysis of the traffic flow time series. The kernel functions of the Volterra series model can then be solved by system identification, or by proper orthogonal decomposition, stepwise multiple regression, iterative gradient descent, Volterra filtering, or the constrained orthogonal approximation method, all of which reflect the chaotic nonlinear law of the traffic flow.

2) From a structural point of view, the traffic flow chaotic time series Volterra model and the ANN model are also isomorphic. The memory length of past traffic flow stored in the traffic flow Volterra model, that is, the minimum embedding dimension of the phase space reconstruction of the chaotic time series, is equivalent to the number of neurons in the input layer of the ANN model (a delay-embedding sketch is given after this list).

3) From the point of view of the solution method, the traffic flow chaotic time series Volterra model is based on a numerical approximation by orthogonal polynomials to find an approximate solution; the Meixner function system and the network weights play the same role.
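As a hedged illustration of the structural equivalence in point 2), the following minimal sketch reconstructs the phase space of a traffic flow series by delay embedding; the synthetic series, the delay tau, and the dimension m are illustrative assumptions, and each row of the resulting matrix corresponds to one input vector of the network.

```python
# Hedged sketch: phase-space reconstruction (delay embedding) of a traffic
# flow series; the embedding dimension m equals the number of input neurons.
import numpy as np

def delay_embed(x, m, tau):
    """Return the matrix of phase-space vectors
    X(t) = (x(t), x(t+tau), ..., x(t+(m-1)*tau)), one per row."""
    n = len(x) - (m - 1) * tau
    return np.column_stack([x[i * tau : i * tau + n] for i in range(m)])

# Toy traffic flow series (assumption): noisy periodic signal
x = np.sin(0.1 * np.arange(500)) + 0.05 * np.random.default_rng(0).normal(size=500)
X = delay_embed(x, m=4, tau=3)   # each row is one network input vector X(t)
print(X.shape)                   # (491, 4): embedding dimension = 4 input neurons
```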

[Figure omitted: network diagram with inputs x(t), x(t+τ), ..., x(t+(m−1)τ); hidden-layer weights w_{s,i}, convolutions V_s(t), and activation functions g_s; output weights r_s and output y(t); layers labeled Input, Hidden layer, Output.]

Figure 2. The chaotic time series Volterra neural network traffic flow model (VNNTF)

Based on this consistency between the traffic flow chaotic time series Volterra model and the ANN model, this paper proposes the traffic flow chaotic time series Volterra neural network model (VNNTF) shown in Figure 2. In the figure, $X(t) = (x(t), x(t+\tau), \ldots, x(t+(m-1)\tau))^{T}$ $(t = 1, 2, \ldots)$ is the reconstructed phase-space vector of the traffic flow chaotic time series; $w_{i,j}$ $(i = 1, 2, \ldots;\ j = 1, 2, \ldots)$ and $r_s$ are the weight parameters of the traffic flow chaotic time series Volterra neural network; $g_s$ $(s = 1, 2, \ldots, N)$ is the activation function; and $V_s(t)$ is the convolution of the traffic flow input signal:


$V_s(t) = \sum_{i=0}^{m} w_{s,i}\, x(t+(i-1)\tau), \quad s = 1, 2, \ldots, N$ (13)

Thus, the traffic flow chaotic time series Volterra neural network expression is

$y(t) = f(X(t)) = \sum_{s=1}^{N} r_s\, g_s(V_s(t)) = \sum_{s=1}^{N} r_s\, g_s\Big(\sum_{i=0}^{m} w_{s,i}\, x(t+(i-1)\tau)\Big)$ (14)
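To make equations (13)–(14) concrete, the following is a minimal NumPy sketch of the VNNTF forward pass, with the delay indexing simplified so that $V_s(t)$ is the inner product of the $s$-th weight row with the embedded input vector; all parameter values and the polynomial activation order are illustrative assumptions rather than the paper's trained model.

```python
# Hedged sketch of the VNNTF forward pass of equations (13)-(14).
import numpy as np

def vnntf_forward(X_t, w, r, a):
    """X_t : (m,)   embedded input vector (x(t), x(t+tau), ..., x(t+(m-1)tau))
       w   : (N, m) hidden-layer weights w_{s,i}
       r   : (N,)   output weights r_s
       a   : (N, p) polynomial activation coefficients a_{i,s} per neuron
       Returns the scalar prediction y(t)."""
    V = w @ X_t                                               # V_s(t), equation (13)
    powers = np.vstack([V**i for i in range(a.shape[1])]).T   # (N, p) powers of V_s(t)
    g = np.sum(a * powers, axis=1)                            # polynomial g_s(V_s(t))
    return float(r @ g)                                       # y(t), equation (14)

# Toy usage with random placeholder parameters (assumptions)
rng = np.random.default_rng(0)
m, N, p = 4, 6, 3
y = vnntf_forward(rng.normal(size=m), rng.normal(size=(N, m)),
                  rng.normal(size=N), rng.normal(size=(N, p)))
print(y)
```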

IV. TRAFFIC FLOW VOLTERRA NEURAL NETWORK RAPID LEARNING ALGORITHM

A. Activation Function Analysis of Traffic Flow Volterra Neural Network

The activation function of the hidden layer of the VNNTF model is designed as the following polynomial function:

$g_s = a_{0,s} + a_{1,s} x + a_{2,s} x^2 + \cdots + a_{i,s} x^i + \cdots$ (15)

where $a_{i,s} \in R$ are the polynomial coefficients. Substituting (15) into equation (14) then gives:

$y(t) = \sum_{s=1}^{N} r_s\, g_s(V_s(t)) = \sum_{s=1}^{N} \sum_{i=1}^{+\infty} r_s a_{i,s} (V_s(t))^{i} = \sum_{s=1}^{N} \sum_{i=1}^{+\infty} r_s a_{i,s} \Big(\sum_{k=0}^{m} w_{s,k}\, x(t+(k-1)\tau)\Big)^{i}$

so that the $j$-th order kernel of the Volterra model is identified as

$h_j(l_1, l_2, \ldots, l_j) = \sum_{s=1}^{N} r_s a_{j,s} w_{s,l_1} w_{s,l_2} \cdots w_{s,l_j} \quad (j = 1, 2, \ldots, m)$ (16)
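As a hedged illustration of the kernel identification in equation (16), the sketch below assembles the $j$-th order kernel from network parameters; the parameter values are random placeholders, and only the combination rule follows the equation.

```python
# Hedged sketch: j-th order Volterra kernel h_j(l_1,...,l_j) from VNNTF
# parameters via the combination rule of equation (16).
import numpy as np
from itertools import product

def volterra_kernel(j, r, a, w):
    """r : (N,)   output weights r_s
       a : (N, p) polynomial coefficients, column j holding a_{j,s}
       w : (N, m) hidden-layer weights w_{s,l}
       Returns h_j as an m^j array indexed by (l_1, ..., l_j)."""
    N, m = w.shape
    h = np.zeros((m,) * j)
    for idx in product(range(m), repeat=j):                  # all (l_1, ..., l_j)
        prod_w = np.prod([w[:, l] for l in idx], axis=0)     # w_{s,l_1} ... w_{s,l_j}
        h[idx] = np.sum(r * a[:, j] * prod_w)
    return h

# Toy usage with random placeholder parameters (assumptions)
rng = np.random.default_rng(1)
N, m, p = 6, 4, 3
h2 = volterra_kernel(2, rng.normal(size=N), rng.normal(size=(N, p)),
                     rng.normal(size=(N, m)))
print(h2.shape)   # (4, 4): second-order kernel over the m delay indices
```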

In the VNNTF model, when the sigmoid function or another function is used as the activation function $g_s(V_s(t))$ and the VNNTF network is trained to obtain the weights and thresholds, the activation function $g_s(V_s(t))$ can be expanded into a Taylor series to obtain the polynomial coefficients $a_{j,s}$:

$a_{j,s} = \dfrac{g_s^{(j)}(\theta_s)}{j!}$ (17)

Here $g_s^{(j)}(\theta_s)$ is the $j$-th order derivative of the function $g_s(V_s(t))$ at $\theta_s$; that is, a different activation function yields different coefficients $a_{j,s}$. After the VNNTF network has been learned and trained, a kernel function of any order can be solved from the connection weights of the network neurons and the coefficients $a_{j,s}$, which addresses the difficulty of solving high-order kernel functions in the Volterra model. In general, if a polynomial function is used directly as the activation function and its order is taken as $m$, the corresponding Taylor series is likewise expanded to order $m$; thus, by setting activation functions of different orders, the VNNTF model can achieve an effect equivalent to the higher-order kernel functions of the Volterra model.
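As a hedged illustration of equation (17), the sketch below computes the coefficients $a_{j,s}$ for an assumed sigmoid activation by symbolic differentiation at an expansion point $\theta_s$; the sigmoid choice, the expansion point, and the orders shown are illustrative assumptions.

```python
# Hedged sketch of equation (17): Taylor coefficients a_{j,s} of an assumed
# sigmoid activation about an expansion point theta_s.
import sympy as sp
from math import factorial

x = sp.symbols('x')
g = 1 / (1 + sp.exp(-x))        # assumed sigmoid activation g_s

def taylor_coeff(j, theta):
    """a_{j,s} = g_s^{(j)}(theta_s) / j!, as in equation (17)."""
    return float(sp.diff(g, x, j).subs(x, theta)) / factorial(j)

# Coefficients of orders 1..3 at an assumed expansion point theta_s = 0
print([round(taylor_coeff(j, 0), 4) for j in range(1, 4)])   # approx. [0.25, 0.0, -0.0208]
```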

