

Reverse Engineering – Recent Advances and Applications

the activation level of a neuron into an output signal (regulation result). The induced local field and the output of the neuron, respectively, are given by:

$$v_i(T_j) = \sum_{k=1}^{N} w_{ik}\, x_k(T_j) + b_i(T_j) \qquad (1)$$

$$x_i(T_j) = \varphi\big(v_i(T_j)\big) \qquad (2)$$

where the synaptic weights $w_{i1}, w_{i2}, \ldots, w_{iN}$ define the strength of connections between the $i$th neuron (e.g. the $i$th gene) and its inputs (e.g. expression levels of genes). Such synaptic weights exist between all pairs of neurons in the network. $b_i(T_j)$ denotes the bias for the $i$th neuron at time $T_j$. We denote $\vec{w}$ as a weight vector that consists of all the synaptic weights and biases in the network. $\vec{w}$ is adapted during the learning process to yield the desired network outputs. The activation function $\varphi(\cdot)$ introduces nonlinearity to the model. When information about the complexity of the underlying system is available, a suitable activation function can be chosen (e.g. linear, logistic, sigmoid, threshold, hyperbolic tangent sigmoid, or Gaussian function). If no prior information is available, our algorithm uses the hyperbolic tangent sigmoid function.
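As a minimal sketch of Eqs. (1)-(2), the update for all $N$ neurons at one time point can be written as a matrix-vector product followed by the activation function. The function name `neuron_outputs` and the toy weight values are illustrative, not taken from the chapter; the hyperbolic tangent sigmoid is used as the default activation, as the text suggests.

```python
import numpy as np

def neuron_outputs(W, b, x, phi=np.tanh):
    """One update step for all N neurons (Eqs. 1-2).

    W   : (N, N) synaptic weight matrix, W[i, k] = w_ik
    b   : (N,) bias vector b_i(T_j) at the current time point
    x   : (N,) expression levels x_k(T_j)
    phi : activation function (hyperbolic tangent sigmoid by default)
    """
    v = W @ x + b   # induced local fields, Eq. (1)
    return phi(v)   # neuron outputs, Eq. (2)

# Toy example with 3 genes (weights are illustrative only)
W = np.array([[ 0.0, 0.8, -0.5],
              [ 0.3, 0.0,  0.2],
              [-0.4, 0.6,  0.0]])
b = np.zeros(3)
x = np.array([0.1, 0.5, -0.2])
x_next = neuron_outputs(W, b, x)
```

Because `np.tanh` is bounded in (-1, 1), the resulting expression levels stay in that range regardless of the magnitude of the local fields.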

Fig. 3. Four NMs discovered in human: (A) multi-input module; (B) single-input module; (C) feed-forward loop 1; and (D) feed-forward loop 2.

As a cost function, we use the RMSE between the expected output and the network output across time and neurons in the network. The cost function is written as:
