
Nichtlineare Methoden zur Quantifizierung von Abhängigkeiten und ...



Abstract

To determine the relation between two stochastic processes, information theory offers an appropriate framework in which relationships can be interpreted in terms of information. Dependence can be measured with the mutual information, which gives the amount of information the two processes share, i.e. their degree of similarity. Mutual information can also hint at the coupling direction; however, serial correlations in time may render the results misleading. Moreover, it can only distinguish non-coupled systems from coupled ones. To determine the coupling directions, the dynamics of the processes have to be taken into account, which leads to the transfer entropy. By conditioning on the past, transfer entropy measures the direct impact of the driving process on the future state of the driven process, excluding any influence due to serial correlations. The coupling strength is thus quantified, in information-theoretic terms, as the amount of effective information transmission from one process to the other, which allows transfer entropy to distinguish between unidirectional and bidirectional coupling.
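For discrete-state processes, both quantities can be sketched with plug-in (histogram) estimators. The following is a minimal illustration, not the thesis's own implementation; the function names and the choice of a history length of one step for the transfer entropy are assumptions made here for brevity:

```python
from collections import Counter
from math import log2

def mutual_information(xs, ys):
    """Plug-in estimate of I(X;Y) in bits for two discrete sequences."""
    n = len(xs)
    px, py, pxy = Counter(xs), Counter(ys), Counter(zip(xs, ys))
    return sum((c / n) * log2((c / n) / ((px[x] / n) * (py[y] / n)))
               for (x, y), c in pxy.items())

def transfer_entropy(src, dst):
    """Plug-in estimate of the transfer entropy src -> dst in bits,
    with history length 1, i.e. I(dst_{t+1}; src_t | dst_t).
    Conditioning on dst_t removes the influence of dst's own past."""
    n = len(dst) - 1
    triples = Counter(zip(dst[1:], dst[:-1], src[:-1]))   # (d_{t+1}, d_t, s_t)
    pairs_dd = Counter(zip(dst[1:], dst[:-1]))            # (d_{t+1}, d_t)
    pairs_ds = Counter(zip(dst[:-1], src[:-1]))           # (d_t, s_t)
    past = Counter(dst[:-1])                              # d_t
    te = 0.0
    for (d1, d0, s0), c in triples.items():
        # ratio of counts equals p(d1 | d0, s0) / p(d1 | d0)
        te += (c / n) * log2((c * past[d0]) /
                             (pairs_dd[(d1, d0)] * pairs_ds[(d0, s0)]))
    return te
```

With two identical binary sequences, the mutual information equals their one bit of entropy, and a process that copies its driver with a one-step lag yields a strictly positive transfer entropy in the driving direction.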

While mutual information and transfer entropy are easily obtained for processes with a discrete state space, estimating them from finite data sets is difficult. When the state space is partitioned, the mutual information and transfer entropy of the discretised processes converge to the corresponding values of the continuous processes as the partitions are refined. Furthermore, mutual information exhibits monotonically increasing convergence under refinement and can therefore be used to reject the assumption that the two processes are independent. For transfer entropy, no similar monotonic convergence seems to hold. Kernel estimators represent an alternative approach to estimating information-theoretic quantities. They are easy to implement, and the bias of the estimators caused by serial correlations in the data sets can likewise be suppressed easily.
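As a sketch of the kernel approach, the following naive box-kernel (step-kernel) estimator of mutual information for continuous-valued data excludes temporally close sample pairs via an exclusion window to suppress the serial-correlation bias mentioned above. The function name, bandwidths, and window parameter are illustrative assumptions, not the thesis's specific estimator:

```python
import numpy as np

def box_kernel_mi(x, y, eps_x, eps_y, theiler=0):
    """Box-kernel estimate of I(X;Y) in bits for continuous data.

    Local probabilities are estimated as the fraction of samples within
    distance eps of the reference point; samples within `theiler` time
    steps of the reference are excluded to reduce the bias caused by
    serial correlations in the data."""
    n = len(x)
    mi, count = 0.0, 0
    for i in range(n):
        # valid comparison partners: outside the exclusion window around i
        mask = np.abs(np.arange(n) - i) > theiler
        m = mask.sum()
        if m == 0:
            continue
        near_x = np.abs(x - x[i]) < eps_x
        near_y = np.abs(y - y[i]) < eps_y
        px = (near_x & mask).sum() / m
        py = (near_y & mask).sum() / m
        pxy = (near_x & near_y & mask).sum() / m
        if px > 0 and py > 0 and pxy > 0:
            mi += np.log2(pxy / (px * py))
            count += 1
    return mi / count if count else 0.0
```

On strongly dependent data the estimate is clearly positive, while for independent data it stays near zero (up to the estimator's residual bias).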

A special class of stochastic processes are point processes, for which the discrete times at which events occur are of interest. Again, mutual information can be used to quantify the dependence between two point processes, but now the increments, i.e. the numbers of events within certain time intervals, have to be considered. A weaker measure of dependence is the covariance of the increments, which in a special case reduces to the number of coincidences. Using increments, coupling directions can likewise be determined with the transfer entropy. Unfortunately, due to the large bias of the estimators, the exact values of the information transmission cannot be given reliably. When increments are used, the time scale on which dependence is detected is set by the length of the time intervals. As an alternative method, inter-event intervals and cross-event intervals are introduced, which are ordered congruently along a single discrete time index. By calculating the mutual information between these event intervals, dependence between point processes can be detected without choosing a particular time scale.
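The increment-based quantities above can be sketched as follows: binning the event times into intervals of length dt yields the increments, and thresholding two increment sequences gives a simple coincidence count. Function names and the binning convention are illustrative assumptions; the resulting discrete increment sequences could then be fed into any plug-in estimator of mutual information or transfer entropy:

```python
import numpy as np

def increments(event_times, t_max, dt):
    """Increments of a point process: number of events in each
    consecutive interval of length dt on [0, t_max]."""
    edges = np.arange(0.0, t_max + dt, dt)
    counts, _ = np.histogram(event_times, bins=edges)
    return counts

def coincidences(ev1, ev2, t_max, dt):
    """Count intervals in which both processes produce at least one
    event -- a special case of the increment covariance."""
    active1 = increments(ev1, t_max, dt) > 0
    active2 = increments(ev2, t_max, dt) > 0
    return int(np.sum(active1 & active2))
```

Note that dt fixes the time scale on which dependence is detected, which is exactly the limitation that the interval-based approach described above avoids.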
