
Roebroeck Seth Valdes-Sosa

and cortical columns. First, the fusion of multiple imaging modalities, possibly simultaneously recorded, has received a great deal of attention. In particular, several attempts at model-driven fusion of simultaneously recorded fMRI and EEG data, inverting a separate observation model for each modality while using the same underlying neuronal model, have been reported (Deneux and Faugeras, 2010; Riera et al., 2007; Valdes-Sosa et al., 2009). This approach holds great potential to fruitfully combine the superior spatial resolution of fMRI with the superior temporal resolution of EEG. In Valdes-Sosa et al. (2009), anatomical connectivity information obtained from diffusion tensor imaging and fiber tractography is also incorporated. Second, advances in MRI technology, particularly increases in main field strength to 7T (and beyond) and advances in parallel imaging (de Zwart et al., 2006; Heidemann et al., 2006; Pruessmann, 2004; Wiesinger et al., 2006), greatly increase the level of spatial detail that is accessible with fMRI. For instance, fMRI at 7T with sufficient spatial resolution to resolve orientation columns in human visual cortex has been reported (Yacoub et al., 2008).

The development of state space models for causal analysis of fMRI data has moved from discrete to continuous and from deterministic to stochastic models. Continuous models with stochastic dynamics have desirable properties, chief among them robust inference on causal influence that is interpretable in the WAGS framework, as discussed above. However, dealing with continuous stochastic models raises technical issues such as the properties and interpretation of Wiener processes and Ito calculus (Friston, 2008). A number of inversion or filtering methods for continuous stochastic models have recently been proposed, particularly for the purpose of causal analysis of brain imaging data, including the local linearization and innovations approach (Hernandez et al., 1996; Riera et al., 2004), dynamic expectation maximization (Friston et al., 2008) and generalized filtering (Friston et al., 2010). The ongoing development of these filtering methods, their validation, and their scalability towards large numbers of state variables will be a topic of continuing research.
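As a concrete, if highly simplified, illustration of what inverting a continuous stochastic state-space model involves, the sketch below assumes a linear stochastic differential equation dx = Ax dt + dw observed as y = Cx + noise, discretizes the dynamics with an Euler-Maruyama step, and recovers the hidden states with a standard Kalman filter. This is not any of the cited methods; the matrices A and C, the noise covariances, and the step size are illustrative assumptions.

```python
import numpy as np

# Minimal sketch: linear continuous stochastic state-space model
#   dx = A x dt + dw,   y_k = C x_k + v_k
# discretized with an Euler-Maruyama step and inverted with a standard Kalman filter.
# A, C, noise covariances and dt are illustrative assumptions, not values from the chapter.

rng = np.random.default_rng(0)

dt = 0.01                      # integration step (s)
A = np.array([[-1.0, 0.0],     # hidden "neuronal" dynamics (stable)
              [ 0.8, -1.0]])   # x1 -> x2 coupling (the causal influence of interest)
C = np.eye(2)                  # both states observed (e.g. two regions)
Q = 0.05 * np.eye(2) * dt      # state (Wiener) noise covariance per step
R = 0.10 * np.eye(2)           # observation noise covariance
F = np.eye(2) + A * dt         # Euler-Maruyama transition matrix

# --- simulate data from the forward model ---
n_steps = 2000
x = np.zeros(2)
X, Y = [], []
for _ in range(n_steps):
    x = F @ x + rng.multivariate_normal(np.zeros(2), Q)
    y = C @ x + rng.multivariate_normal(np.zeros(2), R)
    X.append(x)
    Y.append(y)
X, Y = np.array(X), np.array(Y)

# --- invert (filter) the model: standard Kalman filter ---
x_est = np.zeros(2)
P = np.eye(2)
X_est = []
for y in Y:
    # predict
    x_est = F @ x_est
    P = F @ P @ F.T + Q
    # update
    S = C @ P @ C.T + R
    K = P @ C.T @ np.linalg.inv(S)
    x_est = x_est + K @ (y - C @ x_est)
    P = (np.eye(2) - K @ C) @ P
    X_est.append(x_est)
X_est = np.array(X_est)

print("RMSE of filtered states:", np.sqrt(np.mean((X_est - X) ** 2)))
```

The methods cited above go well beyond this sketch: local linearization and generalized filtering accommodate nonlinear dynamics, and dynamic expectation maximization and generalized filtering estimate parameters and hyperparameters jointly with the states, which this linear, known-parameter example deliberately omits.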
Acknowledgments

The authors thank Kamil Uludag for comments and discussion.

References

Odd O. Aalen. Dynamic modeling and causality. Scandinavian Actuarial Journal, pages 177–190, 1987.

O. O. Aalen and A. Frigessi. What can statistics contribute to a causal understanding? Scandinavian Journal of Statistics, 34:155–168, 2007.

G. K. Aguirre, E. Zarahn, and M. D'Esposito. The variability of human, BOLD hemodynamic responses. NeuroImage, 8(4):360–369, 1998.

H. Akaike. On the use of a linear model for the identification of feedback systems. Annals of the Institute of Statistical Mathematics, 20(1):425–439, 1968.
