DARPA ULTRALOG Final Report - Industrial and Manufacturing ...

focusing on the computational complexity of individual orbits. By not requiring a Hamiltonian, computational mechanics can be applied in a wide range of contexts, including those where an energy function for the system may not be manifest, as is the case for supply chains. Notions of complexity, emergence and self-organization have also been formalized and quantified in terms of various information measures (Shalizi 2000).

Given a time series, the (unknowable) exact states of an observed system are translated into a sequence of symbols via a measurement channel (Crutchfield 1992). Two histories (i.e., two series of past data) carry equivalent information if they lead to the same conditional probability distribution over the future (i.e., if it makes no difference which of the two data series is observed). Under these circumstances, i.e., when the effects of the two series are indistinguishable, the histories can be lumped together. This procedure identifies causal states, together with the structure of transitions between them, and creates what is known as an "epsilon-machine" (ε-machine).
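The lumping of equivalent histories can be sketched as follows. This is an illustrative approximation rather than a full ε-machine reconstruction algorithm; the function name, the fixed history length `hist_len` and the tolerance `tol` for comparing distributions are our own assumptions, not part of the report:

```python
from collections import defaultdict

def causal_states(seq, hist_len=2, tol=0.05):
    """Group fixed-length histories whose conditional distributions
    over the next symbol are (approximately) equal."""
    # Count next-symbol occurrences after each length-hist_len history.
    counts = defaultdict(lambda: defaultdict(int))
    for i in range(hist_len, len(seq)):
        hist = tuple(seq[i - hist_len:i])
        counts[hist][seq[i]] += 1
    # Empirical conditional distribution P(next symbol | history).
    dists = {}
    for hist, nxt in counts.items():
        total = sum(nxt.values())
        dists[hist] = {s: c / total for s, c in nxt.items()}
    # Lump histories whose distributions differ by less than tol:
    # each group of indistinguishable histories is one causal state.
    states = []  # list of (representative distribution, member histories)
    for hist, d in dists.items():
        for rep, members in states:
            keys = set(rep) | set(d)
            if all(abs(rep.get(k, 0) - d.get(k, 0)) < tol for k in keys):
                members.append(hist)
                break
        else:
            states.append((d, [hist]))
    return states
```

For a period-2 sequence such as 0101…, the two histories 01 and 10 predict different next symbols and so yield two causal states, while a constant sequence collapses to a single state.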

The ε-machines form a special class of Deterministic Finite State Automata (DFSA) with transitions labeled by conditional probabilities, and hence can also be viewed as Markov chains. However, finite-memory machines like ε-machines may fail to admit a finite-size model, meaning that the number of causal states could turn out to be infinite. In that case, a more powerful model than a DFSA must be used. One proceeds by trying the next most powerful model in the hierarchy of machines known as the causal hierarchy (Crutchfield 1994), in analogy with the Chomsky hierarchy of formal languages. While "ε-machine reconstruction" refers to the process of constructing the machine given an assumed model class, "hierarchical machine reconstruction" describes a process of innovation that creates a new model class. It detects regularities in a series of increasingly accurate models; the inductive jump to a higher computational level occurs by taking those regularities as the new representation.

ε-machines reflect a balanced utilization of deterministic and random information processing, and this balance is discovered automatically during ε-machine reconstruction. These machines are unique and optimal in the sense that they have maximal predictive power and minimal model size (hence satisfying the principle of Occam's Razor: causes should not be multiplied beyond necessity). An ε-machine provides a minimal description of the pattern or regularities in a system, in the sense that the pattern is the algebraic structure determined by the causal states and their transitions. ε-machines are also minimally stochastic. Hence computational mechanics acts as a method for automatic pattern discovery.

The ε-machine is the organization of the process, or at least of that part of it which is relevant to our measurements. Being a model of the time series observed from a system, the ε-machine can be used to define and calculate macroscopic or global properties that reflect the characteristic average information-processing capabilities of the system. These include the entropy rate, the excess entropy and the statistical complexity (Feldman and Crutchfield 1998; Crutchfield and Feldman 2001). The entropy rate indicates how predictable the system is. The excess entropy, on the other hand, measures the apparent memory stored in a spatial configuration and represents how hard prediction is. ε-machine reconstruction leads to a natural measure of the statistical complexity of a process, namely the amount of information needed to specify the state of the ε-machine, i.e., the Shannon entropy of the causal-state distribution. Statistical complexity is distinct from, and dual to, information-theoretic entropies and dimension (Crutchfield and Young 1989).

The existence of chaos shows that there is a rich variety of unpredictability spanning two extremes: periodic and random behavior. Behavior between these extremes, while of intermediate information content, is more complex in that its most concise description (model) is an amalgam of regular and stochastic processes. An information-theoretic description of this spectrum in terms of dynamical entropies measures the raw diversity of temporal patterns. The dynamical entropies, however, do not directly measure the computational effort required to model the complex behavior, which is what statistical complexity captures.
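These quantities can be made concrete with a short script: statistical complexity and entropy rate computed from a machine's transition matrix, and block-entropy estimators applied to raw data. This is an illustrative sketch, not code from the report: the function names are our own, the golden-mean process is a standard textbook example chosen for concreteness, and the machine-level entropy-rate formula assumes a unifilar machine (each transition emits a distinct symbol):

```python
import math
from collections import Counter

def stationary(P, iters=1000):
    """Stationary distribution of a row-stochastic matrix by power iteration."""
    n = len(P)
    pi = [1.0 / n] * n
    for _ in range(iters):
        pi = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
    return pi

def shannon_entropy(probs):
    """Shannon entropy, in bits, of a probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def statistical_complexity(P):
    """Shannon entropy of the stationary causal-state distribution."""
    return shannon_entropy(stationary(P))

def entropy_rate_machine(P):
    """State-weighted transition entropy; equals the per-symbol entropy
    rate when each transition emits a distinct symbol (unifilar case)."""
    pi = stationary(P)
    return sum(pi[i] * shannon_entropy(P[i]) for i in range(len(P)))

def block_entropy(seq, L):
    """Shannon entropy of the empirical distribution of length-L blocks."""
    counts = Counter(tuple(seq[i:i + L]) for i in range(len(seq) - L + 1))
    total = sum(counts.values())
    return shannon_entropy([c / total for c in counts.values()])

def excess_entropy_estimate(seq, L_max):
    """Sum of the amounts by which the finite-L entropy-rate estimates
    h(L) = H(L) - H(L-1) exceed the limiting rate (approximated by h(L_max))."""
    h = [block_entropy(seq, L) - block_entropy(seq, L - 1)
         for L in range(1, L_max + 1)]
    return sum(hL - h[-1] for hL in h[:-1])

# Golden-mean process (no two consecutive 1s): state A emits 0 or 1 with
# probability 1/2 (a 1 leads to B); state B always emits 0 and returns to A.
P = [[0.5, 0.5],
     [1.0, 0.0]]
# statistical_complexity(P) ≈ 0.918 bits; entropy_rate_machine(P) ≈ 0.667 bits
```

For this machine the stationary distribution is (2/3, 1/3), giving a statistical complexity of log2(3) − 2/3 ≈ 0.918 bits and an entropy rate of 2/3 bit per symbol; for a period-2 data series the block-entropy estimator recovers an excess entropy of about 1 bit, the memory needed to know the phase.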
