Computational mechanics is an extension of the approaches typically found in statistical mechanics. It is a technique for describing both temporal and spatial organization in systems, and it offers a more detailed structural analysis of behavior than descriptions couched solely in terms of probability and degrees of randomness. It complements and augments more traditional techniques in statistical physics by describing the structural regularities inherent in a system or data set. Beyond focusing on measures of disorder, computational mechanics aims to detect and quantify structure in natural processes.

Natural information processing mechanisms can be represented and analyzed in a diverse set of model classes. Contemporary notions of computation and of useful information processing must be extended in order to apply to natural processes and their instantiation in multiagent systems. This is because natural processes can be spatially extended, continuous, stochastic, or some combination of these and other characteristics, and such systems often fall outside the scope of discrete computation theory.

Computational mechanics attempts to construct the minimal model capable of statistically reproducing the observed data. This model is called an εmachine, or causal state machine. The approach leads one naturally to address the information processing capability of the system. There is a vast literature on computational mechanics, covering both theoretical developments and applications to specific systems. Jim Crutchfield maintains a rather exhaustive page on computational mechanics, the Computational Mechanics Archive, where almost all papers of interest are listed.

εmachine (causal state machine)
The εmachine is a mathematical object that embodies the principle of Occam's Razor: its description of the process has minimal statistical complexity subject to the constraint of maximally accurate prediction. The εmachine thus provides the minimal, maximally predictive model of the observed process.
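The two requirements above (minimal statistical complexity, maximally accurate prediction) can be stated compactly. A sketch in standard computational-mechanics notation (the symbols follow the usual literature conventions; they are not defined elsewhere on this page):

```latex
% Causal states: two histories are equivalent when they predict the
% same distribution over futures.
\epsilon(\overleftarrow{x}) \;=\;
  \bigl\{\, \overleftarrow{x}' \;:\;
     P(\overrightarrow{X} \mid \overleftarrow{X} = \overleftarrow{x}')
     = P(\overrightarrow{X} \mid \overleftarrow{X} = \overleftarrow{x}) \,\bigr\}

% Statistical complexity: Shannon entropy of the distribution over
% causal states, which the \varepsilon-machine minimizes.
C_\mu \;=\; H[\mathcal{S}] \;=\;
  -\sum_{\sigma \in \mathcal{S}} P(\sigma)\,\log_2 P(\sigma)
```

Among all models that predict the future as well as the full history does, the εmachine is the one with the smallest C_\mu.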
εmachine reconstruction
The problem of inferring εmachines from limited data is called εmachine reconstruction. The focus is on developing inference algorithms (state-merging versus state-creating) and accompanying statistical error theories, indicating how much data and computing time are required to attain a given level of confidence in an εmachine with a finite number of causal states.

causality
Causality is defined in temporal terms: something causes something else by preceding it. A sequence of causal states has the property that causal states at different times are independent, conditioned on the intermediate causal states. This is a Markov chain: knowing the whole history of the process up to a point has no more predictive capacity than knowing the causal state the process is in at that point. The causal structure is the equivalence relation that causal states establish among different histories sharing the same conditional distribution of future observables, together with the symbol-emitting state transition probabilities. The causal information in the εmachine can be visualized as a directed graph in which states are represented as nodes and edges, labeled with emitted symbols and transition probabilities, represent state transitions.

type of raw data for inferring causality (suitable processes)
The processes best captured by this method are symbolic sequences with complex patterns and an unknown generative mechanism, such as an extremely long string of not-totally-random 1's and 0's. Processes are assumed to be stationary, although online reconstruction can relax this constraint. The algorithmic complexity of reconstruction increases exponentially with the size of the symbol set. The even process can be reconstructed (using CSSR) from sequences of 10000 samples, recovering the correct causal states with transition probabilities within 3% of their true values.
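As a concrete illustration of this kind of raw data, here is a minimal sketch that generates a sample sequence from the even process (defined below); the function name and seed are illustrative, not part of any standard API:

```python
import random

def generate_even_process(n, seed=0):
    """Generate n symbols from the even process.

    State A (even number of 1's seen): fair coin flip;
             0 stays in A, 1 moves to B.
    State B (odd number of 1's seen): a 1 must be emitted,
             returning to A.
    """
    rng = random.Random(seed)
    state = "A"
    out = []
    for _ in range(n):
        if state == "A":
            symbol = rng.choice("01")
            state = "B" if symbol == "1" else "A"
        else:  # state B: forced emission of 1
            symbol = "1"
            state = "A"
        out.append(symbol)
    return "".join(out)

seq = generate_even_process(10000)
# Characteristic signature of the even process: every complete run
# of 1's bounded by 0's has even length.
runs = [len(r) for r in seq.strip("1").split("0") if r]
print(all(n % 2 == 0 for n in runs))  # → True
```

A string like this, with no further annotation, is exactly the input CSSR-style reconstruction consumes.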
The even process generates a string of 1's and 0's according to the following rule: after a 0 is emitted, a coin flip decides whether a 1 or a 0 follows. After an odd number of 1's has been emitted, the process must emit a 1; after an even number of 1's, it is back to the coin flip. Randomness is embodied in the minimally stochastic transitions from one state to another: a given state can lead to many states, but a given symbol leads to at most one of those states.

limitations
Unlike in graphical causal models, a "factor" is not a meaningful term in the definition of an εmachine. The causal state definition of computational mechanics sidesteps this: it does not matter what determines a causal state to be a causal state, since it depends only on the future morph of a given history. Pearl introduces the notion of "stability" (robustness of independence patterns), meaning that a model's structure retains its independence pattern in the face of changing parameters. Stability is not satisfied in εmachine reconstruction, and it is unclear in this context what the "parameters" actually are.
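The causal-state equivalence among histories can be checked empirically on the even process described above: histories ending in "...0" and in "...011" predict the same future (a fair coin flip), so they belong to one causal state, while a history ending in "...01" forces the next symbol to be 1. A rough sketch (generator and helper names are illustrative):

```python
import random
from collections import Counter

def even_process(n, seed=1):
    # Two-state generator for the even process described above.
    rng, state, out = random.Random(seed), "A", []
    for _ in range(n):
        if state == "A":
            s = rng.choice("01")
            state = "B" if s == "1" else "A"
        else:  # state B: forced 1
            s = "1"
            state = "A"
        out.append(s)
    return "".join(out)

def next_symbol_dist(seq, suffix):
    """Empirical distribution of the symbol following each occurrence of suffix."""
    counts = Counter(seq[i + len(suffix)]
                     for i in range(len(seq) - len(suffix))
                     if seq[i:i + len(suffix)] == suffix)
    total = sum(counts.values())
    return {sym: c / total for sym, c in counts.items()}

seq = even_process(100000)
# "...0" and "...011" share a conditional future distribution
# (one causal state): both give roughly {'0': 0.5, '1': 0.5}.
print(next_symbol_dist(seq, "0"))
print(next_symbol_dist(seq, "011"))
# "...01" lies in the other causal state: the next symbol is forced.
print(next_symbol_dist(seq, "01"))  # {'1': 1.0}
```

Grouping histories by such conditional future distributions is precisely the equivalence relation that state-merging reconstruction algorithms estimate from data.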
