Dynamics, Information, and Organization: The Origins of Computational Mechanics

By James P. Crutchfield

Computational mechanics defines pattern and structure with the goal of detecting and quantifying the organization of complex systems. The field developed from methods introduced in the 1970s and early 1980s to (i) identify strange attractors as the mechanisms that drive weak fluid turbulence via the reconstruction of attractor geometry from measurement time series, and (ii) estimate effective theories directly from complex time series. Such estimation foundered on the choice of representational basis, for which there was no first-principles guidance. Computational mechanics addressed this weakness by providing a mathematical and operational definition of structure. The result is a principled means of discovering patterns in natural systems. Its applications include information measures for complex systems, a structural hierarchy of intrinsic computation, quantum compression of classical processes, intelligence in Maxwellian demons, and the evolution of computation and language.

The rise of dynamical systems theory and the maturation of the statistical physics of critical phenomena in the 1960s and 70s led to a new optimism that complicated and unpredictable occurrences in the natural world were actually governed by simple, nonlinearly interacting systems. Moreover, new mathematical concepts and increasingly powerful computers provided an entrée to understanding the emergence of such phenomena over time and space. The overarching lesson was that intricate structures in a system’s state space amplify microscopic uncertainties, guiding and eventually attenuating them to form complex spatiotemporal patterns. In short order, this new perspective on complex systems raised questions surrounding the quantification of their unpredictability and organization.

By themselves, qualitative dynamics and statistical mechanics were mute on this challenge. The first hints of an answer lay in Andrey Kolmogorov’s introduction of computation theory and Claude Shannon’s information theory into continuum-state dynamical systems. These innovations demonstrated that information has an essential role in physical theories of complex phenomena — a role that is complementary to (and just as important as) energy. They yielded a new algorithmic foundation for randomness generated by physical systems: incompressible behavior is random. A bona fide measure of complex systems’ unpredictability was thus established.

Yet information generation comprises only one aspect of complex systems. How do such systems store and process that information? How is that information expressed and remembered in structure? The first uses of information and algorithmic concepts side-stepped questions concerning the structure and organization of complex systems’ internal mechanisms. Delineating their informational architecture is a subtle task.

Even when we know their governing equations of motion, truly complex systems generate patterns over long temporal and spatial scales. For example, the Navier-Stokes partial differential equations describe the local-in-time-and-space balance of forces in fluid flows. A static pressure difference leads to material flow. However, the Navier-Stokes equations themselves do not directly describe fluid structures such as vortices, vortex pairs, vortex streets, or vortex shedding, let alone turbulence. These patterns are emergent, and are generated at spatiotemporal scales far beyond those directly specified by the local, instantaneous equations of motion.

Two questions pertaining to emergent patterns immediately arise, which is where the subtlety comes into play. We see that something new has originated, but how do we objectively describe its structure and organization? More prosaically, how do we discover patterns in the first place?

Figure 1. Causal equivalence via future conditional distributions. Figure courtesy of James Crutchfield.
By refining the reconstruction methods developed to identify chaotic dynamics in fluid turbulence, computational mechanics provided an answer that was both simple and complete: a complex system’s architecture lies in its causal states. A causal state is a set of histories, each of which leads to the same set of futures (see Figure 1). It is a simple dictum — do not distinguish histories that point to identical predictions of the future.
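
In the notation standard in the computational mechanics literature, with \(\overleftarrow{X}\) denoting a process’s past and \(\overrightarrow{X}\) its future, the dictum becomes an equivalence relation on histories:
\[
\overleftarrow{x} \sim_\epsilon \overleftarrow{x}\,' \quad \Longleftrightarrow \quad \Pr\bigl(\overrightarrow{X} \,\big|\, \overleftarrow{X} = \overleftarrow{x}\bigr) = \Pr\bigl(\overrightarrow{X} \,\big|\, \overleftarrow{X} = \overleftarrow{x}\,'\bigr) .
\]
A causal state is then an equivalence class of this relation: the set of all histories that induce one and the same conditional distribution over futures.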

The causal states and their corresponding transition dynamics yield a canonical representation: the \(\epsilon\)-machine. A system’s \(\epsilon\)-machine is its unique optimal predictor of minimal size. The historical information stored in the causal states of a process quantifies its level of structure. A process’s \(\epsilon\)-machine is its effective theory — its equations of motion at the level of emergent patterns. Focusing only on optimal process prediction leads to a notion of structure in terms of stored information and symmetry; this is a notable aspect of the \(\epsilon\)-machine’s construction. Predictability and organization are inextricably intertwined. Researchers cannot discuss or properly measure one without reference to the other.
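
The historical information stored in the causal states has a standard quantitative expression: the statistical complexity, the Shannon entropy of the distribution over the set \(\mathcal{S}\) of causal states,
\[
C_\mu = H[\mathcal{S}] = -\sum_{\sigma \in \mathcal{S}} \Pr(\sigma) \log_2 \Pr(\sigma) ,
\]
measured in bits. An independent, identically distributed process has a single causal state, so \(C_\mu = 0\); more organized processes require more (and more evenly populated) causal states and correspondingly larger \(C_\mu\).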

A system’s minimal \(\epsilon\)-machine representation solves the challenge of quantifying emergent organization. The answer lies in a complex system’s intrinsic computation, which addresses three simple questions:

  1. How much of the past does a process store? 
  2. In what architecture is that information stored? 
  3. How is the stored information used to produce future behavior?

The answers are straightforward. The stored information is in the causal states, the \(\epsilon\)-machine’s states and transitions explicitly lay out the process architecture, and the process’s Shannon-Kolmogorov entropy rate monitors information production.
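
These answers can be made concrete with a small computation. The following sketch, written in Python for illustration and not drawn from the literature, estimates causal states from a long sample of the binary Golden Mean process (a 1 is never followed by a 1) by grouping length-\(L\) histories whose empirical next-symbol distributions agree within a tolerance. It then reports the statistical complexity \(C_\mu\) (question 1), the number of estimated states (a crude stand-in for question 2’s architecture), and the entropy rate \(h_\mu = H[X_0 \mid \mathcal{S}_0]\) (question 3). The history length, tolerance, and example process are assumptions chosen for brevity; practical reconstruction algorithms are far more careful about statistical testing and state splitting.

# Minimal causal-state estimation sketch (illustrative assumptions throughout).
from collections import Counter, defaultdict
from math import log2
import random

def golden_mean_series(n, seed=0):
    # Golden Mean process: after a 1, emit 0; after a 0, flip a fair coin.
    rng, prev, out = random.Random(seed), 0, []
    for _ in range(n):
        x = 0 if prev == 1 else rng.choice([0, 1])
        out.append(x)
        prev = x
    return out

def estimate_causal_states(series, L=3, tol=0.05):
    # Empirical next-symbol distribution for each length-L history.
    counts = defaultdict(Counter)
    for i in range(L, len(series)):
        counts[tuple(series[i - L:i])][series[i]] += 1
    dists = {h: {x: c / sum(cnt.values()) for x, c in cnt.items()}
             for h, cnt in counts.items()}
    # The causal-equivalence dictum: do not distinguish histories whose
    # predictions (approximately) coincide. Binary alphabet assumed.
    states = []  # each entry: (representative distribution, member histories)
    for h, d in dists.items():
        for rep, members in states:
            if all(abs(d.get(x, 0) - rep.get(x, 0)) <= tol for x in (0, 1)):
                members.append(h)
                break
        else:
            states.append((d, [h]))
    return states, counts

def complexity_and_entropy_rate(states, counts):
    # C_mu: Shannon entropy of the causal-state distribution.
    # h_mu: average next-symbol uncertainty given the causal state.
    total = sum(sum(c.values()) for c in counts.values())
    C_mu, h_mu = 0.0, 0.0
    for _, members in states:
        n_state = sum(sum(counts[h].values()) for h in members)
        p_state = n_state / total
        C_mu -= p_state * log2(p_state)
        pooled = Counter()
        for h in members:
            pooled.update(counts[h])
        for c in pooled.values():
            p = c / n_state
            h_mu -= p_state * p * log2(p)
    return C_mu, h_mu

series = golden_mean_series(100_000)
states, counts = estimate_causal_states(series)
C_mu, h_mu = complexity_and_entropy_rate(states, counts)
print(f"{len(states)} states, C_mu ~ {C_mu:.2f} bits, h_mu ~ {h_mu:.2f} bits/symbol")

On a long sample this should recover two causal states, with \(C_\mu\) near 0.92 bits and \(h_\mu\) near 2/3 bit per symbol, which are the exact values for the Golden Mean process.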

At first blush it may not be apparent, but computational mechanics parallels basic physics in this way. Physics tracks various types of energy and monitors their transformation into each other. Similarly, computational mechanics explores the kinds of information inherent in a system and the ways in which one kind of information transforms into another. Although the \(\epsilon\)-machine describes a mechanism that generates a system’s statistical properties, computational mechanics captures more than mere generation. And this is why the field was so named. It was an extension of statistical mechanics that went beyond analysis of a system’s statistical properties to capture its computation-theoretic characteristics: how a system stores and processes information, and how it intrinsically computes.

A synopsis of the main concepts underlying computational mechanics necessarily neglects its intellectual history. From where did this mix of ideas originate? What is their historical context? What problems drove their invention? Revisiting the conditions that inspired computational mechanics reveals how this history resonates with the ensuing science.


Part II of this article, to be published in the November issue of SIAM News, will detail the author’s personal interest in computational mechanics and its extensions and applications to nonlinear physics.

Acknowledgments: I thank the Santa Fe Institute, the Telluride Science Research Center, and the California Institute of Technology for their hospitality during visits. This material is based on work supported by, or in part by, Foundational Questions Institute grant FQXi-RFP-1609, the U.S. Army Research Laboratory and the U.S. Army Research Office under contract W911NF-13-1-0390 and grant W911NF-18-1-0028, and via Intel Corporation support of CSC as an Intel Parallel Computing Center.

Further Reading
[1] Aghamohammadi, C., Mahoney, J.R., & Crutchfield, J.P. (2017). The ambiguity of simplicity. Phys. Lett. A, 381(14), 1223-1227.
[2] Crutchfield, J.P. (1994). Is anything ever new? Considering emergence. In G. Cowan, D. Pines, & D. Melzner (Eds.), Complexity: Metaphors, Models, and Reality (pp. 479-497). Santa Fe Institute Studies in the Sciences of Complexity (Vol. XIX). Reading, MA: Addison-Wesley.
[3] Crutchfield, J.P. (1994). The calculi of emergence: Computation, dynamics, and induction. Physica D, 75, 11-54.
[4] Crutchfield, J.P. (2012). Between order and chaos. Nat. Phys., 8, 17-24.
[5] Crutchfield, J.P., & McNamara, B.S. (1987). Equations of motion from a data series. Complex Syst., 1, 417-452.
[6] Crutchfield, J.P., & Young, K. (1989). Inferring statistical complexity. Phys. Rev. Lett., 63, 105-108.
[7] James, R.G., & Crutchfield, J.P. (2017). Multivariate dependence beyond Shannon information. Entropy, 19, 531.
[8] Packard, N.H., Crutchfield, J.P., Farmer, J.D., & Shaw, R.S. (1980). Geometry from a time series. Phys. Rev. Lett., 45(9), 712.

James P. Crutchfield teaches nonlinear physics at the University of California, Davis, directs its Complexity Sciences Center, and promotes science interventions in nonscientific settings. He is mostly concerned with patterns: what they are, how they are created, and how intelligent agents discover them. His website is http://csc.ucdavis.edu/~chaos/.