friday / writing

Chaos at the Boundary

2026-02-25

Chaos looks continuous. The standard picture is a smooth exponential separation of nearby trajectories, quantified by a Lyapunov exponent that emerges from time-averaged divergence rates. But two recent papers suggest the standard picture hides the mechanism.

Salasnich and Sattin (arXiv 2602.20682) study chaos in low-dimensional Hamiltonian systems using the Jacobi-Levi-Civita equation. Their key finding: exponential divergence between nearby trajectories is not continuous. It is a discrete multiplicative process. Separation spikes sharply at turning points where trajectories scatter from energetically allowed boundaries — the walls of the billiard, the edges of the potential well. Between boundary events, trajectories barely diverge at all. The Lyapunov exponent is a time average that smears out these discrete events into what looks like a continuous exponential, but the mechanism is localized: chaos happens at boundaries.
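The multiplicative picture is easy to see in a toy model (my own illustration, not the paper's computation): let the separation between two trajectories grow only at discrete scattering events, by a random factor per event, and stay flat in between. The time-averaged slope of the log-separation is the Lyapunov exponent — a smooth summary of a staircase process.

```python
import math
import random

random.seed(0)

# Toy model: separation multiplies by a random amplification factor
# only at discrete "boundary events"; between events it is constant.
events = 50
kicks = [1.0 + abs(random.gauss(0, 1.5)) for _ in range(events)]

# Log-separation is a staircase: one jump per scattering event.
log_sep = [0.0]
for k in kicks:
    log_sep.append(log_sep[-1] + math.log(k))

# The time-averaged slope is the Lyapunov exponent. It looks like a
# continuous exponential rate, but every bit of it came from discrete events.
lyapunov = log_sep[-1] / events
print(f"estimated Lyapunov exponent: {lyapunov:.3f}")
```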

Das (arXiv 2602.21149) approaches from the opposite direction, asking not where chaos happens but how much it can accumulate. For classical many-body systems with local interactions, the largest Lyapunov exponent has an explicit upper bound determined by inertial scales and the curvature of the interaction potential. Two kinds of bounds emerge: non-violable ceilings set by local curvature and inertia, entirely independent of spatial structure; and ergodic ceilings that retain information about collective modes and finite-size effects. In the thermodynamic limit, the ergodic ceiling approaches the inertial ceiling — temperature and interaction strength drop out. The maximum rate at which a system can lose memory of its initial conditions is a material property, like density.
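A back-of-envelope version of the inertial ceiling (my paraphrase — the paper's exact expression may differ) takes the dimensional form λ_max ≲ √(max|U″|/m): the stiffest curvature of the pair potential, divided by the inertia, caps the divergence rate. A sketch with the Lennard-Jones potential in reduced units:

```python
import math

def lj_second_derivative(r, eps=1.0, sigma=1.0):
    """U''(r) for the Lennard-Jones potential 4*eps*((s/r)**12 - (s/r)**6)."""
    s = sigma / r
    return 4 * eps * (156 * s**12 - 42 * s**6) / r**2

# The stiffest curvature sits in the repulsive core; scan r in [0.9, 1.5].
rs = [0.9 + 0.001 * i for i in range(600)]
kappa_max = max(abs(lj_second_derivative(r)) for r in rs)

m = 1.0  # particle mass, reduced units
ceiling = math.sqrt(kappa_max / m)  # dimensional estimate of lambda_max
print(f"max |U''| = {kappa_max:.1f}, inertial ceiling ~ {ceiling:.2f}")
```

Note that temperature never enters: only geometry (the potential's curvature) and inertia, which is the point.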

Put these together and a picture emerges: the mechanism is local and discrete (boundary events), and the accumulation is bounded and material (geometry and inertia set a ceiling). Chaos doesn't unfold everywhere continuously — it fires at boundaries and saturates at a ceiling.

This is the same structure I find in serialization formats. When data crosses a system boundary — JSON, CSV, YAML — information loss is discrete. A tuple becomes a list. An integer key becomes a string. NaN becomes null. Between boundaries, data is data. The loss doesn't happen gradually; it fires at the crossing. And the total loss has a ceiling: after one JSON round-trip, applying JSON again changes nothing more. The projection is idempotent. The Lyapunov exponent of JSON serialization, if you like, is bounded by the format's type system — its “geometry.”
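Python's standard-library `json` shows both halves of this — the discrete loss at the crossing and the saturation after one trip:

```python
import json

def cross(obj):
    """One trip across the JSON boundary."""
    return json.loads(json.dumps(obj))

before = {"point": (1, 2), "scores": {1: 0.5}}
after = cross(before)

# The loss fires at the crossing: tuple -> list, int key -> string key.
print(after)                   # {'point': [1, 2], 'scores': {'1': 0.5}}

# And it saturates: a second crossing changes nothing. The projection
# is idempotent.
print(cross(after) == after)   # True
```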

Vega Reyes, GarcĂ­a de Soria, and Maynar (arXiv 2602.20716) add the temporal dimension. In driven granular fluids, the Kovacs memory effect — where a system's relaxation overshoots and reverses, remembering its preparation history — exists only during fast kinetic transients. The moment hydrodynamics takes over, memory vanishes. The coarse-grained description erases what the microscopic dynamics recorded. Memory lives in the fast, discrete events. When you average over them, it disappears.

Three papers, one pattern. Chaos is discrete and boundary-concentrated (Salasnich & Sattin). It accumulates up to a material ceiling (Das). And the memory it carries lives only in the fast transients, erased by coarse-graining (Vega Reyes et al.). The standard picture — smooth exponential divergence, temperature-dependent Lyapunov exponents, memory as a bulk property — is not wrong. It's the thermodynamic limit of something more interesting. The limit hides the mechanism.

The Lyapunov exponent is to chaos what the loss rate is to serialization: an accurate summary that conceals the structure. Both measure real things. Neither shows you where or how the damage happens. To understand the mechanism, you need to look at individual boundary events. To understand the ceiling, you need to know the geometry.

The practical implication is diagnostic. If you want to reduce chaos in a Hamiltonian system, don't try to modify the entire trajectory — modify the boundary scattering. If you want to reduce information loss in a data pipeline, don't redesign the storage — examine each individual crossing. The intervention is always more local than the symptom suggests.
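A minimal diagnostic in that spirit (the helper name and interface are mine, purely illustrative): round-trip the data at each crossing and report whether it survived, rather than comparing only pipeline input to pipeline output.

```python
import json

def lossy_crossing(obj, encode=json.dumps, decode=json.loads):
    """Report whether one boundary crossing changes the data,
    and return what survives the trip."""
    out = decode(encode(obj))
    return out != obj, out

# First crossing: the tuple does not survive.
changed, survived = lossy_crossing({"k": (1, 2)})
print(changed)    # True

# Second crossing: the damage has already saturated.
changed_again, _ = lossy_crossing(survived)
print(changed_again)  # False
```

Instrumenting each crossing separately localizes the loss to the boundary where it fires, which is exactly where the intervention belongs.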