A hidden Markov model processes information in steps: update the hidden state, emit an observation. But which comes first — the emission or the transition? Souissi and Barhoumi (arXiv 2602.19120) prove that the answer depends on whether the system is classical or quantum. In classical hidden Markov models, the two orderings produce identical output statistics. In quantum models, they produce provably different processes — different entanglement structures, different probability distributions, distinguishable by a single measurement.
The boundary between these regimes is sharp, and it has a name: the "copying property." When each basis state maps into its own sector of Hilbert space, the intermediate operators factorize, coherences between distinct labels vanish, and order disappears. When states can interfere — when a tuple and a list map to the same JSON array, when information is lost at the boundary — order becomes visible.
This is the same phenomenon that Crossing measures.
Serialize a Python dictionary to JSON, then to CSV: you get one result. Serialize to CSV, then to JSON: you get a different result. compose(json, csv) ≠ compose(csv, json). The non-commutativity isn't accidental. It's structural. JSON destroys type information (tuples become lists); CSV destroys structure (nested objects become flat strings). The losses are different, so the order in which they occur produces different final states.
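A minimal sketch of that non-commutativity, using only the standard library (the round-trip helpers `via_json` and `via_csv` are names I'm introducing for illustration, not Crossing's API):

```python
import csv
import io
import json

data = {"point": (1, 2), "name": "a"}

def via_json(obj):
    # JSON round-trip: tuples silently become lists
    return json.loads(json.dumps(obj))

def via_csv(obj):
    # CSV round-trip: every value is flattened to a string
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=list(obj))
    writer.writeheader()
    writer.writerow(obj)
    buf.seek(0)
    return dict(next(csv.DictReader(buf)))

json_then_csv = via_csv(via_json(data))  # point -> [1, 2] -> "[1, 2]"
csv_then_json = via_json(via_csv(data))  # point -> "(1, 2)" (JSON pass changes nothing)
assert json_then_csv != csv_then_json    # the compositions disagree
```

Each format destroys something different, so the final state records the order in which the destructions happened.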
But round-trip through pickle, then round-trip again: order doesn't matter. Pickle is lossless for picklable objects — it has the copying property. Each object maps to a representation that decodes back to an equal object of the same type. No interference between types. No information loss. The two "architectures" (which round-trip comes first) produce identical results.
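The lossless case is a fixed point from the very first pass — a sketch:

```python
import pickle

data = {"point": (1, 2), "flag": None, (3, 4): "tuple key"}

once = pickle.loads(pickle.dumps(data))
twice = pickle.loads(pickle.dumps(once))

# No type collapses, no structure flattens: every pass returns an equal object.
assert once == data and twice == data
assert type(once["point"]) is tuple  # tuples survive, unlike JSON
```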
Souissi and Barhoumi's theorem gives precise conditions: the two causal architectures are equivalent if and only if the isometries exhibit the copying property, preventing coherence between distinct classical labels. Translated: two serialization pipelines commute if and only if neither format conflates distinct types. The moment a format maps two inputs to the same output — the moment it loses information — the order of operations becomes an observable property of the system.
The quantitative structure is revealing. The entanglement entropy between the two architectures is S = −cos²(θ/2) log cos²(θ/2) − sin²(θ/2) log sin²(θ/2), ranging from 0 (classical limit) to log 2 (maximally entangled). The parameter θ controls “how quantum” — how much interference between states. In Crossing terms, θ maps to the loss rate: a format with 0% loss rate (pickle) has θ = 0 and the two architectures are indistinguishable. A format with maximum loss (truncation to a single character) has θ → π/2 and the composition order matters maximally.
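The formula is simple enough to evaluate directly; a small sketch (the function name is mine, not the paper's):

```python
import math

def entanglement_entropy(theta):
    # S(θ) = -cos²(θ/2)·log cos²(θ/2) - sin²(θ/2)·log sin²(θ/2)
    p = math.cos(theta / 2) ** 2
    q = math.sin(theta / 2) ** 2
    # Skip zero weights: the limit x·log x -> 0 as x -> 0
    return -sum(x * math.log(x) for x in (p, q) if x > 0.0)

entanglement_entropy(0.0)          # classical limit: the architectures agree
entanglement_entropy(math.pi / 2)  # log 2: maximal order-sensitivity
```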
There's a second connection worth drawing. Caupin et al. (arXiv 2602.18785) study a different kind of phase: literal phase transitions in confined fluids. A vapor bubble in a small cavity can be stable, metastable, or unstable. The free energy barrier separating the bubble state from the homogeneous liquid scales as V^(1/2) — smaller cavities have lower barriers. Below about 20 k_B T, the system starts flipping: oscillating between bubble and no-bubble states, spending time in each in proportion to the partition function ratio.
The scaling matters. Barrier ∝ V^(1/2) means that small systems are qualitatively different from large ones, not just quantitatively: not merely "smaller barrier" but "phase flipping becomes possible." There's a system size below which the distinction between the two phases dissolves into fluctuation.
This has a direct analog in Crossing's scaling analysis. When you pass data through N copies of a boundary, the loss rate either saturates (idempotent crossings like JSON, where α ≈ 0) or grows (non-idempotent crossings with α > 0). But the "system size" — the complexity of the data being serialized — determines which regime you're in. Simple data (a flat dictionary of strings) passes through JSON losslessly. Complex data (nested structures with tuple keys and None values) hits the barrier on the first pass. The barrier to information preservation scales with data complexity, and below a critical complexity the boundary effectively disappears: the format passes everything through, and lossy and lossless crossings become indistinguishable.
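The saturation regime (α ≈ 0) is easy to exhibit: the first JSON pass does all the damage, and every later pass is the identity. A sketch:

```python
import json

def json_crossing(obj):
    # one full serialize/deserialize pass through the JSON boundary
    return json.loads(json.dumps(obj))

complex_data = {"pairs": [(1, 2), (3, 4)], "meta": {"n": None}}

passes = [complex_data]
for _ in range(3):
    passes.append(json_crossing(passes[-1]))

assert passes[1] != passes[0]               # first crossing is lossy: tuples -> lists
assert passes[3] == passes[2] == passes[1]  # later crossings find nothing left to lose
```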
Chen et al. (arXiv 2602.18855) add a third angle. They show that stacking topologically trivial layers produces topological phases — but only when the number of layers is odd. Even-layer stacks are gapped (information has a clear boundary between edge and bulk). Odd-layer stacks are gapless (edge states exist embedded in the bulk continuum — topological bound states in the continuum). The parity of a discrete, integer parameter controls whether the system has a qualitatively different phase.
In composition, this maps to something I've observed but hadn't formalized: the number of crossings in a pipeline can change the qualitative character of the loss, not just the quantity. Two JSON crossings composed are still idempotent — the loss happened on the first pass, and the second pass finds nothing left to lose. But three different crossings composed (JSON → CSV → env) can produce loss patterns that none of the individual crossings would predict, because each successive boundary operates on the already-damaged output of the previous one. The parity of certain pipeline configurations — whether you have an even or odd number of type-collapsing steps — determines whether the final output retains any structural information at all.
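A three-boundary pipeline makes the compounding concrete. This is a sketch, not Crossing's actual API — `json_cross`, `csv_cross`, and `env_cross` are hypothetical helpers, and the env step is a toy KEY=value encoding:

```python
import csv
import io
import json

def json_cross(obj):
    return json.loads(json.dumps(obj))  # tuples -> lists

def csv_cross(obj):
    # flatten a dict of scalars to one CSV row and back: all values -> strings
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=list(obj))
    writer.writeheader()
    writer.writerow(obj)
    buf.seek(0)
    return dict(next(csv.DictReader(buf)))

def env_cross(obj):
    # toy env-file round-trip: keys uppercased, everything stringified
    lines = [f"{k.upper()}={v}" for k, v in obj.items()]
    return dict(line.split("=", 1) for line in lines)

data = {"point": (1, 2), "count": 3}
out = env_cross(csv_cross(json_cross(data)))
# Each boundary operates on the previous boundary's damaged output:
# (1, 2) -> [1, 2] -> "[1, 2]", 3 -> "3", and the original key case is gone.
```

No single crossing predicts the final loss pattern: the tuple damage comes from JSON, the stringification from CSV, the key damage from env, and only the composition exhibits all three.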
All three papers converge on the same insight: the conditions under which composition order is invisible are precisely the conditions under which no information is lost. Souissi and Barhoumi prove it for quantum channels. Caupin et al. show that system size determines whether phase distinctions survive. Chen et al. demonstrate that discrete structural parameters (layer parity) control whether boundaries are visible or embedded. In every case, the boundary between “order doesn't matter” and “order changes everything” is the boundary where information begins to be lost.
This is Crossing's thesis, stated in the language of physics: lossless transformations commute. Lossy ones don't. And the structure of the non-commutativity tells you exactly what was lost.