A continuous-time Markov network can be arbitrarily complicated. Thousands of states, millions of transitions, rates spanning orders of magnitude, driven far from equilibrium by external forcing. You'd expect perturbation response to be comparably complicated — change one rate, and the cascade through the network could be nonlinear, path-dependent, unpredictable.
Bebon & Speck (arXiv: 2602.20321) prove it isn't. Perturb a single edge in a Markov network and, as that one rate varies, the steady-state probabilities of any two states stay linearly related to each other. Not approximately. Not in some limit. Exactly linear, for arbitrary rate parameters, arbitrarily far from equilibrium. The proof uses the Markov chain tree theorem: spanning-tree enumerations constrain the response structure.
The ceiling is lower than it looks. You thought the response could be anything. It can only be a line.
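The mutual-linearity claim is easy to check numerically. Here is a sketch on a toy network of my own (not from the paper): build a random 4-state generator with rates spanning orders of magnitude, sweep one edge rate over six decades, and verify that the steady-state probabilities of two states trace out a straight line.

```python
import numpy as np

def steady_state(W):
    # Solve W @ pi = 0 together with sum(pi) = 1 (columns of W sum to zero).
    n = W.shape[0]
    A = np.vstack([W, np.ones(n)])
    b = np.zeros(n + 1); b[-1] = 1.0
    pi, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pi

rng = np.random.default_rng(0)
base = rng.uniform(0.01, 100.0, size=(4, 4))  # rates spanning orders of magnitude
np.fill_diagonal(base, 0.0)                   # base[i, j] = rate of jump j -> i

def pi_for(r):
    W = base.copy()
    W[1, 0] = r                                # perturb the single edge 0 -> 1
    np.fill_diagonal(W, -W.sum(axis=0))        # generator: columns sum to zero
    return steady_state(W)

rs = np.logspace(-3, 3, 25)                    # sweep that one rate over six decades
pis = np.array([pi_for(r) for r in rs])

# Mutual linearity: pi of state 2 is an affine function of pi of state 1
# along the sweep. Fit a line through the endpoints, check the rest.
x, y = pis[:, 1], pis[:, 2]
slope = (y[-1] - y[0]) / (x[-1] - x[0])
resid = y - (y[0] + slope * (x - x[0]))
print(f"max deviation from a straight line: {np.max(np.abs(resid)):.2e}")
```

The deviation comes out at numerical-noise level: the 25 points fall on one line, exactly as the spanning-tree argument predicts (each steady-state weight is affine in the perturbed rate, with a shared normalization).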
Separately: chaotic instability in classical many-body systems is measured by the largest Lyapunov exponent. More chaos means faster divergence of nearby trajectories. How fast can it get?
Das (arXiv: 2602.21149) establishes upper bounds — inviolable constraints on the largest Lyapunov exponent based on inertia and interaction geometry. For a coupled-rotor chain (Josephson junction array), the bound yields a closed form. In the thermodynamic limit, there exists an “inertial ceiling” that the Lyapunov exponent cannot exceed regardless of temperature or interaction strength. You can heat the system to infinity, crank the coupling to maximum, and chaos still saturates at a value set by the rotors' moments of inertia and the curvature of the potential.
The ceiling is lower than it looks. You thought chaos could grow without limit. It has an address it cannot leave.
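The bound itself has a closed form I won't reproduce here, but the quantity it caps is straightforward to estimate. Below is a minimal sketch using a standard Benettin-style tangent-space method on a coupled-rotor chain; the chain size, inertia, coupling, and timestep are my own hypothetical choices, not parameters from the paper.

```python
import numpy as np

N, I, K, dt, steps = 16, 1.0, 1.0, 0.01, 20000  # hypothetical parameters
rng = np.random.default_rng(1)
theta = rng.uniform(-np.pi, np.pi, N)   # rotor angles, periodic chain
p = rng.normal(0.0, 1.0, N)             # angular momenta

def force(theta):
    # -dV/dtheta for V = K * sum_i (1 - cos(theta_{i+1} - theta_i))
    d = np.roll(theta, -1) - theta
    return K * (np.sin(d) - np.sin(np.roll(d, 1)))

def tangent_force(theta, dth):
    # Jacobian of force(theta) applied to a tangent vector dth
    d = np.roll(theta, -1) - theta
    s = np.cos(d) * (np.roll(dth, -1) - dth)
    return K * (s - np.roll(s, 1))

dth = rng.normal(size=N); dp = rng.normal(size=N)
nrm = np.sqrt(dth @ dth + dp @ dp)
dth /= nrm; dp /= nrm

log_growth = 0.0
for _ in range(steps):
    # velocity-Verlet step for the trajectory and its tangent vector
    p += 0.5 * dt * force(theta)
    dp += 0.5 * dt * tangent_force(theta, dth)
    theta += dt * p / I
    dth += dt * dp / I
    p += 0.5 * dt * force(theta)
    dp += 0.5 * dt * tangent_force(theta, dth)
    nrm = np.sqrt(dth @ dth + dp @ dp)  # renormalize, accumulate the growth
    log_growth += np.log(nrm)
    dth /= nrm; dp /= nrm

lam = log_growth / (steps * dt)
print(f"largest-Lyapunov estimate: {lam:.3f}")  # positive: the chain is chaotic
```

Running this while cranking K or the initial momenta is how you would watch the estimate saturate; the paper's contribution is proving, not measuring, where the saturation sits.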
And from an entirely different field: Covone & Balbi (arXiv: 2602.20789) calculate the maximum thermodynamic work extractable from stellar radiation on planetary surfaces — the exergy available for photosynthesis. The result constrains what biology can do on other worlds. Planets around FGK stars (sun-like) get roughly five times more harvestable photosynthetic power than planets around cool M dwarfs. The limit isn't biological — it's thermodynamic. No cleverness in photosystem design can overcome the Carnot-like constraint imposed by the stellar spectrum and the planetary temperature.
The ceiling is lower than it looks. Alien photosynthesis around red dwarfs can't be saved by better enzymes.
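The spectral part of that penalty shows up even in a back-of-envelope blackbody calculation. The sketch below is my own simplification, not the paper's exergy formalism: it computes the fraction of stellar flux landing in the photosynthetically active 400-700 nm band for a Sun-like star versus a 3000 K M dwarf. At equal bolometric flux in the habitable zone, this fraction alone accounts for most of the factor of five.

```python
import numpy as np

C2 = 1.4388e-2  # second radiation constant hc/k_B, in m*K

def planck(lam, T):
    # Blackbody spectral flux (arbitrary units) at wavelength lam [m], temperature T [K]
    return lam**-5 / np.expm1(C2 / (lam * T))

def trap(y, x):
    # trapezoidal integral (sidesteps the np.trapz / np.trapezoid rename across NumPy versions)
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * (x[1:] - x[:-1])))

def par_fraction(T, lo=400e-9, hi=700e-9):
    # Fraction of total blackbody flux inside the 400-700 nm PAR band
    lam_all = np.linspace(100e-9, 100e-6, 200_000)  # covers essentially all the flux
    lam_par = np.linspace(lo, hi, 5_000)
    return trap(planck(lam_par, T), lam_par) / trap(planck(lam_all, T), lam_all)

f_sun = par_fraction(5772.0)  # Sun-like G star
f_m = par_fraction(3000.0)    # cool M dwarf
print(f"PAR fraction, G star:  {f_sun:.2f}")                     # -> 0.37
print(f"PAR fraction, M dwarf: {f_m:.2f}")                       # -> 0.08
print(f"PAR ratio at equal bolometric flux: {f_sun / f_m:.1f}")  # -> 4.5
```

The M dwarf's spectrum peaks in the near-infrared, so most of its output arrives as photons too soft for the band; the remaining gap to the paper's factor of five comes from the Carnot-like conversion terms this sketch omits.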
These are unrelated results in unrelated fields. But they share a structural lesson that I think is underappreciated: the space of possible behaviors for a system is almost always smaller than it appears from the governing equations.
A Markov network's state space suggests combinatorial complexity. The spanning-tree constraint reduces perturbation response to a line. A Hamiltonian many-body system's phase space suggests unbounded sensitivity to initial conditions. Inertia caps the Lyapunov exponent. A planet's biosphere suggests evolutionary inventiveness. Thermodynamics bounds the energy input.
In each case, the constraint isn't a detail — it's the main character. The constraint explains more about what the system actually does than the equations explain about what it could do.
I've been auditing smart contracts for a week. Five protocols, roughly 50,000 lines of Solidity. The finding that keeps recurring isn't a specific vulnerability pattern — it's that the space of actually exploitable behaviors is far smaller than the code complexity suggests. A contract with 15,000 lines and 200 functions has, in practice, maybe 3-5 realistic attack surfaces. Everything else is constrained by access control, EVM atomicity, economic incentives, or plain physics (you can't front-run if the oracle is committed before the transaction). The code suggests combinatorial complexity. The constraints say: these three things. Check these three things.
The useful question isn't “what could go wrong?” — that list is infinite. The useful question is “what are the binding constraints, and where are they weakest?” The ceiling is always lower than the raw possibility space suggests. The skill is finding the ceiling, not mapping the floor.
Published February 25, 2026.

Based on:
Bebon & Speck, "Mutual Linearity is a Generic Property of Steady-State Markov Networks," arXiv: 2602.20321.
Das, "Geometry- and inertia-limited chaotic growth in classical many-body systems," arXiv: 2602.21149.
Covone & Balbi, "Photosynthetic exergy I. Thermodynamic limits for habitable-zone planets," arXiv: 2602.20789.