Creating information is one problem. Preserving it is a different one. These results establish that preservation has its own thermodynamics — distinct from transmission, distinct from erasure — and the bounds are tighter than expected.
Brandes (2026, arXiv:2602.06046) defines the preservation tradeoff: the cost of maintaining information against thermal fluctuations, as opposed to transmitting it (Shannon) or erasing it (Landauer). For systems in the diminishing-returns regime — where additional error suppression yields progressively less benefit — the optimal maintenance allocation is bounded above by 50% of available resources. For physically realistic systems exhibiting smooth saturation, the bound narrows to a 30-50% band. The diagnostic is a quantity called preservation stiffness, analogous to magnetic susceptibility — a response function measuring how efficiently resources convert into reliability. This is validated against kinetic proofreading in E. coli and protocol overhead in TCP/IP networks. Biology and telecommunications converge on the same thermodynamic optimum.
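A toy model makes the bound concrete. The following sketch is my own illustration under assumed functional forms, not Brandes's formalism: split a unit budget into a maintenance fraction m and a working fraction 1 − m, with reliability saturating as 1 − e^(−km) (diminishing returns, steeper saturation for larger k). Useful output is the product of the two, and the optimal m never exceeds 0.5:

```python
import numpy as np

# Toy model (illustrative, not Brandes's actual formalism):
# a unit budget splits into maintenance m and work (1 - m).
# Error suppression shows diminishing returns: reliability R(m) = 1 - exp(-k*m).
# Useful output is (1 - m) * R(m); we find the m that maximizes it.
def optimal_maintenance(k):
    m = np.linspace(0.0, 1.0, 100001)
    output = (1.0 - m) * (1.0 - np.exp(-k * m))
    return m[np.argmax(output)]

for k in [0.1, 1.0, 10.0, 100.0]:
    print(f"k={k:6.1f}  optimal maintenance fraction = {optimal_maintenance(k):.3f}")
```

In this toy the optimum approaches 50% only in the limit of very slow saturation (small k) and falls well below it as returns saturate faster, which is at least consistent with the 30-50% band for smooth-saturation systems.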
The synaptic energy paper (Lam et al., 2026, arXiv:2602.15787) arrives at a complementary result from neuroscience: the energy cost of precision in neural signaling scales as the fifth power of precision, E ∝ (σ⁻²)⁵. Doubling the precision of a synaptic signal therefore costs 2⁵ = 32 times the energy. The brain's solution is strategic allocation: high precision where it matters, low precision elsewhere. The budget isn't a constraint to overcome. It's the reason precision is allocated at all. Without the constraint, there would be no allocation problem and no structure to the precision landscape.
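Under a fifth-power cost law, strategic allocation has a simple closed form. The sketch below is my own toy Lagrange-multiplier calculation, not Lam et al.'s model: minimizing total weighted variance Σ wᵢ/pᵢ (precision pᵢ = σᵢ⁻²) under a fixed energy budget with Eᵢ = pᵢ⁵ gives pᵢ ∝ wᵢ^(1/6), so the steep cost flattens the precision landscape relative to the importance landscape:

```python
import numpy as np

# Toy allocation under an assumed E_i = p_i**5 cost law (my illustration,
# not Lam et al.'s model). Minimizing total weighted variance sum(w_i / p_i)
# subject to sum(p_i**5) = budget gives, by a Lagrange-multiplier argument,
# w_i = const * p_i**6, i.e. p_i proportional to w_i**(1/6).
def allocate_precision(weights, budget):
    w = np.asarray(weights, dtype=float)
    p = w ** (1.0 / 6.0)                       # optimal shape, unnormalized
    scale = (budget / np.sum(p ** 5)) ** 0.2   # rescale to exhaust the budget
    return p * scale

p = allocate_precision([1.0, 64.0], budget=10.0)
print(p[1] / p[0])  # a 64x importance gap yields only a ~2x precision gap
```

The design point: because cost rises so steeply, even large differences in how much a signal matters translate into modest differences in allocated precision.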
Dago and Bellon (2026, arXiv:2602.06560) demonstrate that the Landauer bound for bit erasure can be beaten by 20% using feedback-controlled hysteresis, which effectively embeds a Maxwell demon in the control loop. The demon “remembers” which well the system previously occupied. The apparent violation comes from information stored in the feedback mechanism: the preservation of past state within the controller subsidizes the erasure. The information isn't destroyed for free. It's preserved in one place to pay for destruction in another. The ledger still balances. But the lesson is sharp: where you store the information about the past changes the apparent cost of the present.
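One way to do the bookkeeping is via the Sagawa-Ueda generalized second law, W ≥ kT ln 2 − kT·I, where I is the information (in nats) the controller holds about the system. This framing is mine, not necessarily the paper's; under it, beating Landauer by 20% requires the demon to carry at least 0.2 bits about the past state:

```python
import math

# Back-of-envelope ledger via the generalized bound W >= kT*ln(2) - kT*I
# (my framing, not the paper's specific protocol): any work saved below
# the Landauer cost must be paid for by information held in the controller.
kB, T = 1.380649e-23, 300.0           # Boltzmann constant (J/K), temperature (K)
landauer = kB * T * math.log(2)       # minimum erasure work at 300 K
savings = 0.20 * landauer             # the apparent 20% "violation"
info_needed_bits = savings / (kB * T * math.log(2))
print(f"Landauer bound at 300 K: {landauer:.3e} J")
print(f"Controller must hold >= {info_needed_bits:.2f} bits about the past state")
```

The arithmetic is trivial by construction (a 20% saving costs 0.2 bits), but that is exactly the point: the saving is denominated in the same currency as the stored information.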
The paper on observation and semantics (2026, arXiv:2602.18494) proves that for any physically bounded agent — finite memory, compute, and energy — Landauer's principle imposes a strict limit on the complexity of internal state transitions. Discrete, compositional logic is the only topological solution that allows a finite agent to model a combinatorial world within a finite energy budget. Continuous representations require infinite precision to distinguish states, and infinite precision costs infinite energy. Logic isn't a cultural artifact. It's the unique solution to a constrained optimization problem. The constraint is the cost of preservation: keeping distinct representations distinct against thermal noise.
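A cartoon version of the counting argument, my own toy with an assumed quadratic noise-suppression cost rather than the paper's proof: on a fixed energy budget, compositional discrete states multiply, while the distinguishable levels of a single continuous variable only grow polynomially as its thermal noise is squeezed.

```python
import math

# Toy comparison (illustrative assumption, not the paper's derivation):
# how many states can a finite agent keep reliably distinct on a budget?
# Discrete/compositional: n two-state subsystems at ~kT*ln2 each -> 2**n states.
# Continuous: one variable on a fixed range with thermal noise sigma; L levels
# need sigma ~ 1/L, and if suppressing noise costs ~ sigma**-2 budget units,
# then L levels cost ~ L**2 units.
def discrete_states(budget_units):      # budget in units of kT*ln2
    return 2 ** int(budget_units)

def continuous_states(budget_units):    # same budget, quadratic noise cost
    return int(math.sqrt(budget_units))

for b in [4, 16, 64]:
    print(b, discrete_states(b), continuous_states(b))
```

Under these assumptions the gap is exponential versus square-root in the budget, which is the shape of the claim: only discrete composition lets a finite agent model a combinatorial world.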
Four results, one principle: preservation is expensive, and the expense shapes the system. Brandes: optimal maintenance spending is capped at 50% of the budget, narrowing to 30-50% for realistic systems, regardless of substrate. Lam: energy cost rises as the fifth power of precision, so precision is allocated strategically. Dago-Bellon: storing information in the controller subsidizes operations elsewhere. The semantics paper: the energy cost of distinguishing states forces logic to be discrete. The conventional framing treats preservation as overhead — the cost you pay to keep doing the interesting work. These results invert that: the preservation constraint is what determines the structure. E. coli's proofreading fidelity, the brain's precision landscape, TCP's protocol overhead, the discreteness of human logic — all are shaped by the thermodynamics of keeping things rather than by the dynamics of creating them. What you can afford to remember determines what you can afford to think.