
The Precision Plateau

When synaptic connections are imprecise (not broken, just slightly miscalibrated) something surprising happens to neural readout. The optimal linear decoder, the one that extracts every available bit of signal, hits a performance ceiling that adding neurons cannot raise. Not sublinearly. Not logarithmically. It saturates entirely.

Hendler, Segev, and Shamir (2602.21758) demonstrate three regimes of neural decoding under synaptic coarse-tuning. Weak imprecision: signal-to-noise scales linearly with population (the standard result). Moderate imprecision: sublinear scaling (diminishing returns). Strong imprecision — the biologically realistic regime — complete saturation. Adding more neurons gains nothing.
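
The saturation is easy to feel in a toy model. The sketch below is mine, not the paper's: it stands in for strong imprecision with noise that has a component along the tuning vector f (one crude way to make noise track signal structure, as the paper's mechanism requires), and the tuning distribution, noise variance, and the eps knob are all invented for illustration. With eps = 0 the optimal decoder's SNR grows linearly with N; with eps > 0 it pins near 1/eps no matter how many neurons you add. The intermediate sublinear regime needs a richer model than this.

```python
import numpy as np

rng = np.random.default_rng(0)

def snr(w, f, Sigma):
    # Signal-to-noise of the linear readout s_hat = w @ r
    # for responses r = f * s + noise, noise ~ N(0, Sigma).
    return (w @ f) ** 2 / (w @ Sigma @ w)

sigma2 = 1.0                     # private per-neuron noise variance (invented)
for eps in (0.0, 0.05):          # eps = 0: precise wiring; eps > 0: signal-correlated corruption
    print(f"eps = {eps}")
    for N in (10, 100, 1000, 10000):
        f = rng.normal(1.0, 0.2, size=N)                   # heterogeneous tuning (invented)
        Sigma = sigma2 * np.eye(N) + eps * np.outer(f, f)  # noise gains a component along f
        w_opt = np.linalg.solve(Sigma, f)                  # optimal linear decoder: Sigma^-1 f
        print(f"  N = {N:6d}   SNR = {snr(w_opt, f, Sigma):9.2f}")
```

With eps = 0 the printed SNR climbs roughly in proportion to N; with eps = 0.05 it stalls just under 20, which is 1/eps.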

The striking finding isn't the ceiling itself but what happens at it: the optimal decoder performs no better than a naive population average. All the computational sophistication of optimized linear weighting, tuning each synapse to exploit noise correlations and heterogeneous tuning curves, becomes equivalent to simply averaging. Under biologically realistic noise, the smart strategy and the dumb strategy converge.
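
Under the same toy assumptions (again mine: the noise covariance sigma2 * I + eps * f f^T, the tuning draw), the convergence is visible directly; both decoders land on the same ceiling:

```python
import numpy as np

rng = np.random.default_rng(1)

def snr(w, f, Sigma):
    # Same readout SNR as in the previous sketch.
    return (w @ f) ** 2 / (w @ Sigma @ w)

eps, sigma2 = 0.05, 1.0          # same invented knobs as above
for N in (10, 100, 1000, 10000):
    f = rng.normal(1.0, 0.2, size=N)
    Sigma = sigma2 * np.eye(N) + eps * np.outer(f, f)
    w_opt = np.linalg.solve(Sigma, f)   # weights every neuron by its tuning and the noise
    w_avg = np.ones(N) / N              # treats every neuron as interchangeable
    print(f"N = {N:6d}   optimal = {snr(w_opt, f, Sigma):6.2f}   average = {snr(w_avg, f, Sigma):6.2f}")
```

Both columns approach 1/eps. In this toy they are close even at small N, because near-uniform tuning makes the optimal weights nearly flat; the paper's claim is the stronger one, that the equivalence holds even when the optimal decoder has real structure to exploit.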

The mechanism is specific. Synaptic imprecision introduces noise that correlates with the signal structure itself. The optimal decoder tries to exploit fine-grained signal differences between neurons. But when those differences are corrupted at the synapse, the decoder's corrections amplify noise as much as they extract signal. The invariant manifold along which effective readout operates turns out to be aligned with the naive decoder — the one that treats all neurons as interchangeable.
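
The alignment can be made exact in the toy model above (same caveat: the covariance $\Sigma = \sigma^2 I + \varepsilon f f^\top$ is my stand-in for signal-correlated imprecision, not the paper's mechanism). By the Sherman-Morrison identity,

$$ \Sigma^{-1} f \;=\; \frac{1}{\sigma^2}\Big(f - \frac{\varepsilon\,\lVert f\rVert^2}{\sigma^2 + \varepsilon\,\lVert f\rVert^2}\, f\Big) \;=\; \frac{f}{\sigma^2 + \varepsilon\,\lVert f\rVert^2}, $$

so the optimal weights point along $f$ itself, which for near-uniform tuning is just the population average. And the best achievable signal-to-noise,

$$ f^\top \Sigma^{-1} f \;=\; \frac{\lVert f\rVert^2}{\sigma^2 + \varepsilon\,\lVert f\rVert^2} \;\le\; \frac{1}{\varepsilon}, $$

caps at $1/\varepsilon$ however large $\lVert f\rVert^2$ (and hence N) grows: the corrupting noise lives in exactly the direction the signal does, so no reweighting can escape it.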

This has an elegant implication for evolution. If optimal and naive decoders are indistinguishable under realistic synaptic volatility, then neural circuits don't need precise wiring to achieve good computation. The system can tolerate massive synaptic remodeling — the kind that happens during sleep, learning, development — without degrading readout performance. The ceiling is also a floor: robust performance doesn't require precision.

The pattern inverts the usual story about biological computation. We typically ask how the brain achieves optimal coding despite noise. This paper suggests the interesting question is the opposite: why does the brain maintain the wiring precision it has, when less precision would work just as well? The excess precision of the weakly imprecise regimes may serve other functions (plasticity, learning, redundancy) that matter more than instantaneous readout accuracy.

The generalization: in any system where the channel between components is sufficiently noisy, sophisticated decoding converges to simple averaging. The intelligence is in having many channels, not in reading any single one precisely. This is true of neural populations, and it may be true of any sufficiently noisy distributed system — markets, ecosystems, networks of unreliable agents.

Hendler, O., Segev, R., & Shamir, M. (2026). Limits of optimal decoding under synaptic coarse-tuning. arXiv:2602.21758.