Collective behavior is usually explained by interaction. Fish school because they sense their neighbors' velocities and adjust. Neurons synchronize because they're synaptically coupled. Spins align because exchange interactions favor parallel orientations. The theoretical apparatus — from Ising models to agent-based simulations — centers on pairwise or higher-order couplings between components. Remove the interactions, and the collective behavior should vanish.
Abril-Bermudez, Fisher, Gramain, and Perez-Reche (arXiv 2602.15256, February 2026) show this isn't necessarily true. Using a minimal three-variable stochastic model with zero direct interactions, they demonstrate that genuine higher-order collective structure — not just pairwise correlations, but multi-body statistical dependencies that can only be seen by observing the whole system — can emerge purely from a shared fluctuating environment.
The measure they use is O-information, an information-theoretic quantity that distinguishes redundancy (positive O-information: variables carry overlapping copies of the same information) from synergy (negative O-information: the system carries collective information absent from any individual part). The distinction matters. Redundancy means knowing one variable tells you about the others. Synergy means you must observe the whole to access what the system knows — it's the statistical signature of genuine collective organization.
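The definition is easy to make concrete. For n variables, O-information is Ω = (n−2)·H(X) + Σᵢ [H(Xᵢ) − H(X₋ᵢ)], where H is Shannon entropy and X₋ᵢ is the system with variable i removed. A minimal sketch (the toy distributions are standard illustrations, not the paper's model): three perfectly copied bits score Ω = +1 bit (pure redundancy), while an XOR triplet — pairwise independent, jointly determined — scores Ω = −1 bit (pure synergy).

```python
import numpy as np
from itertools import product

def entropy(p):
    """Shannon entropy in bits of a probability array (zeros allowed)."""
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def o_information(joint):
    """O-information of an n-variable discrete joint distribution.

    joint[i1, ..., in] = P(X1=i1, ..., Xn=in).
    Omega = (n-2) H(X) + sum_i [H(X_i) - H(X_without_i)].
    Positive => redundancy-dominated; negative => synergy-dominated.
    """
    n = joint.ndim
    omega = (n - 2) * entropy(joint.ravel())
    for i in range(n):
        others = tuple(j for j in range(n) if j != i)
        omega += entropy(joint.sum(axis=others))          # H(X_i)
        omega -= entropy(joint.sum(axis=i).ravel())       # H(X_without_i)
    return omega

# Redundancy: three identical fair bits (perfect copies).
copy = np.zeros((2, 2, 2))
copy[0, 0, 0] = copy[1, 1, 1] = 0.5

# Synergy: Z = X XOR Y with X, Y independent fair bits.
xor = np.zeros((2, 2, 2))
for x, y in product([0, 1], repeat=2):
    xor[x, y, x ^ y] = 0.25

print(o_information(copy))  # 1.0  (+1 bit: redundancy)
print(o_information(xor))   # -1.0 (-1 bit: synergy)
```

The XOR case makes the "observe the whole" point literal: every pair of variables is statistically independent, so no pairwise analysis detects any structure at all, yet the triplet jointly carries one full bit.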
Their central result is a no-go theorem. If components couple to a shared environment with constant coupling strengths — a static shared input — then only redundancy is possible. The environment broadcasts the same signal to everyone, and the induced correlations are structurally constrained to the redundant region. It's like everyone reading the same newspaper: they'll have correlated opinions, but the correlations are copies, not collective computation.
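The static case can be sketched with a Gaussian toy model (the coupling values here are illustrative, not taken from the paper): each variable is xᵢ = aᵢ·e + ξᵢ, with one shared signal e ~ N(0,1), constant couplings aᵢ, independent noise ξᵢ, and no interactions between the xᵢ. The covariance is then aaᵀ + σ²I, and the Gaussian O-information comes out positive, consistent with the redundancy-only constraint.

```python
import numpy as np

def gauss_entropy(cov):
    """Differential entropy (nats) of a zero-mean Gaussian with covariance cov."""
    cov = np.atleast_2d(np.asarray(cov, dtype=float))
    n = cov.shape[0]
    return 0.5 * np.log((2 * np.pi * np.e) ** n * np.linalg.det(cov))

def gauss_o_information(cov):
    """O-information of a Gaussian: (n-2) H(X) + sum_i [H(X_i) - H(X_without_i)]."""
    n = cov.shape[0]
    omega = (n - 2) * gauss_entropy(cov)
    for i in range(n):
        rest = [j for j in range(n) if j != i]
        omega += gauss_entropy(cov[i, i]) - gauss_entropy(cov[np.ix_(rest, rest)])
    return omega

# Static shared input: x_i = a_i * e + noise_i, zero direct interactions.
a = np.array([0.9, 0.7, 0.5])           # constant coupling strengths (illustrative)
cov = np.outer(a, a) + 0.25 * np.eye(3)  # rank-1 common cause + independent noise
print(gauss_o_information(cov) > 0)      # True: redundancy, never synergy
```

Varying the aᵢ or the noise variance changes how much redundancy appears, but the sign stays positive: a rank-one common cause cannot manufacture synergy.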
The escape requires time-dependent coupling. When different variables couple to the environment with different temporal profiles — one responds quickly, another slowly, a third with a delayed peak — the induced correlations can reach the synergistic region. The asymmetry in temporal response creates correlation structures that couldn't arise from a simple common cause. The distinction is between weather that falls on everyone equally (redundancy) and weather that different agents experience at different intensities and times (synergy).
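How temporal asymmetry escapes the no-go result can be seen in a hedged sketch (my own construction in the spirit of the paper's mechanism, not their actual model): a white-noise environment e(t), with one variable reading e(t), a second reading the lagged value e(t−1), and a third integrating both lags. The stationary covariance is the Gram matrix of the three response kernels plus independent noise, and its O-information is negative — the third variable relates to the first two the way a sum relates to its parts, a synergistic structure no static common cause could produce.

```python
import numpy as np

def gauss_entropy(cov):
    """Differential entropy (nats) of a zero-mean Gaussian with covariance cov."""
    cov = np.atleast_2d(np.asarray(cov, dtype=float))
    n = cov.shape[0]
    return 0.5 * np.log((2 * np.pi * np.e) ** n * np.linalg.det(cov))

def gauss_o_information(cov):
    """O-information of a Gaussian: (n-2) H(X) + sum_i [H(X_i) - H(X_without_i)]."""
    n = cov.shape[0]
    omega = (n - 2) * gauss_entropy(cov)
    for i in range(n):
        rest = [j for j in range(n) if j != i]
        omega += gauss_entropy(cov[i, i]) - gauss_entropy(cov[np.ix_(rest, rest)])
    return omega

# Each row is one variable's temporal response kernel over two lags of a
# shared white-noise environment e(t): x1 tracks e(t), x2 tracks e(t-1),
# x3 integrates both lags. No variable interacts with another directly.
K = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
cov = K @ K.T + 0.1 * np.eye(3)        # Gram matrix of kernels + independent noise
print(gauss_o_information(cov) < 0)    # True: the synergistic region is reached
```

Collapsing the kernels to a single shared profile (all rows proportional) makes the Gram matrix rank one and recovers the static case, with positive O-information; the sign flip here is carried entirely by the differing temporal responses.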
The implication is a fundamental ambiguity in inference. If you observe a system exhibiting higher-order collective organization — synergistic information, correlated fluctuations beyond pairwise — you cannot determine from the statistics alone whether it arises from direct interactions or shared environmental forcing. The same collective signatures have two completely different mechanistic explanations, and no amount of correlation analysis will distinguish them without additional information about the system's structure.
This matters for ecology, where the Moran effect — spatially separated populations synchronizing through shared climate — is well documented, but its implications for higher-order dependencies remain unexplored. It matters for neuroscience, where common synaptic input from upstream populations can mimic the statistical signatures of local circuit interactions. And it matters for any field that tries to infer interaction networks from observed correlations.
The paper also shows that when direct interactions and environmental coupling coexist, the effects are non-additive. Positive triplet interactions, which on their own would be expected to reinforce redundancy, can instead reduce it when combined with environmental forcing. The interplay between the two mechanisms is not simple summation — it's a nonlinear combination that can push the system in unexpected directions.
Collective behavior doesn't require collectivity in the mechanism. A shared, fluctuating world is enough — provided the fluctuations have the right temporal structure.