In the LOCAL model of distributed computing, each node in a graph communicates with its neighbors over synchronous rounds, then produces an output based on what it has learned; the complexity of a problem is the number of rounds required. The model does not restrict the function a node uses to compute its output: it can be any function of the information received. The standard assumption: restricting to computable functions doesn't matter. Any problem solvable with arbitrary functions should be solvable with computable ones in the same number of rounds.
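The round structure can be made concrete with a small simulation. This is a sketch under my own conventions (the function and variable names are illustrative, not from any standard library): in each round every node forwards everything it knows to its neighbors, so after r rounds a node's knowledge is exactly its radius-r neighborhood.

```python
def local_rounds(adj, inputs, rounds):
    """Simulate the LOCAL model: adj maps node -> list of neighbors,
    inputs maps node -> initial local input. Returns each node's
    accumulated knowledge after the given number of synchronous rounds."""
    # Initially, a node knows only its own input.
    knowledge = {v: {v: inputs[v]} for v in adj}
    for _ in range(rounds):
        new = {}
        for v in adj:
            merged = dict(knowledge[v])
            # Receive each neighbor's entire knowledge (messages are unbounded).
            for u in adj[v]:
                merged.update(knowledge[u])
            new[v] = merged
        knowledge = new
    return knowledge

# On a path 0-1-2-3-4, after 2 rounds the middle node has seen
# everything within distance 2, i.e. the whole path; an endpoint has not.
path = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3]}
views = local_rounds(path, {v: v for v in path}, rounds=2)
```

The output function applied to `views[v]` is where the model places no restriction: it may be any function at all, computable or not.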
Brandt, Chang, Grunau, and Rozhon (arXiv:2602.21022) show this assumption is wrong. They construct a locally checkable labeling (LCL) problem that is solvable in O(log n) rounds with uncomputable functions but requires Ω(√n) rounds with computable functions: an exponential separation.
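For readers unfamiliar with LCLs: a labeling problem is locally checkable when a candidate output can be verified by inspecting constant-radius neighborhoods alone. A classic example is proper coloring. The sketch below is a generic illustration of that definition, not the specific problem constructed in the paper:

```python
def lcl_verify(adj, labels, palette=frozenset({0, 1, 2})):
    """Verify a proper coloring, a textbook LCL: each node checks only
    its own label and its immediate neighbors' labels (radius-1 views)."""
    for v, nbrs in adj.items():
        if labels[v] not in palette:
            return False                     # label outside the allowed alphabet
        if any(labels[u] == labels[v] for u in nbrs):
            return False                     # local constraint violated at v
    return True

# A triangle: one labeling passes every local check, the other fails.
triangle = {0: [1, 2], 1: [0, 2], 2: [0, 1]}
lcl_verify(triangle, {0: 0, 1: 1, 2: 2})   # valid: all checks pass
lcl_verify(triangle, {0: 0, 1: 0, 2: 1})   # invalid: nodes 0 and 1 clash
```

The point of the class is that correctness is a purely local property, which makes LCLs the natural benchmark for separating models of local computation.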
The resolution: the gap disappears if the nodes know an upper bound on the graph size n. With knowledge of n, the computable and uncomputable models have identical round complexity for every LCL problem. Without knowledge of n, computability matters, because computable functions cannot distinguish graphs of different sizes that present the same local neighborhoods. Uncomputable functions can: they encode information about the global graph that computable functions cannot access.
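The indistinguishability at the heart of this can be seen in the simplest possible setting, cycles. A quick sketch (my own illustration, not from the paper): the radius-r ball around any node of a cycle is a path on 2r+1 vertices whenever the cycle is long enough, so a 10-cycle and a 1000-cycle are locally identical to any bounded-radius observer.

```python
def local_ball(adj, v, r):
    """Return the set of vertices within distance r of v (plain BFS)."""
    seen, frontier = {v}, {v}
    for _ in range(r):
        frontier = {u for w in frontier for u in adj[w]} - seen
        seen |= frontier
    return seen

def cycle(n):
    """Adjacency lists of the n-cycle on vertices 0..n-1."""
    return {i: [(i - 1) % n, (i + 1) % n] for i in range(n)}

# In both a 10-cycle and a 1000-cycle, the radius-3 ball around any node
# is a path on 2*3 + 1 = 7 vertices: the two graphs are locally identical.
small = local_ball(cycle(10), 0, 3)
large = local_ball(cycle(1000), 0, 3)
```

An algorithm whose output depends only on such a ball must answer the same way in both graphs; the paper's construction exploits exactly this kind of blind spot.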
The connection is precise: computability and knowledge of n are equivalent resources for the LOCAL model. Neither was previously considered important. Both are. The two assumptions — “nodes use computable functions” and “nodes know n” — which appear to be about completely different things (computability theory vs. network information), turn out to be the same assumption in disguise.
The general observation: when two seemingly unrelated assumptions about a computational model are shown to be equivalent, the equivalence reveals a hidden resource. Here, the resource is the ability to distinguish local neighborhoods that look identical in finite views but differ in global structure. Both knowledge of n and uncomputability provide access to this resource through different doors.