
The Orthogonal Shortcut

Two-step statistical inference is convenient but suspect. First, estimate a nuisance parameter (the propensity score, the first-stage regression, the density estimate). Then, plug it in and estimate the parameter of interest. The concern: the first step's estimation error propagates into the second step, contaminating the final inference. For Bayesians the bind is tighter: in principle, uncertainty from both steps should be propagated jointly, but the joint computation is usually intractable.

Sabbagh and Stephens (arXiv:2602.20371) show that when the nuisance and target parameters satisfy Neyman orthogonality, the propagation vanishes to first order. Plugging in the first-step estimate without accounting for its uncertainty produces an asymptotically valid posterior for the target parameter. The shortcut that looks like an approximation is, in the limit, exact.

The mechanism: Neyman orthogonality means the score for the target parameter is locally insensitive to perturbations in the nuisance parameter. The derivative of the target's score with respect to the nuisance is zero at the true value. Small errors in the nuisance estimate produce second-order effects on the target's inference — negligible asymptotically.
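The second-order claim can be checked by hand. Below is a small sketch (my own construction, not an example from the paper): with a binary confounder, all expectations are exact finite sums, so we can shift both nuisances — the propensity and the outcome means — by epsilon in a fixed direction and compare a non-orthogonal score (inverse propensity weighting) against an orthogonal one (the AIPW score). The IPW bias shrinks linearly in epsilon; the AIPW bias shrinks quadratically.

```python
# Exact bias of IPW vs. AIPW estimands under epsilon-perturbed nuisances.
# Toy setup (my assumption): binary confounder X, known true propensity e(x)
# and true outcome means m1(x), m0(x), so population expectations are sums.

p_x = {0: 0.5, 1: 0.5}          # P(X = x)
e   = {0: 0.3, 1: 0.7}          # true propensity P(D=1 | X=x)
m1  = {0: 1.0, 1: 2.0}          # true E[Y | D=1, X=x]
m0  = {0: 0.0, 1: 0.5}          # true E[Y | D=0, X=x]
tau = sum(p_x[x] * (m1[x] - m0[x]) for x in p_x)   # true ATE

def biases(eps):
    """Exact population bias of the IPW and AIPW estimands when every
    nuisance is off by eps times a fixed direction (h_e = 0.5, h_m = 1)."""
    b_ipw, b_aipw = 0.0, 0.0
    for x in p_x:
        ee = e[x] + eps * 0.5       # perturbed propensity
        g1 = m1[x] + eps * 1.0      # perturbed outcome means
        g0 = m0[x] + eps * 1.0
        # IPW estimand at x: E[D*Y/ee - (1-D)*Y/(1-ee) | X=x]
        ipw = e[x] * m1[x] / ee - (1 - e[x]) * m0[x] / (1 - ee)
        # AIPW estimand at x: E[g1-g0 + D(Y-g1)/ee - (1-D)(Y-g0)/(1-ee) | X=x]
        aipw = (g1 - g0
                + e[x] * (m1[x] - g1) / ee
                - (1 - e[x]) * (m0[x] - g0) / (1 - ee))
        b_ipw  += p_x[x] * ipw
        b_aipw += p_x[x] * aipw
    return b_ipw - tau, b_aipw - tau

for eps in (0.1, 0.05, 0.025):
    bi, ba = biases(eps)
    print(f"eps={eps:.3f}  IPW bias={bi:+.5f}  AIPW bias={ba:+.5f}")
```

Halving epsilon roughly halves the IPW bias but cuts the AIPW bias by a factor of four — the signature of a vanished first-order term.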

The framework uses the Dirichlet process and the Bayesian bootstrap for the nuisance, achieving semiparametric flexibility. In causal applications, plugging in an estimated propensity score has negligible effect on the posterior for the causal contrast. The result is practical: complex two-step Bayesian procedures can be replaced by simple plug-in methods whenever orthogonality holds.
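The plug-in workflow can be sketched in a few lines. Everything below is my own toy setup, not the paper's implementation: nuisances are fit once by stratum means and then frozen, and the Bayesian bootstrap (Dirichlet(1, ..., 1) weights over observations) resamples only the target average of the orthogonal AIPW score.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2000
X = rng.integers(0, 2, n)                    # binary confounder
D = rng.binomial(1, np.where(X == 1, 0.7, 0.3))
Y = 1.0 * D + 0.5 * X + rng.normal(0, 1, n)  # true ATE = 1.0

# Step 1: plug-in nuisance estimates, fit once and then frozen.
e_hat  = np.array([D[X == x].mean() for x in (0, 1)])[X]
m1_hat = np.array([Y[(X == x) & (D == 1)].mean() for x in (0, 1)])[X]
m0_hat = np.array([Y[(X == x) & (D == 0)].mean() for x in (0, 1)])[X]

# Orthogonal (AIPW) score with nuisances held at their plug-in values.
score = (m1_hat - m0_hat
         + D * (Y - m1_hat) / e_hat
         - (1 - D) * (Y - m0_hat) / (1 - e_hat))

# Step 2: Bayesian bootstrap only the target average.
w = rng.dirichlet(np.ones(n), size=1000)     # Dirichlet(1,...,1) weights
tau_draws = w @ score                        # posterior draws of the ATE

lo, hi = np.quantile(tau_draws, [0.025, 0.975])
print(f"posterior mean {tau_draws.mean():.3f}, 95% interval ({lo:.3f}, {hi:.3f})")
```

No per-draw refit of the propensity or the outcome model, no joint posterior over nuisance and target — orthogonality is what licenses treating the frozen nuisances as if they were known.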

The general observation: when two quantities are orthogonal — when perturbing one does not affect the other to first order — the computational coupling between them can be severed without cost. The shortcut works not because the problem is easy but because the problem's geometry makes the error invisible. Orthogonality is a license to be lazy.