Xia et al. (2602.22172) measure what a real 75 TW laser pulse looks like — not the clean Gaussian profile that simulations assume — and show that this matters enormously for laser wakefield acceleration.
Laser wakefield acceleration is elegant in theory: fire an intense laser into plasma, the laser's ponderomotive force pushes electrons aside, they snap back, and trailing electrons surf the resulting plasma wave to relativistic energies within centimeters. Simulations with Gaussian laser profiles predict about 500 picocoulombs of accelerated charge at 200 MeV. Experiments consistently measure only 200 picocoulombs at similar energies. The factor-of-2.5 gap has been a persistent puzzle.
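For reference, the standard textbook expressions (not results of this paper) for the force that drives the wake and for the usual measure of pulse strength are:

```latex
% Nonrelativistic ponderomotive force on an electron in a
% quasi-monochromatic field E(r,t) = E_0(r) cos(wt); it points
% down the intensity gradient, expelling electrons from the pulse:
F_p = -\frac{e^{2}}{4 m_e \omega^{2}} \, \nabla E_0^{2}
% Pulse strength is usually quoted via the normalized vector
% potential; a_0 of order 1 or above marks the relativistic regime:
a_0 = \frac{e E_0}{m_e \omega c}
```

Because the force follows the intensity gradient, any hotspot or asymmetry in the transverse profile is imprinted directly on the wake it drives.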
The answer: real lasers aren't Gaussian. They have intensity hotspots, phase aberrations, and asymmetric transverse profiles. When a non-ideal pulse self-focuses in plasma, it reaches lower peak intensity than a Gaussian of the same total power. The plasma wake it drives has a wider, more complicated sheath structure. These imperfections hinder electron injection — fewer electrons catch the wave.
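A toy numpy sketch makes the fixed-power, lower-peak-intensity point concrete. The paper's pulses self-focus nonlinearly in plasma, which this does not model; here an ideal lens focus (a Fourier transform of the near field) stands in, and the grid size, beam size, and astigmatic aberration are arbitrary illustrative choices:

```python
# Toy illustration (not the paper's method): at fixed total power, a beam
# with phase aberrations focuses to a lower peak intensity than a clean
# Gaussian. An ideal lens maps the near field to its Fourier transform,
# so we compare peak values of |FFT|^2.
import numpy as np

n = 512
x = np.linspace(-4, 4, n)          # transverse coordinate in beam waists
X, Y = np.meshgrid(x, x)
gaussian = np.exp(-(X**2 + Y**2))  # clean Gaussian amplitude

# Add an astigmatic phase aberration (hypothetical, for illustration)
aberrated = gaussian * np.exp(1j * 1.5 * (X**2 - Y**2))

def focal_peak(field):
    """Peak focal-plane intensity, normalized to unit total power."""
    field = field / np.sqrt(np.sum(np.abs(field)**2))  # fix total power
    focus = np.fft.fftshift(np.fft.fft2(field))        # lens = Fourier transform
    return np.max(np.abs(focus)**2)

strehl = focal_peak(aberrated) / focal_peak(gaussian)
print(f"Strehl ratio of the aberrated beam: {strehl:.2f}")  # below 1
```

The ratio printed at the end is the Strehl ratio: the aberrated beam, despite carrying the same power, never reaches the Gaussian's peak intensity.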
As the pulse propagates through the plasma, its intensity profile becomes elliptical. The plasma wake develops sharp sheath features near the major-axis azimuths, which eventually trigger injection, but later and with less charge than in the idealized case.
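To put a number on "becomes elliptical", a generic second-moment diagnostic (a common beam metric, not code from the paper) gives the major-to-minor axis ratio of the rms intensity ellipse:

```python
# Generic beam-ellipticity diagnostic: second moments of the intensity
# profile, then the ratio of principal widths from the moment matrix.
import numpy as np

def ellipticity(intensity, x, y):
    """Major/minor axis ratio of an intensity profile's rms ellipse."""
    X, Y = np.meshgrid(x, y)
    w = intensity / intensity.sum()            # normalized weights
    xc, yc = (w * X).sum(), (w * Y).sum()      # centroid
    sxx = (w * (X - xc)**2).sum()
    syy = (w * (Y - yc)**2).sum()
    sxy = (w * (X - xc) * (Y - yc)).sum()
    evals = np.linalg.eigvalsh([[sxx, sxy], [sxy, syy]])  # ascending
    return np.sqrt(evals[1] / evals[0])        # 1 means round, >1 elliptical
```

Applied to simulated fluence maps at successive propagation distances, a diagnostic like this tracks how far the pulse has drifted from round.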
The paper closes the theory-experiment gap by running particle-in-cell simulations with the measured laser profile instead of the assumed one. Both charge and energy then match the experimental data.
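Schematically, the fix amounts to swapping the analytic profile for the measured one at initialization. The synthetic data and the `sim.add_laser` hook below are hypothetical stand-ins, since the paper's actual setup code isn't reproduced here; the point is only that the complex envelope is built from measured intensity and phase maps rather than a formula:

```python
# Sketch: initialize a PIC laser from measured maps instead of a Gaussian.
import numpy as np

# Stand-ins for measured data (hypothetical; in practice these come from
# a focal-spot camera and a wavefront sensor)
rng = np.random.default_rng(0)
n = 256
x = np.linspace(-3, 3, n)
X, Y = np.meshgrid(x, x)
intensity = np.exp(-(X**2 + Y**2)) * (1 + 0.2 * rng.standard_normal((n, n)))**2
phase = 0.5 * (X**2 - Y**2)        # placeholder astigmatic aberration

# Complex transverse envelope: amplitude from the measured intensity,
# phase from the wavefront measurement
envelope = np.sqrt(intensity) * np.exp(1j * phase)

# Normalize to unit total power; the PIC code rescales to the shot's
# 75 TW when the pulse is injected
envelope /= np.sqrt(np.sum(np.abs(envelope)**2))

# A user-defined laser-profile hook, which several PIC codes offer, would
# then take `envelope` in place of the analytic Gaussian:
# sim.add_laser(profile=envelope, power=75e12, wavelength=0.8e-6)  # hypothetical API
```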
The broader lesson is familiar but worth repeating: the discrepancy wasn't in the physics of the acceleration mechanism. It was in the initial conditions. The theory was correct given its inputs; the inputs were wrong. How many other disagreements between simulation and experiment are similarly upstream — not in the dynamical equations but in the assumed profiles, distributions, and boundary conditions that feed them?