Complexity theory has always assumed classical inputs and outputs. P, NP, BQP — every complexity class frames its problems as: given a classical input (a number, a string, a graph), compute a classical output (yes or no, a number, a certificate). The quantum revolution in complexity theory — BQP, QMA, MIP* — changed what happens between input and output. The processing became quantum. The inputs and outputs remained classical.
Henry Yuen's program asks: what happens when the inputs and outputs are quantum states?
This is not a minor generalization. A classical input can be copied, inspected, compared with other inputs, and fed to multiple algorithms simultaneously. A quantum input cannot be copied — the no-cloning theorem forbids it. It cannot be inspected without disturbing it — measurement collapses the state. It cannot be compared directly with another quantum state — no procedure, classical or quantum, can take single copies of two unknown quantum states and decide with certainty whether they are identical. The entire conceptual apparatus of complexity theory — reduction, simulation, verification — assumes operations on inputs that quantum states do not support.
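The comparison obstacle is worth seeing concretely. The closest thing quantum mechanics offers is the standard swap test, which estimates how close two states are but only statistically, never with certainty from a single run. A minimal numpy sketch of the single-qubit case (an illustration of the textbook circuit, not anything from Yuen's program):

```python
import numpy as np

# Single-qubit swap test: ancilla qubit (leftmost) plus two one-qubit registers.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
SWAP = np.array([[1, 0, 0, 0],
                 [0, 0, 1, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1]], dtype=float)
# Controlled-SWAP: do nothing when the ancilla is |0>, swap registers when |1>.
CSWAP = np.block([[np.eye(4), np.zeros((4, 4))],
                  [np.zeros((4, 4)), SWAP]])

def swap_test_p0(psi, phi):
    """Return P(ancilla reads 0), which equals (1 + |<psi|phi>|^2) / 2.
    Identical states give 1; orthogonal states give 1/2 -- so a single
    run can never certify that two unknown states are the same."""
    state = np.kron([1.0, 0.0], np.kron(psi, phi))  # |0>|psi>|phi>
    state = np.kron(H, np.eye(4)) @ state           # Hadamard on ancilla
    state = CSWAP @ state                           # swap iff ancilla = |1>
    state = np.kron(H, np.eye(4)) @ state           # interfere the branches
    return float(np.sum(np.abs(state[:4]) ** 2))    # ancilla = 0 outcomes
```

Repeating the test estimates the overlap to any desired precision, but "these two states are identical" can never be confirmed outright — which is exactly the obstruction the paragraph describes.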
The key discovery so far is what Yuen calls the Uhlmann Hub: multiple problems that appear unrelated in their classical formulations turn out to be computationally equivalent when stated as quantum-input/quantum-output problems. Decoding Hawking radiation from a black hole's entangled pairs. Performing Uhlmann transformations on entangled quantum states. Breaking quantum bit commitment schemes. Compressing quantum information. In their classical framings, these problems live in different fields — black hole physics, quantum information theory, cryptography, communication complexity. In the quantum-input framework, they are the same problem.
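For orientation, the "Uhlmann transformation" language comes from Uhlmann's theorem, a standard fact of quantum information theory (the formulation below is the textbook statement, supplied here for context rather than taken from the source):

```latex
% Uhlmann's theorem: fidelity is achieved by a unitary on the purifying register.
% |\psi_\rho\rangle, |\psi_\sigma\rangle are any fixed purifications of
% \rho, \sigma on the joint system A \otimes R.
F(\rho,\sigma)
  \;=\; \operatorname{tr}\sqrt{\sqrt{\rho}\,\sigma\,\sqrt{\rho}}
  \;=\; \max_{U \text{ on } R}
        \bigl|\langle \psi_\rho \,|\, (I_A \otimes U) \,|\, \psi_\sigma \rangle\bigr| .
```

The theorem guarantees the maximizing unitary exists; the computational task — actually implementing that $U$ efficiently — is the transformation the hub problems all turn out to encode.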
This is the pattern from Chowla's cosine problem, scaled up. Chowla's problem was about waves for sixty years until someone saw it was about graphs. The Uhlmann Hub problems were about black holes, cryptography, and communication until someone saw they were all about quantum state transformation. The intellectual content doesn't change. The language changes, and the language change reveals structure that was invisible before.
The central open question is the unitary synthesis problem: if you had access to an infinitely powerful classical oracle — a machine that could instantly solve any classical problem — could you use it to efficiently perform any quantum state transformation? If the answer is yes, then quantum-input problems reduce to classical problems (with unlimited power), and the classical complexity hierarchy captures everything. If the answer is no, then there exist quantum-input problems that are intrinsically quantum — no amount of classical computation, even infinite classical computation, can efficiently simulate them. The separation would be fundamental, not a matter of efficiency but of kind.
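This question is usually credited to Aaronson and Kuperberg as the unitary synthesis problem; one common phrasing (a paraphrase, not a quotation from the source) is:

```latex
% Unitary synthesis problem (paraphrased): can a classical oracle
% f : \{0,1\}^* \to \{0,1\}, queried in superposition, plus polynomial
% quantum time, implement an arbitrary n-qubit unitary?
\text{For every } U \in \mathrm{U}(2^{n})\text{, do there exist an oracle } f
\text{ and a } \mathrm{poly}(n)\text{-time quantum algorithm } \mathcal{A}
\text{ such that } \bigl\| \mathcal{A}^{f}\lvert\psi\rangle - U\lvert\psi\rangle \bigr\|
\le \varepsilon \text{ for all } \lvert\psi\rangle ?
```

Note the asymmetry: a counting argument shows there are far more $n$-qubit unitaries than polynomial-time algorithms, but the oracle $f$ is unbounded, so the question is whether classical information, however plentiful, can be converted into arbitrary quantum dynamics.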
Yuen's earlier work contributed to the MIP* = RE result — one of the most surprising theorems in modern complexity theory, connecting quantum entanglement to the limits of computability. That result showed that a classical verifier interacting with entangled provers can check answers to problems that are literally uncomputable, including the halting problem. The new program extends the same intuition: quantum states may not just be harder to process than classical ones. They may be harder in a way that classical frameworks cannot even describe.