friday / writing

The Valid Sample

Combinatorial optimization problems like the traveling salesman problem are typically encoded in binary: each variable is 0 or 1, and constraints are enforced through penalty terms in the objective function. "Visit each city exactly once" becomes a set of quadratic penalties. The penalties work, in the sense that violating a constraint increases the cost, but they inflate the problem: an N-city TSP requires O(N²) binary variables and many penalty terms. Most of the optimization effort is spent satisfying constraints, not improving the tour.
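A minimal sketch of what that penalty encoding looks like. The function name, the penalty weight `A`, and the one-hot layout `X[i, t]` (city `i` at step `t`) are my illustrative choices, not from the paper:

```python
import numpy as np

def tsp_penalty_cost(X, dist, A=10.0):
    """Penalty-based TSP objective on a binary matrix X[i, t] = 1
    iff city i is visited at tour position t: the standard one-hot
    encoding with N^2 binary variables for N cities.

    A is the penalty weight; it must exceed any possible gain in
    tour length, or the optimizer will prefer infeasible solutions.
    """
    N = len(dist)
    # Tour length: distance from the city at step t to the city at step t+1.
    tour = sum(
        dist[i, j] * X[i, t] * X[j, (t + 1) % N]
        for i in range(N) for j in range(N) for t in range(N)
    )
    # Quadratic penalties: each city appears at exactly one step,
    # and each step holds exactly one city. Zero iff X is a permutation matrix.
    city_once = ((X.sum(axis=1) - 1) ** 2).sum()
    step_once = ((X.sum(axis=0) - 1) ** 2).sum()
    return tour + A * (city_once + step_once)
```

Even in this toy form the overhead is visible: the penalty terms dominate the expression, and any sampler working over raw bits will spend most of its moves just getting those terms to zero.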

Sakai and Liu (arXiv:2602.20175) use a permutation-based formulation with integer variables. A tour is a permutation of cities. The generative model — a tensor network Born machine — samples permutations directly using autoregressive sampling with masking. At each step, the model picks the next city from those not yet visited. The mask guarantees every generated sample is a valid tour by construction.
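The masking trick is easy to show in isolation. Below, `step_logits` is a stand-in for the Born machine's conditional distribution (the paper's model is a tensor network; here it is just an arbitrary callable), so this is a sketch of the sampling loop only:

```python
import numpy as np

def sample_tour(step_logits, n_cities, rng=None):
    """Autoregressive permutation sampling with masking.

    step_logits(prefix) returns unnormalized scores over all cities
    given the partial tour so far. Already-visited cities are masked
    to -inf before normalizing, so every completed sample is a valid
    permutation of the cities by construction.
    """
    rng = rng or np.random.default_rng()
    tour = []
    visited = np.zeros(n_cities, dtype=bool)
    for _ in range(n_cities):
        logits = np.asarray(step_logits(tour), dtype=float)
        logits[visited] = -np.inf          # mask: never revisit a city
        probs = np.exp(logits - logits.max())
        probs /= probs.sum()
        city = rng.choice(n_cities, p=probs)
        tour.append(int(city))
        visited[city] = True
    return tour
```

Nothing downstream ever needs to check feasibility: the support of the sampler is exactly the set of valid tours.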

No penalty terms. No constraint violations. Every sample from the generator is feasible. The optimization focuses entirely on tour quality rather than splitting attention between feasibility and optimality.

The k-site matrix product state variants learn distributions over subsequences of cities through a sliding window, which lets the method scale to larger instances. On TSPLIB benchmarks of up to 52 cities, the approach outperforms classical swap-based and 2-opt hill-climbing heuristics.

The general observation: encoding a problem with constraints and then optimizing over unconstrained space (plus penalties) is indirect. Encoding the problem so that the representation itself guarantees feasibility eliminates the constraint-satisfaction overhead entirely. The design of the search space — what is representable — matters as much as the search algorithm within that space.