Entropy appears as a driving force in evolution equations across physics — the heat equation, the Fokker-Planck equation, gradient flows in Wasserstein space, GENERIC systems in non-equilibrium thermodynamics. In each case, the system evolves in a direction that increases (or decreases, depending on sign convention) an entropy functional. But the entropy functional takes different forms in different contexts: Boltzmann entropy, Gibbs entropy, Shannon entropy, Rényi entropy, Wasserstein entropy. Why does entropy drive these equations, and why does it look different each time?
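To fix notation before turning to the answer, here is the textbook instance of the pattern (a standard identity, not something specific to the paper): the heat equation is the gradient flow, in the Wasserstein geometry, of the Boltzmann entropy, in the sign convention where the functional decreases.

```latex
% Heat equation as the Wasserstein gradient flow of Boltzmann entropy
% (standard identity; the notation is ours, not the paper's).
S(\rho) = \int \rho \log \rho \, dx,
\qquad
\partial_t \rho
  = \operatorname{div}\!\left( \rho \, \nabla \tfrac{\delta S}{\delta \rho} \right)
  = \operatorname{div}\!\left( \rho \, \nabla (\log \rho + 1) \right)
  = \Delta \rho .
```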
Mark Peletier (arXiv:2603.07653, March 2026) answers both questions with a single principle: entropy drives evolution equations because it characterizes the invariant measure of an underlying stochastic process.
Every deterministic evolution equation that is “driven by entropy” can be understood as the large-number or long-time limit of a stochastic process. The stochastic process has a stationary distribution: a probability measure that the dynamics preserve and toward which the process converges. Up to sign and normalization, the logarithm of this stationary distribution is the entropy functional that appears in the deterministic equation. The evolution “toward higher entropy” is the deterministic shadow of the stochastic process relaxing toward its invariant measure.
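In formulas, the claim reads as follows (a sketch in standard relative-entropy notation, which is ours rather than the paper's): if the process has a stationary density π, the driving functional is the relative entropy of the current density with respect to π, and it is monotone along the deterministic flow.

```latex
% Sketch of the structural claim, assuming a stationary density \pi exists
% (notation ours, not the paper's):
E(\rho) = \int \rho \log \frac{\rho}{\pi} \, dx
        = \int \rho \log \rho \, dx \;-\; \int \rho \log \pi \, dx,
\qquad
\frac{d}{dt}\, E(\rho_t) \;\le\; 0 .
```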
The different forms of entropy across different equations are explained by the different stochastic processes underneath. The heat equation corresponds to Brownian motion; its stationary measure is the flat (Lebesgue) measure, and the relative entropy with respect to it is the Boltzmann entropy. The Fokker-Planck equation corresponds to Langevin dynamics; its invariant measure is the Gibbs distribution, and the relative entropy with respect to it is the Gibbs free energy (up to constants and a factor of temperature). Gradient flows in Wasserstein space correspond to interacting particle systems; their invariant measures are more complex, and their entropies take correspondingly complex forms.
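The second correspondence is easy to see numerically. Below is a minimal sketch, assuming a Langevin dynamics with a quadratic potential; all parameters and helper names are illustrative choices, not taken from the paper. A cloud of independent Langevin particles relaxes to the Gibbs distribution, and a histogram estimate of the relative entropy with respect to that distribution decreases in time.

```python
# Minimal sketch: independent Langevin particles dX = -V'(X) dt + sqrt(2/beta) dW
# relax to the Gibbs density pi(x) ~ exp(-beta * V(x)); a histogram estimate of
# the relative entropy E(rho) = integral rho log(rho/pi) dx decreases toward ~0.
# The quadratic potential and all parameters are illustrative, not the paper's.
import numpy as np

rng = np.random.default_rng(0)
beta, dt, n_steps, n_particles = 1.0, 1e-3, 5000, 50_000
V_prime = lambda x: x                      # V(x) = x**2 / 2, so pi is Gaussian
x = rng.uniform(-4.0, 4.0, n_particles)    # start far from equilibrium

def relative_entropy(samples: np.ndarray, beta: float, bins: int = 200) -> float:
    """Histogram estimate of E(rho) = integral of rho * log(rho / pi) dx."""
    hist, edges = np.histogram(samples, bins=bins, range=(-6, 6), density=True)
    centers = 0.5 * (edges[:-1] + edges[1:])
    width = edges[1] - edges[0]
    pi = np.exp(-beta * centers**2 / 2)
    pi /= pi.sum() * width                 # normalize the Gibbs density
    integrand = np.where(hist > 0,
                         hist * np.log(np.maximum(hist, 1e-300) / pi), 0.0)
    return float(integrand.sum() * width)

for step in range(n_steps + 1):
    if step % 1000 == 0:
        print(f"t = {step * dt:4.1f}   E(rho_t) ~ {relative_entropy(x, beta):.4f}")
    # Euler-Maruyama step for the Langevin SDE
    x += -V_prime(x) * dt + np.sqrt(2 * dt / beta) * rng.standard_normal(n_particles)
```

The printed values should decrease monotonically toward a small positive floor set by the histogram discretization: the deterministic decay of the entropy functional is exactly the particle cloud settling into its invariant measure.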
The unification is not that entropy is one thing appearing in many disguises. It's that “entropy” is a label for a structural relationship — the connection between a stochastic process and its equilibrium — that takes as many forms as there are stochastic processes. The word “entropy” names the relationship, not the object. Every evolution equation driven by entropy is a deterministic projection of a system that, at some deeper level, is stochastic and converging to its own equilibrium.
Mark Peletier, “Why does entropy drive evolution equations?”, arXiv:2603.07653 (March 2026).