AI-based materials generation models are trained on databases of known crystal structures. Given a composition, they predict the structure — which atoms sit where, what symmetry the crystal adopts, how the building blocks stack. The promise is that AI will discover materials faster than experiment, identifying stable compounds computationally before anyone synthesizes them.
Zhang, Lee, Schoop, and colleagues discovered a new crystal structure in GdNiSn₄ the old-fashioned way — by growing the crystal and solving it with diffraction. Then they asked whether state-of-the-art AI models could have predicted this structure given only the composition. The models failed. None generated the correct structure type.
The failure has a specific character. The new GdNiSn₄ structure is a combination of two known structural archetypes — an intergrowth that borrows elements from each. The individual parent structures exist in the training data. The combination does not. The AI models can interpolate within the space of known structures but cannot combine known motifs into novel arrangements. The compositional space is explored; the structural space is not.
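The interpolation limit can be made concrete with a toy sketch (everything below is illustrative, not from the paper: the stoichiometry label, type names, and "training set" are placeholders). A predictor whose candidate pool is restricted to structure types seen in training can, by construction, never output a genuinely new type such as an intergrowth of two known archetypes.

```python
# Hypothetical "training set": structure types known for a 1:1:4
# intermetallic stoichiometry class (names are placeholders, not real data).
KNOWN_TYPES = {
    "RETSn4": ["type-A", "type-B"],
}

def predict_structure_type(stoichiometry: str) -> list[str]:
    """Return candidate structure types for a composition class.

    The output is drawn only from KNOWN_TYPES, mirroring a model that
    interpolates within its training distribution: a novel combination
    of motifs is outside the reachable set by construction.
    """
    return KNOWN_TYPES.get(stoichiometry, [])

candidates = predict_structure_type("RETSn4")
print(candidates)                        # ['type-A', 'type-B']
print("intergrowth(A,B)" in candidates)  # False: the new type is unreachable
```

The point of the sketch is not that real generative models are lookup tables, but that any model scoring candidates against the distribution of known prototypes shares this failure mode: the correct answer for GdNiSn₄ was never in the candidate pool.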
This is not a marginal case. GdNiSn₄ sits in a well-studied chemical space (rare-earth intermetallics) where extensive data exists. If AI prediction fails here, it fails not because of insufficient data but because the structural innovation — combining two known types — lies outside what interpolation can reach. The models reproduce what exists. Discovery requires generating what doesn't.
The compound also shows complex magnetic properties, suggesting the unpredicted structure hosts unpredicted physics. What you can't find computationally, you can't study computationally. The old-fashioned way still finds things the new way misses.