This AI has just solved a century-old physics problem in a matter of seconds


For over a hundred years, a single equation sat at the heart of materials science: elegant in theory, brutal in practice. The configurational integral, formulated by Boltzmann and Gibbs in the 19th century, describes how atoms arrange themselves inside a material and how much energy flows between them.
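For readers who want the object itself: in standard statistical-mechanics notation (not spelled out in the article), the configurational integral for N atoms at temperature T is

```latex
Z_{\mathrm{conf}} = \int \cdots \int \exp\!\left( -\frac{U(\mathbf{r}_1, \ldots, \mathbf{r}_N)}{k_B T} \right) d\mathbf{r}_1 \cdots d\mathbf{r}_N
```

where U is the total potential energy of an atomic configuration and k_B is Boltzmann's constant. Every atom contributes three integration variables, which is where the dimensionality problem described below comes from.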

Solve it directly, and you unlock the ability to predict how metals deform, melt, or shift crystal structure under pressure. The catch: for even a small crystal of a few dozen atoms, the calculation involves thousands of interlocked variables. Classical integration techniques would require computational time exceeding the age of the universe.

So physicists gave up on solving it directly. For decades, the field relied on indirect workarounds: molecular dynamics and Monte Carlo simulations that approximate atomic behavior by brute-forcing enormous numbers of interactions over extended time periods. Even running on the world’s fastest supercomputers, these methods take weeks and still return only approximate answers. A new framework from Los Alamos National Laboratory and the University of New Mexico just changed that calculus entirely.
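To make the indirect route concrete, here is a minimal Metropolis Monte Carlo sketch for a single coordinate in a toy double-well potential. The potential and all names are illustrative, not taken from the Los Alamos work; real materials simulations repeat this loop over thousands of coupled atomic coordinates, which is why they take weeks.

```python
import math
import random

def metropolis_sample(n_steps, beta=1.0, step=0.5, seed=0):
    """Draw samples of a single coordinate x distributed as exp(-beta * U(x))
    using the Metropolis algorithm. U is a toy double-well potential."""
    rng = random.Random(seed)

    def U(x):
        return x**4 - 2 * x**2

    x, samples = 0.0, []
    for _ in range(n_steps):
        x_new = x + rng.uniform(-step, step)   # propose a random move
        dU = U(x_new) - U(x)
        # accept downhill moves always, uphill moves with Boltzmann probability
        if dU <= 0 or rng.random() < math.exp(-beta * dU):
            x = x_new
        samples.append(x)
    return samples
```

Averages over such sampled configurations approximate the integral; the point of the workaround is that the sampler never has to evaluate the full high-dimensional integral at once.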

THOR (Tensors for High-dimensional Object Representation) is an AI-powered computational framework that solves the configurational integral directly, not approximately, and does it hundreds of times faster than any existing method.

The Curse of Dimensionality, Defeated

The fundamental obstacle in computing configurational integrals is what physicists call the “curse of dimensionality.” Every atom in a material adds variables to the problem, and complexity grows exponentially as those variables multiply. A moderately sized crystal already produces a calculation space so vast that it defies direct computation, not because of limited hardware, but because the mathematical structure itself scales beyond reach.
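To see the scaling concretely, here is a back-of-the-envelope sketch (mine, not from the article): even a coarse grid of 10 points per coordinate becomes astronomically large as atoms are added.

```python
def grid_points(n_atoms, points_per_axis=10):
    """Each atom contributes 3 coordinates, so a direct quadrature grid
    has points_per_axis ** (3 * n_atoms) evaluation points."""
    return points_per_axis ** (3 * n_atoms)

# grid_points(1)  -> 1,000
# grid_points(10) -> 10**30
# grid_points(40) -> 10**120  (a few dozen atoms, far beyond any computer)
```

For comparison, the observable universe holds roughly 10**80 atoms, so a 40-atom grid already cannot be enumerated even in principle.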

THOR sidesteps this wall using a technique called tensor train cross interpolation. Rather than trying to evaluate the full high-dimensional integrand at once, the framework decomposes the massive dataset into a sequence of smaller, connected computational pieces. Each piece is manageable; strung together, they reconstruct the full answer.
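The decomposition idea can be sketched with the classical TT-SVD, which builds the same train of small connected cores from the full tensor. THOR's cross interpolation builds such a train from only a sparse sample of entries, but the resulting structure is the same. This NumPy sketch is an illustration of the tensor-train format, not the THOR implementation:

```python
import numpy as np

def tt_decompose(tensor, max_rank=8):
    """Decompose a d-way tensor into a tensor train of 3-way cores
    via sequential truncated SVDs (the classical TT-SVD)."""
    shape = tensor.shape
    mat = np.asarray(tensor, dtype=float)
    cores, rank = [], 1
    for n in shape[:-1]:
        mat = mat.reshape(rank * n, -1)
        U, S, Vt = np.linalg.svd(mat, full_matrices=False)
        r = min(max_rank, len(S))
        cores.append(U[:, :r].reshape(rank, n, r))   # one small core per axis
        mat = S[:r, None] * Vt[:r]                   # carry the rest forward
        rank = r
    cores.append(mat.reshape(rank, shape[-1], 1))
    return cores

def tt_reconstruct(cores):
    """Contract the train of cores back into the full tensor."""
    result = cores[0]
    for core in cores[1:]:
        result = np.tensordot(result, core, axes=([-1], [0]))
    return result[0, ..., 0]
```

Storage drops from exponential in the number of axes to linear (one small core per axis), which is the sense in which each piece is manageable while the chain still reconstructs the full answer.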

The team, led by senior AI scientist Boian Alexandrov at Los Alamos and Professor Dimiter Petsev at UNM, also built in a specialization that detects crystal symmetries within the material, reducing the computational load even further by avoiding redundant calculations that symmetry renders unnecessary.

💡 Key Insight

THOR doesn’t approximate the configurational integral; it computes it directly using tensor network decomposition. This is a qualitative shift, not just a speed improvement: the answer is structurally more accurate than anything indirect simulation can produce.

Validated on Real Materials, Including a Notoriously Difficult Phase Transition

The team benchmarked THOR against three test cases chosen to stress-test its accuracy. Copper served as the baseline metal. Crystalline argon under extreme pressure tested behavior at physical extremes.

The most demanding case was tin, which undergoes a complex solid-to-solid phase transition: a structural rearrangement between two distinct crystal configurations that is notoriously difficult to model accurately.

In every test, THOR’s results matched those produced by established Los Alamos simulation methods. The difference was in time. Computing tin’s phase diagram with conventional molecular dynamics required roughly 2,560 hours of processing. THOR produced the same result in under 6 hours, a speedup factor exceeding 400x. For copper and argon, the accuracy-to-speed trade-off was equally favorable.

Material              | Challenge                                     | Speedup vs. classical methods
Copper                | Baseline metal thermodynamics                 | 400x+
Argon (crystalline)   | Extreme pressure conditions                   | 400x+
Tin                   | Solid-solid phase transition (2,560 h → <6 h) | 400x+

Machine Learning Potentials Built In

THOR isn’t just a faster version of existing physics solvers: it integrates with modern machine learning atomic models that capture how atoms interact and move. This makes the framework flexible across physical environments: different temperatures, pressures, and material compositions can all be analyzed without rebuilding the pipeline from scratch. The combination of tensor network mathematics and ML potentials is what gives THOR its broad applicability across materials science, physics, and chemistry.

→ What this means for researchers

Simulations that previously consumed weeks of supercomputer time can now run in hours on more accessible hardware. For materials scientists studying extreme conditions (high-pressure metallurgy, phase transitions, novel alloys), this compresses research timelines dramatically. The code is available as open source on GitHub.

One Open Question Remains

THOR’s tensor network approach exploits the regular, repeating symmetries of crystalline solids. That’s a meaningful constraint. Liquids, amorphous materials, and complex alloys lack this ordered structure, and it’s not yet clear whether the method can be extended to handle them. The research team acknowledges this as an open frontier. Whether tensor decomposition can be adapted for disordered systems without losing its speed advantage will likely define the next phase of this work.

For crystalline materials, though, the case is already closed. A calculation that physics declared practically impossible for a century has been reduced to a tractable problem. THOR doesn’t just accelerate simulations; it replaces a class of approximations with something more fundamental. That’s the kind of step change that tends to compound over time as researchers build on it.


Sources
The University of New Mexico / Science News, “THOR AI solves a 100-year-old physics problem in seconds” (March 2026)
Duc Truong et al., Physical Review Materials (2025)

alex morgan
I write about artificial intelligence as it shows up in real life, not in demos or press releases. I focus on how AI changes work, habits, and decision-making once it’s actually used inside tools, teams, and everyday workflows. Most of my reporting looks at second-order effects: what people stop doing, what gets automated quietly, and how responsibility shifts when software starts making decisions for us.