A new approach to evaluating the progress of simulation techniques in quantum physics

In an article published in the journal Science, an international collaboration involving researchers from the Center for Theoretical Physics proposes a tool for estimating both the difficulty of the many problems in quantum physics that remain unsolved and the effectiveness of the methods developed to tackle them, including quantum algorithms.

Understanding the properties of materials is essential, at both the fundamental and the applied level. These properties arise from the behavior of the elementary constituents of matter, notably atoms and electrons. The rules governing the behavior of individual particles have been known for almost a century thanks to quantum physics, which has enabled phenomenal progress in this field. However, many problems remain hard to solve, in particular when the interactions of a large number of particles with one another must be taken into account. This is what scientists call the “quantum many-body problem”, which is at the heart of the article just published in Science.

“For each particle that is added to the problem, all of its mutual interactions with all the other particles must be taken into account in the calculation. The complexity therefore increases exponentially,” explains Filippo Vicentini, researcher at the Center for Theoretical Physics (CPHT*). Such problems come in multitudes, depending on the molecule or material, including the understanding of superconductivity, in which materials become perfect conductors at low temperature. To carry out these calculations, exactly in a few cases but most often approximately, researchers have developed numerous methods over several decades, such as tensor networks, quantum Monte Carlo simulations, and dynamical mean-field theory. And new methods continue to be developed.
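To see where this exponential growth comes from, consider the textbook counting argument (a minimal illustration, not taken from the paper): for N interacting spin-1/2 particles, an exact description of the quantum state requires one complex amplitude per basis state, and there are 2^N of them.

```python
# Minimal illustration (standard counting argument, not from the paper):
# for N interacting spin-1/2 particles, the quantum state lives in a
# Hilbert space of dimension 2^N, so the memory needed to store it
# exactly doubles with every particle added.
for n_particles in (10, 20, 30, 40, 50):
    dim = 2 ** n_particles
    # One complex amplitude per basis state, 16 bytes each (complex128).
    memory_gb = dim * 16 / 1e9
    print(f"N = {n_particles:2d}: {dim:.3e} basis states, "
          f"~{memory_gb:.3e} GB to store the state vector")
```

Around fifty particles, the state vector alone would exceed the memory of any existing computer, which is why approximate methods are unavoidable.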

The Science article presents a quantitative criterion for measuring the precision of these methods and the difficulty of the problems. The scientists in the collaboration focused on model systems, which capture the main ingredients of complex phenomena, such as the Hubbard model, used for example in the study of superconductivity. The goal is to find the minimum-energy state of these systems. The criterion devised by the collaboration, called the V-score, combines the value of this energy with the fluctuations around it. “The closer we get to the exact solution, the smaller the fluctuations in the energy value. The V-score is correlated with the precision error of the methods used,” explains Filippo Vicentini.
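The paper combines the system size, the energy variance, and the energy measured from a reference zero point into a single figure of merit; the sketch below mirrors that structure, but the function name, the sampling setup, and the `e_zero` parameter are illustrative assumptions here, so the exact definition and conventions should be checked against the publication.

```python
import numpy as np

def v_score(energy_samples, n_sites, e_zero=0.0):
    """Sketch of a V-score-like figure of merit: energy variance per site,
    normalized by the energy measured from a reference zero point.
    The authoritative definition is in the Science paper (DOI above)."""
    e_mean = np.mean(energy_samples)   # variational energy estimate
    e_var = np.var(energy_samples)     # fluctuations around that value
    return n_sites * e_var / (e_mean - e_zero) ** 2

# Toy usage: the closer a variational state is to an exact eigenstate,
# the smaller its energy fluctuations, hence the smaller the score.
rng = np.random.default_rng(0)
good = rng.normal(loc=-10.0, scale=0.01, size=10_000)   # tight fluctuations
rough = rng.normal(loc=-9.5, scale=0.50, size=10_000)   # broad fluctuations
print(v_score(good, n_sites=16))   # small: close to an eigenstate
print(v_score(rough, n_sites=16))  # larger: poorer approximation
```

The key property this construction captures is that an exact eigenstate has zero energy variance, so a vanishing score signals an exact solution regardless of which method produced it.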

The collaboration compiled a large number of results and methods to test and validate this criterion, and then used it to identify the most complex problems. Schematically, the more spatial dimensions a model has, the harder the problem currently is. So-called “frustrated” configurations of materials, in which competing magnetic interactions between atoms cannot all be satisfied at once, are also particularly difficult. “This collaborative work, with a shared database, allows everyone in the community to have a clearer view of the state of progress, without necessarily being a specialist in each of the sub-areas involved,” hopes Filippo Vicentini. It could also help guide research on new techniques toward problems that are hard to solve, where they could bring real added value. This is particularly the case for quantum algorithms, which are starting to emerge but are not yet as effective as current methods at revealing the properties of materials.

*CPHT: a joint research unit of CNRS and École polytechnique, Institut Polytechnique de Paris, 91120 Palaiseau, France

Link to the scientific publication: Variational benchmarks for quantum many-body problems, Science 386, 6719 (2024). DOI: 10.1126/science.adg9774

Read the press release from EPFL, a member of the collaboration.
