Machine Learning Hamiltonian by a Renormalization Group Approach
2022.11.16 12:34 - Katarzyna Kuźniar

For a long time, physicists have determined the Hamiltonian for a given problem by following a bottom-up approach, starting from physical principles such as symmetry. Nowadays, thanks to the large amount of data available, it has become possible to estimate the Hamiltonian by machine learning, i.e., to infer it from a large dataset of observations coming from experiments or simulations. The majority of physical systems are characterized by a probability distribution; at thermal equilibrium, this distribution has the Boltzmann-Gibbs form. Estimating such probability distributions and generating new samples efficiently is now a major endeavor at the center of intense research activity.

In this study, we develop a multiscale approach to estimate high-dimensional probability distributions and the associated Hamiltonians from a dataset of physical fields or configurations. In this way, we can estimate the Hamiltonian and efficiently generate new samples of many-body systems in various domains, from statistical physics to cosmology. Our method uses the idea of the renormalization group: it proceeds scale by scale, estimating models for the conditional probabilities of "fast degrees of freedom" conditioned on the coarse-grained fields. This approach has a wide range of potential applications in equilibrium and non-equilibrium systems where the underlying distribution is not known a priori. We verified our method on the Gaussian and φ^4 field models and on mass distributions in astrophysics [1].
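For concreteness, the Boltzmann-Gibbs form and the scale-by-scale factorization described above can be written schematically as below. The notation is a sketch of the general renormalization-group idea rather than the exact conventions of [1]: x_0 denotes the original field, x_j the field coarse-grained j times, and x̄_j the fast degrees of freedom eliminated when passing from x_{j-1} to x_j (assuming an invertible, volume-preserving change of variables).

```latex
% Equilibrium (Boltzmann-Gibbs) distribution associated with a Hamiltonian H
p(x) = \frac{1}{Z}\, e^{-\beta H(x)}, \qquad Z = \int e^{-\beta H(x)}\, \mathrm{d}x .

% Scale-by-scale factorization: since (x_j, \bar{x}_j) is an invertible
% transform of x_{j-1}, we have p(x_{j-1}) = p(\bar{x}_j \mid x_j)\, p(x_j),
% and iterating over j = 1, ..., J gives
p(x_0) = p(x_J) \prod_{j=1}^{J} p\!\left(\bar{x}_j \mid x_j\right),
% so only the lower-dimensional conditionals p(\bar{x}_j \mid x_j) need to be estimated.
```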
[1] Tanguy Marchand, Misaki Ozawa, Giulio Biroli, and Stéphane Mallat, arXiv:2207.04941.
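As a toy illustration of one coarse-graining step and of the decomposition into coarse fields plus fast degrees of freedom, the sketch below uses a simple 2x2 block average on a 2D lattice. This is only a minimal, self-contained example of the general multiscale idea; the function names (`coarse_grain`, `decompose`) and the choice of block averaging are ours and not the actual decomposition of [1], where each conditional p(x̄_j | x_j) would additionally be fitted with a parametric model.

```python
import numpy as np

rng = np.random.default_rng(0)

def coarse_grain(field):
    """One renormalization step: 2x2 block average of a square 2D field."""
    n = field.shape[0]
    return field.reshape(n // 2, 2, n // 2, 2).mean(axis=(1, 3))

def decompose(field, n_scales):
    """Split a field, scale by scale, into a coarsest field plus the
    'fast' components removed at each step (finer field minus its
    upsampled coarse-grained version)."""
    details = []
    current = field
    for _ in range(n_scales):
        coarse = coarse_grain(current)
        upsampled = np.kron(coarse, np.ones((2, 2)))  # back to current resolution
        details.append(current - upsampled)           # fast degrees of freedom
        current = coarse
    return current, details

# Example: decompose a random 32x32 configuration over three scales.
x = rng.standard_normal((32, 32))
coarsest, details = decompose(x, n_scales=3)
print(coarsest.shape, [d.shape for d in details])  # (4, 4) [(32, 32), (16, 16), (8, 8)]
```

Generation would then run in the opposite direction: sample the coarsest field, and at each scale draw the fast components from the learned conditional before reconstructing the finer field.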
| Attachment | Size |
|---|---|
| seminarium_nomaten_22-11-2022.docx | 16.29 KB |
| seminarium_nomaten_22-11-2022.pdf | 724.52 KB |