Structure Preserving Neural Networks: A Case Study in the Entropy Closure of the Boltzmann Equation

Steffen Schotthöfer, Tianbai Xiao, Martin Frank, Cory D. Hauck

Research output: Contribution to journal › Conference article › peer-review


Abstract

In this paper, we explore applications of deep learning in statistical physics. We choose the Boltzmann equation as a typical example, where neural networks serve as a closure to its moment system. We present two types of neural networks to embed the convexity of entropy and to preserve the minimum entropy principle and intrinsic mathematical structures of the moment system of the Boltzmann equation. We derive an error bound for the generalization gap of convex neural networks which are trained in Sobolev norm and use the results to construct data sampling methods for neural network training. Numerical experiments demonstrate that the neural entropy closure is significantly faster than classical optimizers while maintaining sufficient accuracy.
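The sketch below is a minimal illustration, not the authors' released implementation, of the two ingredients the abstract describes: an input-convex neural network that approximates the moment-system entropy h(u), and a Sobolev-norm loss that also fits its gradient alpha = grad_u h(u), the Lagrange multipliers of the minimum-entropy closure. Layer widths, initialization, the `lam` weighting, and all function names are illustrative assumptions.

```python
# Minimal sketch (assumptions noted above), using PyTorch.
import torch
import torch.nn as nn
import torch.nn.functional as F


class ICNN(nn.Module):
    """Input-convex network: non-negative hidden-to-hidden weights and convex,
    non-decreasing activations (softplus) make the scalar output convex in u."""

    def __init__(self, dim_u: int, width: int = 32, depth: int = 3):
        super().__init__()
        # Unconstrained skip connections from the input u to every layer.
        self.u_layers = nn.ModuleList([nn.Linear(dim_u, width) for _ in range(depth)])
        # Hidden-to-hidden weights, kept non-negative via softplus in forward().
        self.z_weights = nn.ParameterList(
            [nn.Parameter(0.1 * torch.randn(width, width)) for _ in range(depth - 1)]
        )
        self.out_u = nn.Linear(dim_u, 1)                     # linear (hence convex) path from u
        self.out_z = nn.Parameter(0.1 * torch.randn(1, width))  # non-negative path from z

    def forward(self, u: torch.Tensor) -> torch.Tensor:
        z = F.softplus(self.u_layers[0](u))
        for lin_u, W in zip(self.u_layers[1:], self.z_weights):
            z = F.softplus(lin_u(u) + z @ F.softplus(W).t())
        return self.out_u(u) + z @ F.softplus(self.out_z).t()   # scalar h_theta(u)


def sobolev_loss(model, u, h_true, alpha_true, lam=1.0):
    """H^1-type objective: match the entropy value and its input gradient,
    so the network also predicts the multipliers alpha = grad_u h(u)."""
    u = u.clone().requires_grad_(True)
    h_pred = model(u)                                        # shape (batch, 1)
    (alpha_pred,) = torch.autograd.grad(h_pred.sum(), u, create_graph=True)
    return F.mse_loss(h_pred, h_true) + lam * F.mse_loss(alpha_pred, alpha_true)
```

In a training setup of this kind, one would sample moment vectors u from the realizable set, compute reference entropy values and multipliers once offline with a classical optimizer, and minimize `sobolev_loss` with a standard optimizer such as Adam; at simulation time the closure then needs only a forward and a gradient pass instead of solving a convex optimization problem per cell.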

Original language: English
Pages (from-to): 19406-19433
Number of pages: 28
Journal: Proceedings of Machine Learning Research
Volume: 162
State: Published - 2022
Event: 39th International Conference on Machine Learning, ICML 2022 - Baltimore, United States
Duration: Jul 17, 2022 - Jul 23, 2022

Funding

The work of Steffen Schotthöfer, Tianbai Xiao and Martin Frank was funded by the Deutsche Forschungsgemeinschaft (DFG, German Research Foundation) within the priority programme SPP 2298 "Theoretical Foundations of Deep Learning" (project number 441826958). The work of Cory Hauck is sponsored by the Office of Advanced Scientific Computing Research, U.S. Department of Energy, and performed at the Oak Ridge National Laboratory, which is managed by UT-Battelle, LLC under Contract No. DE-AC05-00OR22725 with the U.S. Department of Energy. The United States Government retains, and the publisher, by accepting the article for publication, acknowledges that the United States Government retains, a non-exclusive, paid-up, irrevocable, world-wide license to publish or reproduce the published form of this manuscript, or allow others to do so, for United States Government purposes. The Department of Energy will provide public access to these results of federally sponsored research in accordance with the DOE Public Access Plan (http://energy.gov/downloads/doe-public-access-plan).

Funders and funder numbers
Deutsche Forschungsgemeinschaft (DFG, German Research Foundation): 441826958
U.S. Department of Energy, Office of Advanced Scientific Computing Research: DE-AC05-00OR22725

