Enhancing high-fidelity neural network potentials through low-fidelity sampling

Research output: Contribution to journal › Article › peer-review

1 Scopus citation

Abstract

The efficacy of neural network potentials (NNPs) critically depends on the quality of the configurational datasets used for training. Prior research using empirical potentials has shown that well-selected liquid–solid transitional configurations of a metallic system can be translated to other metallic systems. This study demonstrates that such validated configurations can be relabeled using density functional theory (DFT) calculations, thereby enhancing the development of high-fidelity NNPs. Training strategies and sampling approaches are efficiently assessed using empirical potentials and subsequently relabeled via DFT in a highly parallelized fashion for high-fidelity NNP training. Our results reveal that relying solely on energy and force for NNP training is inadequate to prevent overfitting, highlighting the necessity of incorporating stress terms into the loss functions. To optimize training involving force and stress terms, we propose employing transfer learning to fine-tune the weights, ensuring that the potential surface is smooth for these quantities composed of energy derivatives. This approach markedly improves the accuracy of elastic constants derived from simulations in both empirical potential-based NNPs and relabeled DFT-based NNPs. Overall, this study offers significant insights into leveraging empirical potentials to expedite the development of reliable and robust NNPs at the DFT level.
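The abstract argues that a loss function restricted to energy and force terms invites overfitting, and that adding a stress term (with transfer-learned fine-tuning of the weights) is necessary. As a minimal illustration of that idea, the sketch below shows a weighted multi-term loss of the kind described; the function name `nnp_loss` and the weight values are hypothetical and not taken from the paper.

```python
import numpy as np

def nnp_loss(e_pred, e_true, f_pred, f_true, s_pred, s_true,
             w_e=1.0, w_f=1.0, w_s=0.1):
    """Weighted sum of mean-squared errors over energies, forces, and stresses.

    Hypothetical illustration of a multi-term NNP loss: setting w_s = 0
    recovers the energy-and-force-only loss that the paper reports is
    prone to overfitting. Shapes (assumed):
      e_* : (n_structures,)            total energies
      f_* : (n_structures, n_atoms, 3) per-atom force components
      s_* : (n_structures, 3, 3)       stress tensors
    """
    loss_e = np.mean((np.asarray(e_pred) - np.asarray(e_true)) ** 2)
    loss_f = np.mean((np.asarray(f_pred) - np.asarray(f_true)) ** 2)
    loss_s = np.mean((np.asarray(s_pred) - np.asarray(s_true)) ** 2)
    return w_e * loss_e + w_f * loss_f + w_s * loss_s
```

In a transfer-learning setting of the kind the abstract describes, one would first train with the energy-dominated loss and then fine-tune with larger force and stress weights so the derivative quantities are fit smoothly.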

Original language: English
Article number: 046102
Journal: APL Machine Learning
Volume: 2
Issue number: 4
DOIs
State: Published - Dec 1 2024

Funding

The author acknowledges support from the Laboratory Directed Research and Development Program (LDRD) of the Oak Ridge National Laboratory (Enzyme Initiative and partly AI Initiative), managed by UT-Battelle, LLC, for the U.S. Department of Energy under Contract No. DE-AC05-00OR22725. This research used resources of the Compute and Data Environment for Science (CADES) at the Oak Ridge National Laboratory, which is supported by the Office of Science of the U.S. Department of Energy under Contract No. DE-AC05-00OR22725, and of the National Energy Research Scientific Computing Center (NERSC), a DOE Office of Science User Facility, for access to additional supercomputing resources. This work was part of a user project at the Center for Nanophase Materials Sciences (CNMS), a U.S. Department of Energy, Office of Science User Facility at the Oak Ridge National Laboratory.
