TY - JOUR
T1 - Transferring predictions of formation energy across lattices of increasing size
AU - Lupo Pasini, Massimiliano
AU - Karabin, Mariia
AU - Eisenbach, Markus
N1 - Publisher Copyright:
© 2024 The Author(s). Published by IOP Publishing Ltd.
PY - 2024/6/1
Y1 - 2024/6/1
N2 - In this study, we show the transferability of graph convolutional neural network (GCNN) predictions of the formation energy of the nickel-platinum solid solution alloy across atomic structures of increasing size. The original dataset was generated with the Large-scale Atomic/Molecular Massively Parallel Simulator (LAMMPS) using the second nearest-neighbor modified embedded-atom method empirical interatomic potential. Geometry optimization was performed on randomly generated initial face-centered cubic crystal structures, and the formation energy was calculated at each step of the geometry optimization, with configurations spanning the whole compositional range. Using data from various steps of the geometry optimization, we first trained HydraGNN, our open-source, scalable GCNN implementation, on a lattice of 256 atoms, which accounts well for the short-range interactions. Using the model trained on this data, we predicted the formation energy for lattices of 864 atoms and 2048 atoms, which resulted in lower-than-expected accuracy due to the long-range interactions present in these larger lattices. We accounted for the long-range interactions by including a small amount of training data representative of those two larger sizes, whereupon the predictions of HydraGNN scaled linearly with the size of the lattice. Our strategy therefore ensured scalability while significantly reducing the computational cost of training on larger lattice sizes.
AB - In this study, we show the transferability of graph convolutional neural network (GCNN) predictions of the formation energy of the nickel-platinum solid solution alloy across atomic structures of increasing size. The original dataset was generated with the Large-scale Atomic/Molecular Massively Parallel Simulator (LAMMPS) using the second nearest-neighbor modified embedded-atom method empirical interatomic potential. Geometry optimization was performed on randomly generated initial face-centered cubic crystal structures, and the formation energy was calculated at each step of the geometry optimization, with configurations spanning the whole compositional range. Using data from various steps of the geometry optimization, we first trained HydraGNN, our open-source, scalable GCNN implementation, on a lattice of 256 atoms, which accounts well for the short-range interactions. Using the model trained on this data, we predicted the formation energy for lattices of 864 atoms and 2048 atoms, which resulted in lower-than-expected accuracy due to the long-range interactions present in these larger lattices. We accounted for the long-range interactions by including a small amount of training data representative of those two larger sizes, whereupon the predictions of HydraGNN scaled linearly with the size of the lattice. Our strategy therefore ensured scalability while significantly reducing the computational cost of training on larger lattice sizes.
KW - atomistic materials modeling
KW - formation energy
KW - graph neural networks
KW - solid solution alloys
UR - http://www.scopus.com/inward/record.url?scp=85190762531&partnerID=8YFLogxK
U2 - 10.1088/2632-2153/ad3d2c
DO - 10.1088/2632-2153/ad3d2c
M3 - Article
AN - SCOPUS:85190762531
SN - 2632-2153
VL - 5
JO - Machine Learning: Science and Technology
JF - Machine Learning: Science and Technology
IS - 2
M1 - 025015
ER -