Off-the-shelf deep learning is not enough, and requires parsimony, Bayesianity, and causality

Rama K. Vasudevan, Maxim Ziatdinov, Lukas Vlcek, Sergei V. Kalinin

Research output: Contribution to journal › Review article › Peer-review

38 Scopus citations

Abstract

Deep neural networks ('deep learning') have emerged as a technology of choice for tackling problems in speech recognition, computer vision, finance, and other domains. However, the adoption of deep learning in the physical sciences brings substantial challenges stemming from the correlative nature of deep learning methods, in contrast to the causal, hypothesis-driven nature of modern science. We argue that the broad adoption of Bayesian methods incorporating prior knowledge, the development of solutions with built-in physical constraints, parsimonious structural descriptors, and generative models, and ultimately the adoption of causal models offer a path forward for fundamental and applied research.
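To make the abstract's first point concrete, the following is a minimal, purely illustrative sketch (not taken from the paper) of how a Bayesian treatment lets prior physical knowledge regularize a fit from scarce, noisy data. All variable names and numerical values here are hypothetical: a conjugate Gaussian update for a linear model whose prior mean encodes a theoretically expected slope.

```python
import numpy as np

# Hypothetical example: Bayesian linear regression where the prior mean
# encodes a "physical" expectation about the slope (e.g., from theory).
rng = np.random.default_rng(0)

# Synthetic data: y = 2*x + noise, with only a few noisy observations.
x = rng.uniform(0.0, 1.0, size=5)
y = 2.0 * x + rng.normal(scale=0.5, size=x.size)
X = x[:, None]

sigma2 = 0.25                   # assumed observation noise variance
prior_mean = np.array([2.0])    # prior belief: slope near 2
prior_prec = np.array([[4.0]])  # prior precision (1 / prior variance)

# Conjugate Gaussian update: posterior precision, covariance, and mean.
post_prec = prior_prec + X.T @ X / sigma2
post_cov = np.linalg.inv(post_prec)
post_mean = post_cov @ (prior_prec @ prior_mean + X.T @ y / sigma2)

# Maximum-likelihood (prior-free) estimate for comparison.
ml_slope = float(np.linalg.lstsq(X, y, rcond=None)[0][0])

print(f"ML slope:        {ml_slope:.3f}")
print(f"Posterior slope: {post_mean[0]:.3f} "
      f"(+/- {np.sqrt(post_cov[0, 0]):.3f})")
```

Unlike the maximum-likelihood fit, the posterior is pulled toward the physically motivated prior and carries an uncertainty estimate, which is the kind of behavior the authors advocate for physical domains.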

Original language: English
Article number: 16
Journal: npj Computational Materials
Volume: 7
Issue number: 1
DOIs
State: Published - Dec 2021
Externally published: Yes

Funding

The work was supported by the U.S. Department of Energy, Office of Science, Materials Sciences and Engineering Division (S.V.K., L.V., R.K.V.). Research was conducted at the Center for Nanophase Materials Sciences, which also provided support (M.Z.) and is a US DOE Office of Science User Facility.

