Machine Learning-Driven Multiscale Modeling: Bridging the Scales with a Next-Generation Simulation Infrastructure

Helgi I. Ingólfsson, Harsh Bhatia, Fikret Aydin, Tomas Oppelstrup, Cesar A. López, Liam G. Stanton, Timothy S. Carpenter, Sergio Wong, Francesco Di Natale, Xiaohua Zhang, Joseph Y. Moon, Christopher B. Stanley, Joseph R. Chavez, Kien Nguyen, Gautham Dharuman, Violetta Burns, Rebika Shrestha, Debanjan Goswami, Gulcin Gulten, Que N. Van, Arvind Ramanathan, Brian Van Essen, Nicolas W. Hengartner, Andrew G. Stephen, Thomas Turbyville, Peer-Timo Bremer, S. Gnanakaran, James N. Glosli, Felice C. Lightstone, Dwight V. Nissley, Frederick H. Streitz

Research output: Contribution to journal › Article › peer-review

17 Scopus citations

Abstract

Interdependence across time and length scales is common in biology, where atomic interactions can impact larger-scale phenomena. Such dependence is especially true for a well-known cancer signaling pathway, in which the membrane-bound RAS protein binds an effector protein called RAF. To capture the driving forces that bring RAS and RAF (represented as two domains, RBD and CRD) together on the plasma membrane, simulations are needed that resolve atomic detail while reaching long time and large length scales. The Multiscale Machine-Learned Modeling Infrastructure (MuMMI) resolves RAS/RAF protein-membrane interactions, identifying specific lipid-protein fingerprints that enhance protein orientations viable for effector binding. MuMMI is a fully automated, ensemble-based multiscale approach connecting three resolution scales: (1) the coarsest scale is a continuum model able to simulate milliseconds of time for a 1 μm² membrane, (2) the middle scale is a coarse-grained (CG) Martini bead model to explore protein-lipid interactions, and (3) the finest scale is an all-atom (AA) model capturing specific interactions between lipids and proteins. MuMMI dynamically couples adjacent scales in a pairwise manner using machine learning (ML). The dynamic coupling allows for better sampling of the refined scale from the adjacent coarse scale (forward) and on-the-fly feedback to improve the fidelity of the coarser scale from the adjacent refined scale (backward). MuMMI operates efficiently at any scale, from a few compute nodes to the largest supercomputers in the world, and is generalizable to simulate different systems. As computing resources continue to increase and multiscale methods continue to advance, fully automated multiscale simulations (like MuMMI) will be commonly used to address complex science questions.
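The pairwise forward/backward coupling the abstract describes can be sketched schematically. The code below is an illustrative toy, not the actual MuMMI implementation: every function and parameter name is a hypothetical stand-in. A coarse-scale model proposes candidate configurations, an ML-style novelty score picks which to spawn at the refined scale (forward coupling), and refined-scale results recalibrate the coarse model's parameters (backward coupling).

```python
# Toy sketch of MuMMI-style pairwise scale coupling. All names are
# hypothetical placeholders; the real infrastructure uses continuum,
# Martini CG, and AA simulations with an ML importance-sampling model.
import random

def coarse_step(params):
    """Stand-in for one coarse-scale (continuum) step: emit candidate states."""
    return [{"id": i, "state": random.gauss(params["mean"], 1.0)} for i in range(8)]

def novelty(candidate, seen):
    """ML-surrogate stand-in: score how unlike previously refined states this is."""
    return min((abs(candidate["state"] - s) for s in seen), default=float("inf"))

def refine(candidate):
    """Stand-in for a refined-scale (CG or AA) simulation of one candidate."""
    return candidate["state"] + random.gauss(0.0, 0.1)

def feedback(params, refined_states):
    """Backward coupling: nudge the coarse model toward refined-scale results."""
    if refined_states:
        avg = sum(refined_states) / len(refined_states)
        params["mean"] = 0.9 * params["mean"] + 0.1 * avg
    return params

params, seen = {"mean": 0.0}, []
for _ in range(3):                       # a few coupling cycles
    candidates = coarse_step(params)     # coarse scale proposes
    picked = sorted(candidates, key=lambda c: novelty(c, seen), reverse=True)[:2]
    results = [refine(c) for c in picked]  # forward: refine the most novel
    seen.extend(results)
    params = feedback(params, results)     # backward: improve coarse fidelity
```

The key design point mirrored here is that the two scales never run in lockstep: the ML selector decides which coarse configurations merit expensive refinement, so refined-scale compute is spent only where it improves sampling.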

Original language: English
Pages (from-to): 2658-2675
Number of pages: 18
Journal: Journal of Chemical Theory and Computation
Volume: 19
Issue number: 9
State: Published - May 9 2023

Funding

This work was performed under the auspices of the U.S. Department of Energy (DOE) by Lawrence Livermore National Laboratory (LLNL) under Contract DE-AC52-07NA27344, Los Alamos National Laboratory (LANL) under Contract DE-AC52-06NA25396, Oak Ridge National Laboratory under Contract DE-AC05-00OR22725, Argonne National Laboratory (ANL) under Contract DE-AC02-06CH11357, and under the auspices of the National Cancer Institute (NCI) by Frederick National Laboratory for Cancer Research (FNLCR) under Contract 75N91019D00024. This work has been supported by the Joint Design of Advanced Computing Solutions for Cancer (JDACS4C) program established by the U.S. DOE and the NCI of the National Institutes of Health. This research used resources of the Oak Ridge Leadership Computing Facility (OLCF), which is a DOE Office of Science User Facility supported under Contract DE-AC05-00OR22725. For computing time, the authors thank the Advanced Scientific Computing Research Leadership Computing Challenge (ALCC) for time on Summit, the Livermore Institutional Grand Challenge for time on Lassen, and LANL Institutional Computing. The LANL Institutional Computing Program is supported by the U.S. DOE National Nuclear Security Administration under Contract No. DE-AC52-06NA25396. For computing support, the authors thank OLCF and LC staff. Release: LLNL-JRNL-833009.

Funders: Funder number
National Institutes of Health
U.S. Department of Energy
National Cancer Institute
Office of Science
National Nuclear Security Administration: LLNL-JRNL-833009, DE-AC52-06NA25396
Argonne National Laboratory: DE-AC02-06CH11357
Lawrence Livermore National Laboratory: DE-AC52-07NA27344
Oak Ridge National Laboratory: DE-AC05-00OR22725
Los Alamos National Laboratory: DE-AC52-06NA25396
Frederick National Laboratory for Cancer Research: 75N91019D00024
