Abstract
Interdependence across time and length scales is common in biology, where atomic interactions can impact larger-scale phenomena. Such interdependence is especially evident in a well-known cancer signaling pathway, in which the membrane-bound RAS protein binds an effector protein called RAF. To capture the driving forces that bring RAS and RAF (represented as two domains, RBD and CRD) together on the plasma membrane, simulations are needed that resolve atomic detail while reaching long time and large length scales. The Multiscale Machine-Learned Modeling Infrastructure (MuMMI) is able to resolve RAS/RAF protein-membrane interactions and identify specific lipid-protein fingerprints that enhance protein orientations viable for effector binding. MuMMI is a fully automated, ensemble-based multiscale approach connecting three resolution scales: (1) the coarsest scale is a continuum model able to simulate milliseconds of time for a 1 μm² membrane, (2) the middle scale is a coarse-grained (CG) Martini bead model used to explore protein-lipid interactions, and (3) the finest scale is an all-atom (AA) model capturing specific interactions between lipids and proteins. MuMMI dynamically couples adjacent scales in a pairwise manner using machine learning (ML). This dynamic coupling enables better sampling of the refined scale from the adjacent coarser scale (forward) and on-the-fly feedback that improves the fidelity of the coarser scale from the adjacent refined scale (backward). MuMMI operates efficiently at any scale, from a few compute nodes to the largest supercomputers in the world, and is generalizable to simulating different systems. As computing resources continue to increase and multiscale methods continue to advance, fully automated multiscale simulations like MuMMI will become commonplace for addressing complex scientific questions.
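To make the pairwise coupling scheme concrete, the sketch below is a minimal, hypothetical illustration of one forward/backward cycle, not the actual MuMMI code base or API: a coarse continuum-like model streams candidate membrane patches, an ML-style novelty criterion promotes the least-explored ones to a finer-scale simulation, and aggregated fine-scale observables feed a correction back to a coarse-scale parameter. All function names (generate_macro_patches, novelty_score, run_finer_scale, feed_back) and the random stand-in models are assumptions for illustration only.

```python
# Conceptual sketch of MuMMI-style pairwise multiscale coupling (NOT the actual
# MuMMI implementation): a coarse scale streams candidate configurations, an
# ML-style selector promotes the most novel ones to the finer scale, and
# aggregated fine-scale results feed a parameter correction back to the coarse
# scale. All names and models here are hypothetical stand-ins.

import numpy as np

rng = np.random.default_rng(0)

def generate_macro_patches(n_patches, n_features=8):
    """Stand-in for the continuum (macro) model: each 'patch' is summarized
    by a small feature vector (e.g., local lipid composition, protein state)."""
    return rng.normal(size=(n_patches, n_features))

def novelty_score(candidates, archive):
    """ML-style selector: score candidates by distance to configurations already
    simulated at the finer scale, so sampling favors unexplored regions."""
    if len(archive) == 0:
        return np.ones(len(candidates))
    archive = np.asarray(archive)
    # Minimum Euclidean distance from each candidate to the archive.
    d = np.linalg.norm(candidates[:, None, :] - archive[None, :, :], axis=-1)
    return d.min(axis=1)

def run_finer_scale(patch):
    """Stand-in for a CG (or AA) simulation seeded from a coarse-scale patch;
    returns a mock observable, e.g., a protein-lipid contact propensity."""
    return float(patch.mean() + 0.1 * rng.normal())

def feed_back(coarse_param, fine_observables, rate=0.2):
    """Backward coupling: relax a coarse-scale parameter toward the average
    estimate obtained from the ensemble of finer-scale simulations."""
    if not fine_observables:
        return coarse_param
    return (1 - rate) * coarse_param + rate * float(np.mean(fine_observables))

# --- a few forward/backward coupling cycles ---
archive, observables = [], []
coarse_param = 0.0                               # e.g., an effective interaction strength
for cycle in range(5):
    patches = generate_macro_patches(n_patches=100)
    scores = novelty_score(patches, archive)
    selected = patches[np.argsort(scores)[-4:]]  # promote the 4 most novel patches
    for patch in selected:
        archive.append(patch)
        observables.append(run_finer_scale(patch))
    coarse_param = feed_back(coarse_param, observables)
    print(f"cycle {cycle}: coarse_param = {coarse_param:.3f}")
```

The novelty-based selection used here is only one plausible way to realize the "better sampling of the refined scale" described above; the key structural point the sketch conveys is the loop itself, in which promotion (forward) and parameter feedback (backward) happen continuously rather than as separate offline stages.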
| Original language | English |
| --- | --- |
| Pages (from-to) | 2658-2675 |
| Number of pages | 18 |
| Journal | Journal of Chemical Theory and Computation |
| Volume | 19 |
| Issue number | 9 |
| DOIs | |
| State | Published - May 9, 2023 |
Funding
This work was performed under the auspices of the U.S. Department of Energy (DOE) by Lawrence Livermore National Laboratory (LLNL) under Contract DE-AC52-07NA27344, Los Alamos National Laboratory (LANL) under Contract DE-AC52-06NA25396, Oak Ridge National Laboratory under Contract DE-AC05-00OR22725, and Argonne National Laboratory (ANL) under Contract DE-AC02-06CH11357, and under the auspices of the National Cancer Institute (NCI) by Frederick National Laboratory for Cancer Research (FNLCR) under Contract 75N91019D00024. This work has been supported by the Joint Design of Advanced Computing Solutions for Cancer (JDACS4C) program established by the U.S. DOE and the NCI of the National Institutes of Health. This research used resources of the Oak Ridge Leadership Computing Facility (OLCF), which is a DOE Office of Science User Facility supported under Contract DE-AC05-00OR22725. For computing time, the authors thank the Advanced Scientific Computing Research Leadership Computing Challenge (ALCC) for time on Summit, the Livermore Institutional Grand Challenge for time on Lassen, and LANL Institutional Computing. The LANL Institutional Computing Program is supported by the U.S. DOE National Nuclear Security Administration under Contract No. DE-AC52-06NA25396. For computing support, the authors thank OLCF and LC staff. Release: LLNL-JRNL-833009.
| Funders | Funder number |
| --- | --- |
| National Institutes of Health | |
| U.S. Department of Energy | |
| National Cancer Institute | |
| Office of Science | |
| National Nuclear Security Administration | LLNL-JRNL-833009, DE-AC52-06NA25396 |
| Argonne National Laboratory | DE-AC02-06CH11357 |
| Lawrence Livermore National Laboratory | DE-AC52-07NA27344 |
| Oak Ridge National Laboratory | DE-AC05-00OR22725 |
| Los Alamos National Laboratory | DE-AC52-06NA25396 |
| Frederick National Laboratory for Cancer Research | 75N91019D00024 |