Abstract
The new generation of heterogeneous CPU/GPU computer systems offers much greater computational performance but is not yet widely used for climate modeling. One reason for this is that traditional climate models were written before GPUs were available and would require an extensive overhaul to run on these new machines. In addition, even conventional "high-resolution" simulations do not currently provide enough parallel work to keep GPUs busy, so the benefits of such an overhaul would be limited for the types of simulations climate scientists are accustomed to. The vision of the Simple Cloud-Resolving Energy Exascale Earth System Model (E3SM) Atmosphere Model (SCREAM) project is to create a global atmospheric model with the architecture to efficiently use GPUs and horizontal resolution sufficient to take full advantage of GPU parallelism. After 5 years of model development, SCREAM is finally ready for use. In this paper, we describe the design of this new code, its performance on both CPU and heterogeneous machines, and its ability to simulate real-world climate via a set of four 40-day simulations covering all four seasons of the year.
| Original language | English |
|---|---|
| Article number | e2024MS004314 |
| Journal | Journal of Advances in Modeling Earth Systems |
| Volume | 16 |
| Issue number | 7 |
| DOIs | |
| State | Published - Jul 2024 |
Funding
This research was supported as part of the Energy Exascale Earth System Model (E3SM) project (https://e3sm.org/), funded by the U.S. Department of Energy, Office of Science, Office of Biological and Environmental Research. Data were used from the U.S. Department of Energy (DOE) ARM Climate Research Facilities at Southern Great Plains (SGP), ENA, NSA, TWP, and the field campaign Green Ocean Amazon (GoAmazon). Yunyan Zhang was supported by the US DOE Atmospheric Systems Research program Tying in High Resolution E3SM with ARM Data (THREAD) project. Jingjing Tian was supported by the US DOE Early Career Research Project awarded to Y. Zhang. This research was supported by the Exascale Computing Project (17-SC-20-SC), a collaborative effort of two U.S. Department of Energy organizations (Office of Science and the National Nuclear Security Administration) responsible for the planning and preparation of a capable exascale ecosystem, including software, applications, hardware, advanced system engineering, and early testbed platforms, in support of the nation's exascale computing imperative. This paper was prepared under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344. IM Release number LLNL-JRNL-859924. Sandia National Laboratories is a multi-mission laboratory managed and operated by National Technology & Engineering Solutions of Sandia, LLC (NTESS), a wholly owned subsidiary of Honeywell International Inc., for the U.S. Department of Energy's National Nuclear Security Administration (DOE/NNSA) under contract DE-NA0003525. This written work is authored by an employee of NTESS. The employee, not NTESS, owns the right, title and interest in and to the written work and is responsible for its contents. Any subjective views or opinions that might be expressed in the written work do not necessarily represent the views of the U.S. Government.
The publisher acknowledges that the U.S. Government retains a non-exclusive, paid-up, irrevocable, world-wide license to publish or reproduce the published form of this written work or allow others to do so, for U.S. Government purposes. The DOE will provide public access to results of federally sponsored research in accordance with the DOE Public Access Plan. This paper describes objective technical results and analysis. This research used resources of the National Energy Research Scientific Computing Center, a DOE Office of Science User Facility supported by the Office of Science of the U.S. Department of Energy under Contract No. DE-AC02-05CH11231. This manuscript has been co-authored by Oak Ridge National Laboratory, operated by UT-Battelle, LLC under Contract No. DE-AC05-00OR22725 with the U.S. Department of Energy. An award of computer time was also provided by the Innovative and Novel Computational Impact on Theory and Experiment (INCITE) program, performed using resources of the Oak Ridge Leadership Computing Facility at the Oak Ridge National Laboratory, which is supported by the Office of Science of the U.S. Department of Energy under Contract No. DE-AC05-00OR22725. Pacific Northwest National Laboratory is operated by Battelle for the U.S. Department of Energy under Contract DE-AC05-76RLO1830. The work at BNL was supported by the Energy Exascale Earth System Model (E3SM) project funded by the Department of Energy under contract DE-SC0012704. Argonne National Laboratory's work was supported by the U.S. Department of Energy, Assistant Secretary for Environmental Management, Office of Science, under contract DE-AC02-06CH11357.
Keywords
- E3SM
- GPUs
- Kokkos
- cloud-resolving scales
- diurnal cycle
- exascale
- global atmosphere model
- global climate model
- global storm-resolving model
- heterogeneous computing
- high-resolution global model