The Simple Cloud-Resolving E3SM Atmosphere Model Running on the Frontier Exascale System

Mark Taylor, Peter M. Caldwell, Luca Bertagna, Conrad Clevenger, Aaron Donahue, James Foucar, Oksana Guba, Benjamin Hillman, Noel Keen, Jayesh Krishna, Matthew Norman, Sarat Sreepathi, Christopher Terai, James B. White, Andrew G. Salinger, Renata B. McCoy, Lai Yung Ruby Leung, David C. Bader, Danqing Wu

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

We present an efficient and performance-portable implementation of the Simple Cloud-Resolving E3SM Atmosphere Model (SCREAM). SCREAM is a full-featured global atmospheric circulation model with a nonhydrostatic dynamical core and state-of-the-art parameterizations for microphysics, moist turbulence, and radiation. It has been written from scratch in C++, using the Kokkos library to abstract the on-node execution model for both CPUs and GPUs. SCREAM is one of only a few global atmosphere models to have been ported to GPUs. As far as we know, SCREAM is the first such model to run on both AMD and NVIDIA GPUs, as well as the first to run on nearly an entire exascale system (Frontier). On Frontier, we obtained a record-setting performance of 1.26 simulated years per day for a realistic cloud-resolving simulation.

Original language: English
Title of host publication: Proceedings of the International Conference for High Performance Computing, Networking, Storage and Analysis, SC 2023
Publisher: Association for Computing Machinery, Inc.
ISBN (Electronic): 9798400701092
DOIs
State: Published - Nov 12, 2023
Event: 2023 International Conference for High Performance Computing, Networking, Storage and Analysis, SC 2023 - Denver, United States
Duration: Nov 12, 2023 - Nov 17, 2023

Publication series

Name: Proceedings of the International Conference for High Performance Computing, Networking, Storage and Analysis, SC 2023

Conference

Conference: 2023 International Conference for High Performance Computing, Networking, Storage and Analysis, SC 2023
Country/Territory: United States
City: Denver
Period: 11/12/23 - 11/17/23

Funding

This research was supported as part of the Energy Exascale Earth System Model (E3SM) project, funded by the U.S. Department of Energy, Office of Science, Office of Biological and Environmental Research. Caldwell, Donahue, and Terai’s contributions were performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344. Sandia National Laboratories is a multi-mission laboratory managed and operated by the National Technology and Engineering Solutions of Sandia, L.L.C., a wholly owned subsidiary of Honeywell International, Inc., for the DOE’s National Nuclear Security Administration under contract DE-NA-0003525. SAND2020-0000. This manuscript has been co-authored by Oak Ridge National Laboratory, operated by UT-Battelle, LLC under Contract No. DE-AC05-00OR22725 with the U.S. Department of Energy. This research used resources of the Oak Ridge Leadership Computing Facility at the Oak Ridge National Laboratory, which is supported by the Office of Science of the U.S. Department of Energy under Contract No. DE-AC05-00OR22725. This research was supported by the Exascale Computing Project (17-SC-20-SC), a collaborative effort of two U.S. Department of Energy organizations (Office of Science and the National Nuclear Security Administration) responsible for the planning and preparation of a capable exascale ecosystem, including software, applications, hardware, advanced system engineering, and early testbed platforms, in support of the nation’s exascale computing imperative. This research used resources of the National Energy Research Scientific Computing Center (NERSC), a U.S. Department of Energy Office of Science User Facility operated under Contract No. DE-AC02-05CH11231.

Keywords

  • GPU
  • atmospheric modeling
  • exascale
  • global cloud resolving
  • high performance computing
