Transfer-learning-based Autotuning using Gaussian Copula

Thomas Randall, Jaehoon Koo, Brice Videau, Michael Kruse, Xingfu Wu, Paul Hovland, Mary Hall, Rong Ge, Prasanna Balaprakash

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

3 Scopus citations

Abstract

As diverse high-performance computing (HPC) systems are built, many opportunities arise for applications to solve larger problems than ever before. Given the significantly increased complexity of these HPC systems and application tuning, empirical performance tuning, such as autotuning, has emerged as a promising approach in recent years. Despite its effectiveness, autotuning is often a computationally expensive approach. Transfer learning (TL)-based autotuning seeks to address this issue by leveraging the data from prior tuning. Current TL methods for autotuning spend significant time modeling the relationship between parameter configurations and performance, which is ineffective for few-shot (that is, few empirical evaluations) tuning on new tasks. We introduce the first generative TL-based autotuning approach based on the Gaussian copula (GC) to model the high-performing regions of the search space from prior data and then generate high-performing configurations for new tasks. This allows a sampling-based approach that maximizes few-shot performance and provides the first probabilistic estimation of the few-shot budget for effective TL-based autotuning. We compare our generative TL approach with state-of-the-art autotuning techniques on several benchmarks. We find that the GC is capable of achieving 64.37% of peak few-shot performance in its first evaluation. Furthermore, the GC model can determine a few-shot transfer budget that yields up to 33.39× speedup, a dramatic improvement over the 20.58× speedup using prior techniques.
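The core idea described in the abstract — fit a Gaussian copula to high-performing configurations from prior tuning, then sample new candidate configurations — can be sketched as follows. This is a minimal illustration, not the authors' implementation; the data, parameter count, and all variable names are hypothetical, and the empirical-marginal-plus-correlation construction is the standard Gaussian-copula recipe.

```python
# Minimal sketch (assumed, not the paper's code) of Gaussian-copula
# sampling: fit the joint distribution of top-performing configurations
# from prior tuning data, then generate new candidates.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical prior data: rows = high-performing configurations from
# source tasks, columns = tuning parameters (e.g. tile size, unroll factor).
prior = rng.normal(size=(200, 2))
prior[:, 1] = 0.7 * prior[:, 0] + 0.3 * prior[:, 1]  # correlated parameters

# 1. Map each marginal to (0, 1) via empirical ranks, then to Gaussian space.
n, d = prior.shape
u = stats.rankdata(prior, axis=0) / (n + 1)  # /(n+1) avoids the 0/1 endpoints
z = stats.norm.ppf(u)

# 2. Fit the copula: the correlation matrix in Gaussian space.
corr = np.corrcoef(z, rowvar=False)

# 3. Sample in Gaussian space, then map back through the empirical
#    marginal quantiles of the prior data.
z_new = rng.multivariate_normal(np.zeros(d), corr, size=50)
u_new = stats.norm.cdf(z_new)
samples = np.column_stack(
    [np.quantile(prior[:, j], u_new[:, j]) for j in range(d)]
)

print(samples.shape)  # 50 candidate configurations, one per row
```

Because sampling is cheap once the copula is fitted, candidates can be drawn and evaluated in a few-shot loop without the per-task model fitting that the abstract identifies as the bottleneck in prior TL approaches.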

Original language: English
Title of host publication: ACM ICS 2023 - Proceedings of the International Conference on Supercomputing
Publisher: Association for Computing Machinery
Pages: 37-49
Number of pages: 13
ISBN (Electronic): 9798400700569
DOIs
State: Published - Jun 21 2023
Event: 37th ACM International Conference on Supercomputing, ICS 2023 - Orlando, United States
Duration: Jun 21 2023 - Jun 23 2023

Publication series

Name: Proceedings of the International Conference on Supercomputing

Conference

Conference: 37th ACM International Conference on Supercomputing, ICS 2023
Country/Territory: United States
City: Orlando
Period: 06/21/23 - 06/23/23

Funding

This research was partially supported by the Exascale Computing Project (17-SC-20-SC), a collaborative effort of the U.S. Department of Energy Office of Science and the National Nuclear Security Administration, and by the U.S. National Science Foundation under Grant CCF-1942182. This material is based upon work supported by the U.S. Department of Energy, Office of Science, under Contract DE-AC02-06CH11357.

Keywords

  • autotuning
  • few-shot learning
  • gaussian copula
  • transfer learning

