Combining multitask and transfer learning with deep Gaussian processes for autotuning-based performance engineering

Piotr Luszczek, Wissam M. Sid-Lakhdar, Jack Dongarra

Research output: Contribution to journal › Article › peer-review

4 Scopus citations

Abstract

We combine deep Gaussian processes (DGPs) with multitask and transfer learning for the performance modeling and optimization of HPC applications. Deep Gaussian processes merge the uncertainty-quantification advantage of Gaussian processes (GPs) with the predictive power of deep learning. Multitask and transfer learning allow for improved learning efficiency when several similar tasks must be learned simultaneously and when previously learned models are used to help learn new tasks, respectively. A comparison with state-of-the-art autotuners shows the advantage of our approach on two application problems. In this article, we combine DGPs with multitask and transfer learning to enable both improved tuning of an application's parameters on problems of interest and the prediction of good parameters for any potential problem the application might encounter.
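
To illustrate the standard building blocks named in the keywords below (Latin Hypercube Sampling for the initial design, Gaussian process regression as the surrogate, and an Efficient Global Optimization-style expected-improvement loop), the following is a minimal single-task Python sketch of GP-based autotuning. It is not the authors' multitask/transfer-learning DGP method: the objective runtime(), the two normalized tuning parameters, their bounds, and the iteration budget are hypothetical placeholders, and scikit-learn and SciPy merely stand in for whatever software the article actually uses.

    # Minimal single-task GP autotuning sketch (hypothetical objective and bounds);
    # shows LHS initialization + GP surrogate + expected-improvement search,
    # not the multitask/transfer-learning DGP method of the article.
    import numpy as np
    from scipy.stats import norm, qmc
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import Matern

    def runtime(x):
        # Hypothetical stand-in for a measured application runtime as a function
        # of two normalized tuning parameters (e.g., block size, thread count).
        return (x[0] - 0.3) ** 2 + 0.5 * (x[1] - 0.7) ** 2 + 0.01 * np.sin(20 * x[0])

    bounds = np.array([[0.0, 1.0], [0.0, 1.0]])  # assumed normalized parameter ranges

    # 1. Latin Hypercube Sampling for the initial design.
    sampler = qmc.LatinHypercube(d=2, seed=0)
    X = qmc.scale(sampler.random(n=10), bounds[:, 0], bounds[:, 1])
    y = np.array([runtime(x) for x in X])

    # 2. EGO-style loop: fit a GP surrogate, maximize expected improvement over a
    #    candidate set, evaluate the chosen configuration, and repeat.
    for it in range(20):
        gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
        gp.fit(X, y)

        cand = qmc.scale(qmc.LatinHypercube(d=2, seed=it + 1).random(n=256),
                         bounds[:, 0], bounds[:, 1])
        mu, sigma = gp.predict(cand, return_std=True)
        best = y.min()
        z = (best - mu) / np.maximum(sigma, 1e-12)
        ei = (best - mu) * norm.cdf(z) + sigma * norm.pdf(z)  # expected improvement

        x_next = cand[np.argmax(ei)]            # most promising configuration
        X = np.vstack([X, x_next])
        y = np.append(y, runtime(x_next))       # "measure" it and refit next round

    print("best configuration:", X[np.argmin(y)], "best runtime:", y.min())

The article's contribution replaces this single-task GP with a deep GP and couples related tuning problems through multitask and transfer learning, so that measurements on one problem size or machine inform the tuning of others.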

Original language: English
Pages (from-to): 229-244
Number of pages: 16
Journal: International Journal of High Performance Computing Applications
Volume: 37
Issue number: 3-4
DOIs
State: Published - Jul 2023

Funding

The author(s) disclosed receipt of the following financial support for the research, authorship, and/or publication of this article: The surrogate performance model portion of this work was supported by the U.S. Department of Energy, Office of Science, Advanced Scientific Computing Research (ASCR) under Award Number DE-SC0021419. The development of some of the software libraries tested in this work was supported by the National Science Foundation Office of Advanced Cyberinfrastructure, Directorate for Computer and Information Science and Engineering, under Grant No. 2004541. This research and the software used in this work were also supported by the Exascale Computing Project (17-SC-20-SC), a collaborative effort of the U.S. Department of Energy Office of Science and the National Nuclear Security Administration, and by the U.S. Department of Energy Office of Science under Contracts DE-AC05-00OR22725 and DE-AC52-07NA27344.

Keywords

  • Efficient Global Optimization
  • Gaussian process regression
  • Latin Hypercube Sampling
  • Linear Coregionalization Model
  • performance autotuning
