AgEBO-Tabular: Joint neural architecture and hyperparameter search with autotuned data-parallel training for tabular data

Romain Egele, Prasanna Balaprakash, Isabelle Guyon, Venkatram Vishwanath, Fangfang Xia, Rick Stevens, Zhengying Liu

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

14 Scopus citations

Abstract

Developing high-performing predictive models for large tabular data sets is a challenging task. Neural architecture search (NAS) is an AutoML approach that generates and evaluates multiple neural networks with different architectures concurrently to automatically discover a high-performing model. A key issue in NAS, particularly for large data sets, is the long computation time required to evaluate each generated architecture. While data-parallel training has the potential to address this issue, a straightforward approach can result in a significant loss of accuracy. To address this, we develop AgEBO-Tabular, which combines Aging Evolution (AE) to search over neural architectures and asynchronous Bayesian optimization (BO) to tune the hyperparameters of data-parallel training. We evaluate the efficacy of our approach on two large tabular data sets for predictive modeling from the Exascale Computing Project-CANcer Distributed Learning Environment (ECP-CANDLE).
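For intuition, the loop below is a minimal, illustrative Python sketch of how AE and BO interlock in an AgEBO-style search. It is not the paper's implementation: the evaluate function is a toy stand-in for training a network with data-parallel workers, and the population size, tournament size, and UCB acquisition are arbitrary illustrative choices. The sketch also runs sequentially, whereas the paper's method evaluates candidates asynchronously in parallel.

# Minimal AgEBO-style search loop (illustrative sketch, not the paper's code).
# Aging evolution (AE) mutates architectures drawn from a fixed-size
# population; a simple Bayesian-optimization (BO) step with a Gaussian process
# and UCB acquisition proposes data-parallel training hyperparameters
# (learning rate, batch size).
import random
from collections import deque

import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor

rng = random.Random(42)
WIDTHS = (32, 64, 128, 256)

def random_arch(n_layers=4):
    # An "architecture" here is just a list of layer widths.
    return [rng.choice(WIDTHS) for _ in range(n_layers)]

def mutate(arch):
    # AE mutation: change one randomly chosen layer width.
    child = list(arch)
    child[rng.randrange(len(child))] = rng.choice(WIDTHS)
    return child

def evaluate(arch, lr, batch_size):
    # Toy objective standing in for validation accuracy after data-parallel
    # training; peaks at moderate widths, lr ~ 1e-2, batch_size ~ 512.
    width_score = -sum((w - 128) ** 2 for w in arch) / 1e5
    hp_score = -(np.log10(lr) + 2) ** 2 - (np.log2(batch_size) - 9) ** 2 / 10
    return width_score + hp_score + rng.gauss(0.0, 0.01)

population = deque(maxlen=16)   # AE: appending past maxlen evicts the oldest
hp_history, scores = [], []     # BO: observed (lr, batch_size) -> score pairs
gp = GaussianProcessRegressor()

for step in range(100):
    # BO proposal for the data-parallel training hyperparameters.
    candidates = [(10 ** rng.uniform(-4, -1), 2 ** rng.randrange(6, 12))
                  for _ in range(32)]
    if len(scores) >= 5:
        gp.fit(np.log(np.array(hp_history)), scores)
        mu, sigma = gp.predict(np.log(np.array(candidates)), return_std=True)
        lr, bs = candidates[int(np.argmax(mu + sigma))]  # UCB, kappa = 1
    else:
        lr, bs = candidates[0]                           # warm-up: random

    # AE proposal for the architecture.
    if len(population) < population.maxlen:
        arch = random_arch()
    else:
        sample = rng.sample(list(population), 3)         # tournament of 3
        parent = max(sample, key=lambda ind: ind[1])[0]
        arch = mutate(parent)

    score = evaluate(arch, lr, bs)
    population.append((arch, score))
    hp_history.append([lr, bs])
    scores.append(score)

best_arch, best_score = max(population, key=lambda ind: ind[1])
print(f"best architecture {best_arch} with score {best_score:.3f}")

In the paper's asynchronous setting the same pattern runs in parallel: whenever a worker frees up, AE supplies the next architecture and BO the next training configuration, so no evaluation blocks on another.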

Original language: English
Title of host publication: Proceedings of SC 2021
Subtitle of host publication: The International Conference for High Performance Computing, Networking, Storage and Analysis: Science and Beyond
Publisher: IEEE Computer Society
ISBN (Electronic): 9781450384421
DOIs
State: Published - Nov 14 2021
Externally published: Yes
Event: 33rd International Conference for High Performance Computing, Networking, Storage and Analysis: Science and Beyond, SC 2021 - Virtual, Online, United States
Duration: Nov 14 2021 – Nov 19 2021

Publication series

Name: International Conference for High Performance Computing, Networking, Storage and Analysis, SC
ISSN (Print): 2167-4329
ISSN (Electronic): 2167-4337

Conference

Conference: 33rd International Conference for High Performance Computing, Networking, Storage and Analysis: Science and Beyond, SC 2021
Country/Territory: United States
City: Virtual, Online
Period: 11/14/21 – 11/19/21

Funding

This material is based upon work supported by the U.S. Department of Energy (DOE), Office of Science, Office of Advanced Scientific Computing Research, under Contract DE-AC02-06CH11357. This research used resources of the Argonne Leadership Computing Facility, which is a DOE Office of Science User Facility.

Funders: U.S. Department of Energy; Office of Science; Advanced Scientific Computing Research
Funder number: DE-AC02-06CH11357

Keywords

• Data-parallelism
• Neural architecture search
• Neural networks