Efficacy of using a dynamic length representation vs. a fixed-length for neuroarchitecture search

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

Deep learning neuroarchitecture and hyperparameter search are important in finding the best configuration that maximizes learned model accuracy. However, the number of types of layers, their associated hyperparameters, and the myriad of ways to connect layers pose a significant computational challenge in discovering ideal model configurations. Here, we assess two different approaches to neuroarchitecture search for a LeNet-style neural network: a fixed-length approach, where there is a preset number of possible layers that can be toggled on or off via mutation, and a variable-length approach, where layers can be freely added or removed via special mutation operators. We found that the variable-length implementation trained better models while discovering unusual layer configurations worth further exploration.

This manuscript has been authored by UT-Battelle, LLC under Contract No. DE-AC05-00OR22725 with the U.S. Department of Energy. The United States Government retains and the publisher, by accepting the article for publication, acknowledges that the United States Government retains a non-exclusive, paid-up, irrevocable, world-wide license to publish or reproduce the published form of this manuscript, or allow others to do so, for United States Government purposes. The Department of Energy will provide public access to these results of federally sponsored research in accordance with the DOE Public Access Plan (http://energy.gov/downloads/doe-public-access-plan).
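To make the contrast between the two encodings concrete, below is a minimal Python sketch of what the two mutation schemes described in the abstract might look like. The layer menu (LAYER_CHOICES), the function names, and the mutation-rate parameters are illustrative assumptions, not the paper's actual search space or operators.

```python
import random

# Hypothetical layer menu; the paper's actual search space is not given here.
LAYER_CHOICES = ["conv3x3", "conv5x5", "maxpool", "avgpool", "dense"]

# Fixed-length representation: a preset number of layer slots, each of which
# mutation can toggle on or off. Architecture depth is bounded by the slot count.
def mutate_fixed(genome, rate=0.1):
    """genome: constant-length list of (layer_type, enabled) pairs."""
    return [
        (layer, (not enabled) if random.random() < rate else enabled)
        for layer, enabled in genome
    ]

# Variable-length representation: dedicated insert/delete operators let the
# genome grow or shrink freely, with no preset ceiling on depth.
def mutate_variable(genome, p_insert=0.1, p_delete=0.1):
    """genome: list of layer types with no fixed length."""
    child = list(genome)
    if random.random() < p_insert:
        child.insert(random.randrange(len(child) + 1),
                     random.choice(LAYER_CHOICES))
    if len(child) > 1 and random.random() < p_delete:  # never empty the genome
        del child[random.randrange(len(child))]
    return child

if __name__ == "__main__":
    fixed = [(random.choice(LAYER_CHOICES), True) for _ in range(6)]
    variable = [random.choice(LAYER_CHOICES) for _ in range(4)]
    print(mutate_fixed(fixed))
    print(mutate_variable(variable))
```

Under this framing, the fixed-length encoding bounds the search space by construction, while the variable-length encoding relies on its insert and delete operators to explore architectures of arbitrary depth.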

Original language: English
Title of host publication: GECCO 2024 Companion - Proceedings of the 2024 Genetic and Evolutionary Computation Conference Companion
Publisher: Association for Computing Machinery, Inc
Pages: 1888-1894
Number of pages: 7
ISBN (Electronic): 9798400704956
DOIs
State: Published - Jul 14 2024
Event: 2024 Genetic and Evolutionary Computation Conference Companion, GECCO 2024 Companion - Melbourne, Australia
Duration: Jul 14 2024 – Jul 18 2024

Publication series

Name: GECCO 2024 Companion - Proceedings of the 2024 Genetic and Evolutionary Computation Conference Companion

Conference

Conference: 2024 Genetic and Evolutionary Computation Conference Companion, GECCO 2024 Companion
Country/Territory: Australia
City: Melbourne
Period: 07/14/24 – 07/18/24

Funding

This work used resources of the Oak Ridge Leadership Computing Facility, which is a DOE Office of Science User Facility supported under Contract DE-AC05-00OR22725. This research used resources of the National Energy Research Scientific Computing Center, which is supported by the Office of Science of the U.S. Department of Energy under Contract No. DE-AC02-05CH11231.
