A scalable algorithm for the optimization of neural network architectures

Research output: Contribution to journal › Article › peer-review


Abstract

We propose a new scalable method for optimizing the architecture of an artificial neural network. The proposed algorithm, called Greedy Search for Neural Network Architecture, aims to determine a neural network with a minimal number of layers that is at least as performant, in terms of accuracy and computational cost, as neural networks of the same structure identified by other hyperparameter search algorithms. Numerical experiments on benchmark datasets show that, for these datasets, our method outperforms state-of-the-art hyperparameter optimization algorithms both in the predictive performance attainable by the selected neural network architecture and in the time-to-solution of the hyperparameter optimization.
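
The abstract gives no implementation details, but the following Python sketch illustrates the general idea of a greedy, constructive layer-wise search of the kind the algorithm's name suggests: layers are added one at a time and the search stops when an added layer no longer improves validation accuracy. The fixed layer width of 64 units, the validation split, the depth limit, and the stopping rule are assumptions made for this illustration and are not taken from the paper.

    # Illustrative sketch only: a simplified greedy layer-adding search in the
    # spirit of a "Greedy Search for Neural Network Architecture".
    # Width (64 units), depth limit, split, and stopping rule are assumptions.
    from sklearn.datasets import load_digits
    from sklearn.model_selection import train_test_split
    from sklearn.neural_network import MLPClassifier

    X, y = load_digits(return_X_y=True)
    X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.25, random_state=0)

    best_layers, best_score = (), 0.0
    for depth in range(1, 6):                       # try progressively deeper networks
        layers = (64,) * depth                      # assumed fixed width of 64 units per layer
        model = MLPClassifier(hidden_layer_sizes=layers, max_iter=300, random_state=0)
        model.fit(X_tr, y_tr)
        score = model.score(X_val, y_val)           # validation accuracy of current depth
        if score <= best_score:                     # greedy stop: extra layer gave no improvement
            break
        best_layers, best_score = layers, score

    print(f"selected architecture: {best_layers} (validation accuracy {best_score:.3f})")

In this sketch the search terminates at the smallest depth whose extension yields no validation gain, which mirrors the stated goal of selecting a network with a minimal number of layers.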

Original language: English
Article number: 102788
Journal: Parallel Computing
Volume: 104-105
DOIs
State: Published - Jul 2021

Bibliographical note

Publisher Copyright:
© 2021 Elsevier B.V.

Keywords

  • Adaptive algorithms
  • Deep learning
  • Greedy constructive algorithms
  • Hyperparameter optimization
  • Neural network architecture
  • Random search
