Bayesian Multi-objective Hyperparameter Optimization for Accurate, Fast, and Efficient Neural Network Accelerator Design

Maryam Parsa, John P. Mitchell, Catherine D. Schuman, Robert M. Patton, Thomas E. Potok, Kaushik Roy

Research output: Contribution to journal › Article › peer-review

35 Scopus citations

Abstract

In resource-constrained environments, such as low-power edge devices and smart sensors, deploying a fast, compact, and accurate intelligent system with minimum energy is indispensable. Embedding intelligence can be achieved using neural networks on neuromorphic hardware. Designing such networks requires determining several inherent hyperparameters. A key challenge is to find the optimum set of hyperparameters, which may belong to the input/output encoding modules, the neural network itself, the application, or the underlying hardware. In this work, we present a hierarchical pseudo agent-based multi-objective Bayesian hyperparameter optimization framework (spanning both software and hardware) that not only maximizes the performance of the network but also minimizes the energy and area requirements of the corresponding neuromorphic hardware. We validate the performance of our approach (in terms of accuracy and computation speed) on several control and classification applications on digital and mixed-signal (memristor-based) neural accelerators. We show that the optimum set of hyperparameters can drastically improve the performance of one application (i.e., 52–71% for Pole-Balance) while having minimal effect on another (i.e., 50–53% for RoboNav). In addition, we demonstrate the resiliency of the input/output encoding, neural network training, and underlying accelerator modules of a neuromorphic system to changes in the hyperparameters.
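The paper's framework is not reproduced here, but the core idea the abstract describes — a surrogate-driven search that trades accuracy off against hardware cost — can be sketched. The snippet below is a minimal illustration, not the authors' method: it scalarizes two hypothetical objectives (a toy accuracy curve and a linear energy cost) with a weighted sum and minimizes the result using a one-dimensional Gaussian-process surrogate with a lower-confidence-bound acquisition. All function names, objective shapes, and constants are illustrative assumptions.

```python
import numpy as np

# Hypothetical stand-ins for the two competing objectives; the real framework
# measures network accuracy and hardware energy/area instead.
def accuracy(x):
    # toy accuracy curve over one hyperparameter, peaked at x = 0.6 (higher is better)
    return np.exp(-8.0 * (x - 0.6) ** 2)

def energy(x):
    # toy energy cost that grows with the hyperparameter (lower is better)
    return 0.2 + 0.8 * x

def scalarized(x, lam=0.5):
    # weighted-sum scalarization: collapse both objectives into one value to minimize
    return lam * energy(x) - (1.0 - lam) * accuracy(x)

def rbf(a, b, ls=0.15):
    # squared-exponential kernel on 1-D inputs
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ls ** 2)

def gp_posterior(X, y, Xq, noise=1e-6):
    # exact GP regression: posterior mean and variance at query points Xq
    K = rbf(X, X) + noise * np.eye(len(X))
    Ks = rbf(Xq, X)
    alpha = np.linalg.solve(K, y)
    mu = Ks @ alpha
    v = np.linalg.solve(K, Ks.T)
    var = 1.0 - np.sum(Ks * v.T, axis=1)  # prior variance rbf(x, x) = 1
    return mu, np.maximum(var, 1e-12)

# three initial hyperparameter samples, then ten Bayesian-optimization steps
X = np.array([0.1, 0.5, 0.9])
y = scalarized(X)
grid = np.linspace(0.0, 1.0, 201)
for _ in range(10):
    mu, var = gp_posterior(X, y, grid)
    lcb = mu - 2.0 * np.sqrt(var)        # lower-confidence-bound acquisition
    x_next = grid[np.argmin(lcb)]        # query the most promising point next
    X = np.append(X, x_next)
    y = np.append(y, scalarized(x_next))

best = X[np.argmin(y)]  # hyperparameter with the best accuracy/energy trade-off
```

The weighted-sum scalarization is only one way to handle multiple objectives; the point of the sketch is that each candidate hyperparameter is chosen by the surrogate rather than by exhaustive search, which matters when each evaluation means training and deploying a network on hardware.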

Original language: English
Article number: 667
Journal: Frontiers in Neuroscience
Volume: 14
DOIs
State: Published - Jul 21 2020

Funding

This research was funded in part by the Center for Brain Inspired Computing Enabling Autonomous Intelligence (C-BRIC), one of six centers in JUMP, a Semiconductor Research Corporation (SRC) program sponsored by DARPA; by the National Science Foundation; Intel Corporation; the Vannevar Bush Faculty Fellowship; the U.S. Department of Energy, Office of Science, Office of Advanced Scientific Computing Research, under contract number DE-AC05-00OR22725; and by the Laboratory Directed Research and Development Program of Oak Ridge National Laboratory. The funders were not involved in the study design, collection, analysis, or interpretation of data, the writing of this article, or the decision to submit it for publication.

Funders (funder number where available):

• C-BRIC
• Vannevar Bush Faculty Fellowship
• National Science Foundation
• U.S. Department of Energy
• Semiconductor Research Corporation
• Defense Advanced Research Projects Agency
• Intel Corporation
• Office of Science
• Advanced Scientific Computing Research (DE-AC05-00OR22725)
• Oak Ridge National Laboratory

Keywords

• Bayesian optimization
• accurate and energy-efficient machine learning
• multi-objective hyperparameter optimization
• neuromorphic computing
• spiking neural networks
