Accurate and Accelerated Neuromorphic Network Design Leveraging A Bayesian Hyperparameter Pareto Optimization Approach

Maryam Parsa, Catherine Schuman, Nitin Rathi, Amir Ziabari, Derek Rose, J. Parker Mitchell, J. Travis Johnston, Bill Kay, Steven Young, Kaushik Roy

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

5 Scopus citations

Abstract

Neuromorphic systems allow for extremely efficient hardware implementations of neural networks (NNs). In recent years, several algorithms have been presented to train spiking NNs (SNNs) for neuromorphic hardware. However, SNNs often provide lower accuracy than their artificial NN (ANN) counterparts or require computationally expensive and slow training/inference methods. To close this gap, designers typically rely on reconfiguring SNNs through adjustments in the neuron/synapse model or the training algorithm itself. Nevertheless, these steps incur significant design time while still lacking the desired improvement in training/inference times (latency). Designing SNNs that can match the accuracy of ANNs with reasonable training times is an exigent challenge in neuromorphic computing. In this work, we present an alternative approach that treats such designs as an optimization problem rather than an algorithm or architecture redesign. We develop a versatile multiobjective hyperparameter optimization (HPO) for automatically tuning the HPs of two state-of-the-art SNN training algorithms, SLAYER and HYBRID. We emphasize that, to the best of our knowledge, this is the first work that attempts to improve SNNs' computational efficiency, accuracy, and training time using an efficient HPO. We demonstrate significant performance improvements for SNNs on several datasets without the need to redesign or invent new training algorithms/architectures. Our approach results in more accurate networks with lower latency and, in turn, higher energy efficiency than previous implementations. In particular, we demonstrate improvement in accuracy and more than 5× reduction in the training/inference time for the SLAYER algorithm on the DVS Gesture dataset. In the case of HYBRID, we demonstrate a 30% reduction in timesteps while surpassing the accuracy of state-of-the-art networks on CIFAR10. Further, our analysis suggests that even a seemingly minor change in HPs could change the accuracy by 5-6×.
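The abstract's core idea, searching SNN hyperparameters for the best accuracy/training-time trade-off rather than redesigning the training algorithm, can be pictured with a minimal sketch. The sketch below uses Optuna's multi-objective study as a generic stand-in optimizer; it is not the paper's Bayesian Pareto optimizer, the SLAYER/HYBRID training code is not reproduced, and the hyperparameter names (num_timesteps, learning_rate, neuron_threshold) and the train_snn() function are hypothetical placeholders.

```python
# Minimal sketch of treating SNN design as a multiobjective hyperparameter
# search (accuracy vs. training time) instead of redesigning the training
# algorithm. Optuna is a generic stand-in optimizer here, NOT the paper's
# Bayesian Pareto optimizer, and train_snn() is a hypothetical placeholder
# for a real SLAYER/HYBRID training run.
import optuna


def train_snn(num_timesteps, learning_rate, neuron_threshold):
    """Hypothetical stand-in for one SNN training run.

    Returns (validation_accuracy, training_time_seconds). A real version
    would train the spiking network with these hyperparameters and measure
    both quantities.
    """
    # Toy analytic surrogate so the sketch runs end to end: accuracy improves
    # with more timesteps (diminishing returns), training time grows with them.
    accuracy = 0.9 - 0.3 / num_timesteps \
        - 2.0 * abs(learning_rate - 1e-3) \
        - 0.01 * abs(neuron_threshold - 1.0)
    training_time = 10.0 * num_timesteps
    return accuracy, training_time


def objective(trial):
    # Hyperparameters of the kind the paper tunes; names and ranges are
    # illustrative only.
    num_timesteps = trial.suggest_int("num_timesteps", 5, 100)
    learning_rate = trial.suggest_float("learning_rate", 1e-5, 1e-1, log=True)
    neuron_threshold = trial.suggest_float("neuron_threshold", 0.1, 5.0)

    accuracy, training_time = train_snn(num_timesteps, learning_rate,
                                        neuron_threshold)
    # Two objectives: maximize accuracy, minimize training time.
    return accuracy, training_time


if __name__ == "__main__":
    study = optuna.create_study(directions=["maximize", "minimize"])
    study.optimize(objective, n_trials=50)

    # best_trials holds the Pareto-optimal accuracy/time trade-offs.
    for t in study.best_trials:
        print(t.params, f"accuracy={t.values[0]:.3f}", f"time={t.values[1]:.0f}s")
```

In a real experiment each trial would train the SNN and report measured validation accuracy and wall-clock time; study.best_trials then exposes the Pareto front, from which a configuration matching the desired accuracy/latency budget can be selected.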

Original language: English
Title of host publication: ICONS 2021 - Proceedings of International Conference on Neuromorphic Systems 2021
Publisher: Association for Computing Machinery
ISBN (Electronic): 9781450386913
DOIs
State: Published - Jul 27 2021
Event: 2021 International Conference on Neuromorphic Systems, ICONS 2021 - Virtual, Online, United States
Duration: Jul 27 2021 - Jul 29 2021

Publication series

Name: ACM International Conference Proceeding Series

Conference

Conference: 2021 International Conference on Neuromorphic Systems, ICONS 2021
Country/Territory: United States
City: Virtual, Online
Period: 07/27/21 - 07/29/21

Funding

Notice: This manuscript has been authored in part by Center for Brain Inspired Computing Enabling Autonomous Intelligence (C-BRIC), one of six centers in JUMP, a Semiconductor Research Corporation (SRC) program sponsored by DARPA, the National Science Foundation, Intel Corporation and Vannevar Bush Faculty Fellowship, and UT-Battelle, LLC under Contract No. DE-AC05-00OR22725 with the U.S. Department of Energy. The United States Government retains and the publisher, by accepting the article for publication, acknowledges that the United States Government retains a non-exclusive, paid-up, irrevocable, world-wide license to publish or reproduce the published form of this manuscript, or allow others to do so, for United States Government purposes. The Department of Energy will provide public access to these results of federally sponsored research in accordance with the DOE Public Access Plan (http://energy.gov/downloads/doe-public-access-plan). This material is based upon work supported by the U.S. Department of Energy, Office of Science, Office of Advanced Scientific Computing Research, under contract number DE-AC05-00OR22725, and in part by the Laboratory Directed Research and Development Program of Oak Ridge National Laboratory, managed by UT-Battelle, LLC. The research is also funded in part by Center for Brain Inspired Computing Enabling Autonomous Intelligence (C-BRIC), one of six centers in JUMP, a Semiconductor Research Corporation (SRC) program sponsored by DARPA, the National Science Foundation, Intel Corporation and Vannevar Bush Faculty Fellowship.

Funders / Funder number
C-BRIC
Vannevar Bush Faculty Fellowship
National Science Foundation
U.S. Department of Energy
Semiconductor Research Corporation
Defense Advanced Research Projects Agency
Intel Corporation
Office of Science
Advanced Scientific Computing Research: DE-AC05-00OR22725
Oak Ridge National Laboratory
UT-Battelle

Keywords

• Bayesian optimization
• Neuromorphic computing
• hyperparameter optimization
• multiobjective optimization
• spiking neural networks
