TY - GEN
T1 - A Software Framework for Comparing Training Approaches for Spiking Neuromorphic Systems
AU - Schuman, Catherine D.
AU - Plank, James S.
AU - Parsa, Maryam
AU - Kulkarni, Shruti R.
AU - Skuda, Nicholas
AU - Mitchell, J. Parker
N1 - Publisher Copyright:
© 2021 IEEE.
PY - 2021/7/18
Y1 - 2021/7/18
N2 - There is a wide variety of training approaches for spiking neural networks for neuromorphic deployment. However, it is often not clear how these training algorithms perform or compare when applied across multiple neuromorphic hardware platforms and multiple datasets. In this work, we present a software framework for comparing the performance of four neuromorphic training algorithms across three neuromorphic simulators and four simple classification tasks. We introduce an approach for training a spiking neural network using a decision tree, and we compare this approach to training algorithms based on evolutionary algorithms, back-propagation, and reservoir computing. We present a hyperparameter optimization approach to tune the hyperparameters of each algorithm, and show that these optimized hyperparameters depend on the processor, algorithm, and classification task. Finally, we compare the performance of the optimized algorithms across multiple metrics, including accuracy, training time, and resulting network size, and we show that there is not one best training algorithm across all datasets and performance metrics.
AB - There is a wide variety of training approaches for spiking neural networks for neuromorphic deployment. However, it is often not clear how these training algorithms perform or compare when applied across multiple neuromorphic hardware platforms and multiple datasets. In this work, we present a software framework for comparing the performance of four neuromorphic training algorithms across three neuromorphic simulators and four simple classification tasks. We introduce an approach for training a spiking neural network using a decision tree, and we compare this approach to training algorithms based on evolutionary algorithms, back-propagation, and reservoir computing. We present a hyperparameter optimization approach to tune the hyperparameters of each algorithm, and show that these optimized hyperparameters depend on the processor, algorithm, and classification task. Finally, we compare the performance of the optimized algorithms across multiple metrics, including accuracy, training time, and resulting network size, and we show that there is not one best training algorithm across all datasets and performance metrics.
KW - decision trees
KW - genetic algorithms
KW - neuromorphic computing
KW - spiking neural networks
UR - http://www.scopus.com/inward/record.url?scp=85116508377&partnerID=8YFLogxK
U2 - 10.1109/IJCNN52387.2021.9533881
DO - 10.1109/IJCNN52387.2021.9533881
M3 - Conference contribution
AN - SCOPUS:85116508377
T3 - Proceedings of the International Joint Conference on Neural Networks
BT - IJCNN 2021 - International Joint Conference on Neural Networks, Proceedings
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2021 International Joint Conference on Neural Networks, IJCNN 2021
Y2 - 18 July 2021 through 22 July 2021
ER -