TY - GEN
T1 - Multi-Objective Hyperparameter Optimization for Spiking Neural Network Neuroevolution
AU - Parsa, Maryam
AU - Kulkarni, Shruti R.
AU - Coletti, Mark
AU - Bassett, Jeffrey
AU - Mitchell, J. Parker
AU - Schuman, Catherine D.
N1 - Publisher Copyright:
© 2021 IEEE
PY - 2021
Y1 - 2021
N2 - Neuroevolution has had significant success over recent years, but there has been relatively little work applying neuroevolution approaches to spiking neural networks (SNNs). SNNs are neural networks that include a temporal processing component, are not easily trained using other methods because they lack differentiable activation functions, and can be deployed on energy-efficient neuromorphic hardware. In this work, we investigate two evolutionary approaches for training SNNs. We explore the impact of the hyperparameters of the evolutionary approaches, including tournament size, population size, and representation type, on the performance of the algorithms. We present a multi-objective Bayesian-based hyperparameter optimization approach to tune the hyperparameters to produce the most accurate and smallest SNNs. We show that the hyperparameters can significantly affect the performance of these algorithms. We also perform sensitivity analysis and demonstrate that every hyperparameter value has the potential to perform well, assuming other hyperparameter values are set correctly.
AB - Neuroevolution has had significant success over recent years, but there has been relatively little work applying neuroevolution approaches to spiking neural networks (SNNs). SNNs are neural networks that include a temporal processing component, are not easily trained using other methods because they lack differentiable activation functions, and can be deployed on energy-efficient neuromorphic hardware. In this work, we investigate two evolutionary approaches for training SNNs. We explore the impact of the hyperparameters of the evolutionary approaches, including tournament size, population size, and representation type, on the performance of the algorithms. We present a multi-objective Bayesian-based hyperparameter optimization approach to tune the hyperparameters to produce the most accurate and smallest SNNs. We show that the hyperparameters can significantly affect the performance of these algorithms. We also perform sensitivity analysis and demonstrate that every hyperparameter value has the potential to perform well, assuming other hyperparameter values are set correctly.
KW - Evolutionary algorithms
KW - Neuromorphic computing
KW - Spiking neural networks
UR - http://www.scopus.com/inward/record.url?scp=85124602177&partnerID=8YFLogxK
U2 - 10.1109/CEC45853.2021.9504897
DO - 10.1109/CEC45853.2021.9504897
M3 - Conference contribution
AN - SCOPUS:85124602177
T3 - 2021 IEEE Congress on Evolutionary Computation, CEC 2021 - Proceedings
SP - 1225
EP - 1232
BT - 2021 IEEE Congress on Evolutionary Computation, CEC 2021 - Proceedings
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2021 IEEE Congress on Evolutionary Computation, CEC 2021
Y2 - 28 June 2021 through 1 July 2021
ER -