TY - CPAPER
T1 - Neuromorphic acceleration for approximate Bayesian inference on neural networks via permanent dropout
AU - Wycoff, Nathan
AU - Balaprakash, Prasanna
AU - Xia, Fangfang
N1 - Publisher Copyright:
© 2019 Association for Computing Machinery.
PY - 2019/7/23
Y1 - 2019/7/23
N2 - As neural networks have begun performing increasingly critical tasks for society, ranging from driving cars to identifying candidates for drug development, the value of their ability to perform uncertainty quantification (UQ) in their predictions has risen commensurately. Permanent dropout, a popular method for neural network UQ, involves injecting stochasticity into the inference phase of the model and generating many predictions for each test data point. This shifts the computational and energy burden of deep neural networks from the training phase to the inference phase. Recent work has demonstrated near-lossless conversion of classical deep neural networks to their spiking counterparts. We use these results to demonstrate the feasibility of conducting the inference phase with permanent dropout on spiking neural networks, mitigating the technique's computational and energy burden, which is essential for its use at scale or on edge platforms. We demonstrate the proposed approach via the Nengo spiking neural simulator on a combination drug therapy dataset for cancer treatment, where UQ is critical. Our results indicate that the spiking approximation gives a predictive distribution practically indistinguishable from that given by the classical network.
AB - As neural networks have begun performing increasingly critical tasks for society, ranging from driving cars to identifying candidates for drug development, the value of their ability to perform uncertainty quantification (UQ) in their predictions has risen commensurately. Permanent dropout, a popular method for neural network UQ, involves injecting stochasticity into the inference phase of the model and generating many predictions for each test data point. This shifts the computational and energy burden of deep neural networks from the training phase to the inference phase. Recent work has demonstrated near-lossless conversion of classical deep neural networks to their spiking counterparts. We use these results to demonstrate the feasibility of conducting the inference phase with permanent dropout on spiking neural networks, mitigating the technique's computational and energy burden, which is essential for its use at scale or on edge platforms. We demonstrate the proposed approach via the Nengo spiking neural simulator on a combination drug therapy dataset for cancer treatment, where UQ is critical. Our results indicate that the spiking approximation gives a predictive distribution practically indistinguishable from that given by the classical network.
KW - Bayesian inference
KW - Neuromorphic computing
KW - Uncertainty quantification
UR - http://www.scopus.com/inward/record.url?scp=85073228161&partnerID=8YFLogxK
U2 - 10.1145/3354265.3354274
DO - 10.1145/3354265.3354274
M3 - Conference contribution
AN - SCOPUS:85073228161
T3 - ACM International Conference Proceeding Series
BT - ICONS 2019 - Proceedings of International Conference on Neuromorphic Systems
PB - Association for Computing Machinery
T2 - 2019 International Conference on Neuromorphic Systems, ICONS 2019
Y2 - 23 July 2019 through 25 July 2019
ER -