Energy efficient stochastic-based deep spiking neural networks for sparse datasets

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

13 Scopus citations

Abstract

With large deep neural networks (DNNs) necessary to solve complex and data-intensive problems, energy efficiency is a key bottleneck for effectively deploying deep learning in the real world. Deep spiking NNs have gained much research attention recently, due both to interest in building biologically inspired neural networks and to the availability of neuromorphic platforms, which can be orders of magnitude more energy efficient than CPUs and GPUs. Although spiking NNs have proven to be an efficient technique for solving many machine learning and computer vision problems, to the best of our knowledge, this is the first attempt to adapt spiking NNs to sparse datasets. In this paper, we study the behaviour of spiking NNs in handling NLP datasets and the sparsity in their data representation. We then propose a novel framework for spiking NNs based on the concept of stochastic computing. Specifically, instead of generating spike trains with firing rates proportional to the intensity of each value in the feature set separately, the whole feature set is treated as a distribution function and a stochastic spike train that follows this distribution is generated. This framework reduces the connectivity between NN layers from O(N) to O(log N). It also encodes input data differently, making it suitable for handling sparse datasets. Finally, the framework achieves high energy efficiency since it uses Integrate-and-Fire neurons, the same as conventional spiking NNs. The results show that our proposed stochastic-based SNN achieves nearly the same accuracy as the original DNN on the MNIST dataset and outperforms the state-of-the-art SNN. The stochastic-based SNN is also energy efficient: the fully connected DNN, the conventional SNN, and the data-normalized SNN consume 38.24, 1.83, and 1.85 times more energy than the stochastic-based SNN, respectively.
On sparse datasets, including the IMDb and in-house clinical datasets, the stochastic-based SNN achieves performance comparable to that of the conventional DNN, whereas the conventional spiking NN suffers a significant decline in classification accuracy.
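The encoding described in the abstract — treating the whole feature set as a distribution and sampling one stochastic spike train from it, so only about log2(N) input lines are needed — can be sketched as follows. This is a minimal illustrative sketch of the idea, not the authors' implementation; the function name and parameters are hypothetical.

```python
import numpy as np

def stochastic_spike_train(features, num_steps=1000, rng=None):
    """Encode a non-negative feature vector as a single stochastic spike train.

    Instead of one rate-coded train per feature (O(N) input lines), the whole
    vector is normalized into a probability distribution; at each time step a
    single feature index is sampled from it, and that index is transmitted in
    binary over ceil(log2(N)) spike lines -- the O(log N) connectivity the
    abstract refers to.
    """
    rng = np.random.default_rng(rng)
    features = np.asarray(features, dtype=float)
    p = features / features.sum()                 # feature set as a distribution
    n = len(features)
    bits = max(1, int(np.ceil(np.log2(n))))       # number of input lines: O(log N)
    indices = rng.choice(n, size=num_steps, p=p)  # stochastic sampling per time step
    # Binary-encode each sampled index: shape (num_steps, bits), entries 0/1.
    train = (indices[:, None] >> np.arange(bits)) & 1
    return indices, train

# Sampled indices occur roughly in proportion to feature intensity,
# and zero-valued (sparse) features never fire at all.
idx, train = stochastic_spike_train([0.0, 1.0, 3.0, 0.0], num_steps=10000, rng=0)
```

Note how sparsity is handled for free: features with value zero get probability zero and simply never appear in the train, instead of occupying a silent dedicated input line as in conventional per-feature rate coding.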

Original language: English
Title of host publication: Proceedings - 2017 IEEE International Conference on Big Data, Big Data 2017
Editors: Jian-Yun Nie, Zoran Obradovic, Toyotaro Suzumura, Rumi Ghosh, Raghunath Nambiar, Chonggang Wang, Hui Zang, Ricardo Baeza-Yates, Xiaohua Hu, Jeremy Kepner, Alfredo Cuzzocrea, Jian Tang, Masashi Toyoda
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 311-318
Number of pages: 8
ISBN (Electronic): 9781538627143
DOIs
State: Published - Jul 1 2017
Event: 5th IEEE International Conference on Big Data, Big Data 2017 - Boston, United States
Duration: Dec 11 2017 - Dec 14 2017

Publication series

Name: Proceedings - 2017 IEEE International Conference on Big Data, Big Data 2017
Volume: 2018-January

Conference

Conference: 5th IEEE International Conference on Big Data, Big Data 2017
Country/Territory: United States
City: Boston
Period: 12/11/17 - 12/14/17

Funding

This work has been supported in part by the Joint Design of Advanced Computing Solutions for Cancer (JDACS4C) program established by the U.S. Department of Energy (DOE) and the National Cancer Institute (NCI) of the National Institutes of Health. This work was performed under the auspices of the U.S. Department of Energy by Argonne National Laboratory under Contract DE-AC02-06CH11357, Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344, Los Alamos National Laboratory under Contract DE-AC52-06NA25396, and Oak Ridge National Laboratory under Contract DE-AC05-00OR22725. This work has also been supported by the Laboratory Directed Research and Development (LDRD) program of Oak Ridge National Laboratory, under LDRD project No. 8339. The authors wish to thank Valentina Petkov of the Surveillance Research Program from the National Cancer Institute and the SEER registries at HI, KY, CT, NM and Seattle for the de-identified pathology reports used in this investigation.

Funders (funder number, where applicable):
National Institutes of Health
U.S. Department of Energy
National Cancer Institute
Argonne National Laboratory: DE-AC02-06CH11357
Lawrence Livermore National Laboratory
Oak Ridge National Laboratory: DE-AC05-00OR22725
Laboratory Directed Research and Development: 8339
Los Alamos National Laboratory: DE-AC52-06NA25396

Keywords

• Natural Language Processing
• Spiking Neural Network
• Stochastic Computing
