IMPROVING PERFORMANCE IN CONTINUAL LEARNING TASKS USING BIO-INSPIRED ARCHITECTURES

Sandeep Madireddy, Angel Yanguas-Gil, Prasanna Balaprakash

Research output: Contribution to journal › Conference article › peer-review


Abstract

The ability to learn continuously from an incoming data stream without catastrophic forgetting is critical to designing intelligent systems. Many approaches to continual learning rely on stochastic gradient descent and its variants, which employ global error updates and hence need strategies such as memory buffers or replay to circumvent their limitations in stability, greediness, and short-term memory. To address these limitations, we have developed a biologically inspired, lightweight neural network architecture that incorporates synaptic plasticity mechanisms and neuromodulation, and hence learns through local error signals to enable online continual learning without stochastic gradient descent. Our approach leads to superior online continual learning performance on the Split-MNIST, Split-CIFAR-10, and Split-CIFAR-100 datasets compared to other memory-constrained learning approaches, and matches the performance of state-of-the-art memory-intensive replay-based approaches. We further demonstrate the effectiveness of our approach by integrating key design concepts into other backpropagation-based continual learning algorithms, significantly improving their accuracy. Our results provide compelling evidence for the importance of incorporating biological principles into machine learning models and offer insights into how to leverage them to design more efficient and robust systems for online continual learning.
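
To make the mechanism described in the abstract concrete, below is a minimal sketch of what a neuromodulated local plasticity update could look like for a single layer, assuming a three-factor rule in which a scalar neuromodulatory gate scales a Hebbian-style weight change computed from locally available signals. This is an illustration of the general idea, not the paper's implementation; the function name local_plasticity_update, the gating variable modulation, and the learning rate eta are hypothetical.

    # Illustrative sketch (not the paper's code): a three-factor local update in which a
    # neuromodulatory gate scales a Hebbian-style weight change computed from locally
    # available pre- and post-synaptic signals, with no globally backpropagated gradient.
    import numpy as np

    rng = np.random.default_rng(0)

    def local_plasticity_update(W, pre, post, target, modulation, eta=0.01):
        """Update one layer's weights using only local information.

        W          : (n_out, n_in) weight matrix of this layer
        pre        : (n_in,)  presynaptic activity (layer input)
        post       : (n_out,) postsynaptic activity (layer output)
        target     : (n_out,) locally available teaching signal for this layer
        modulation : scalar neuromodulatory gate in [0, 1] controlling plasticity
        eta        : local learning rate
        """
        local_error = target - post                   # layer-local error signal
        dW = modulation * np.outer(local_error, pre)  # three-factor Hebbian-style term
        return W + eta * dW

    # Toy usage: one layer learning online, one sample at a time.
    n_in, n_out = 8, 4
    W = rng.normal(scale=0.1, size=(n_out, n_in))
    for _ in range(100):
        x = rng.normal(size=n_in)            # incoming sample from the stream
        y = np.tanh(W @ x)                   # postsynaptic activity
        t = np.tanh(rng.normal(size=n_out))  # stand-in local target
        gate = 1.0                           # e.g., task- or novelty-dependent gating
        W = local_plasticity_update(W, x, y, t, modulation=gate)

The point of the sketch is that each layer's update depends only on that layer's input, output, a local teaching signal, and a modulatory gate, so no global error needs to be backpropagated through the network.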

Original language: English
Pages (from-to): 992-1008
Number of pages: 17
Journal: Proceedings of Machine Learning Research
Volume: 232
State: Published - 2023
Event: 2nd Conference on Lifelong Learning Agents, CoLLA 2023 - Montreal, Canada
Duration: Aug 22, 2023 - Aug 25, 2023

Funding

Primary development of this work was funded by the DARPA Lifelong Learning Machines (L2M) Program. This work was also partially supported by the DOE Office of Science, Advanced Scientific Computing Research, and SciDAC programs under Contract No. DE-AC02-06CH11357. This work used the computational resources of the Argonne Leadership Computing Facility, a DOE Office of Science User Facility supported under Contract DE-AC02-06CH11357, and the Laboratory Computing Resource Center (LCRC) at Argonne National Laboratory.

Funders (with funder numbers where applicable):
Defense Advanced Research Projects Agency
Office of Science
Advanced Scientific Computing Research (DE-AC02-06CH11357)
Argonne National Laboratory
Laboratory Computing Resource Center
