A general approach to fast online training of modern datasets on real neuromorphic systems without backpropagation

Published

Author(s)

Sonia Buckley, Adam McCaughan

Abstract

We present parameter-multiplexed gradient descent (PMGD), a perturbative gradient descent framework designed to easily train emergent neuromorphic hardware platforms. We show its applicability to both analog and digital systems. We demonstrate how to use it to train networks with modern machine learning datasets, including Fashion-MNIST and CIFAR-10. Assuming realistic timescales and hardware parameters, our results indicate that PMGD could train a network on emerging hardware platforms orders of magnitude faster than the wall-clock time of training via backpropagation on a standard GPU/CPU.
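To make the "perturbative gradient descent" idea concrete, here is a minimal sketch of the general family PMGD belongs to: the network's cost is estimated before and after a small simultaneous perturbation of all parameters, and the cost change is used as a gradient estimate, with no backpropagation. The toy task, variable names, and hyperparameters are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy task: fit y = X @ w_true with a linear model using only
# cost evaluations (no gradients from backpropagation).
X = rng.normal(size=(64, 4))
w_true = rng.normal(size=4)
y = X @ w_true

def cost(w):
    """Mean-squared error of the linear model with weights w."""
    return np.mean((X @ w - y) ** 2)

w = np.zeros(4)          # trainable parameters
eps, lr = 1e-3, 0.05     # perturbation size and learning rate (assumed values)

for step in range(500):
    # Perturb every parameter at once with a random +/-1 vector and
    # estimate the gradient from the single resulting cost change
    # (simultaneous perturbation; PMGD multiplexes such perturbations
    # across parameters on hardware).
    delta = rng.choice([-1.0, 1.0], size=w.shape)
    dC = cost(w + eps * delta) - cost(w)
    w -= lr * (dC / eps) * delta

print(cost(w))  # cost falls well below the initial cost(0)
```

Because each update needs only two global cost measurements rather than per-parameter derivatives, this style of training maps naturally onto analog or digital neuromorphic hardware where internal node states are hard to observe.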
Proceedings Title
Proceedings of the International Conference on Neuromorphic Systems
Conference Dates
July 27-30, 2022
Conference Location
Knoxville, TN, US
Conference Title
International Conference on Neuromorphic Systems

Keywords

machine learning, neural networks, neuromorphic computing, emerging hardware

Citation

Buckley, S. and McCaughan, A. (2022), A general approach to fast online training of modern datasets on real neuromorphic systems without backpropagation, Proceedings of the International Conference on Neuromorphic Systems, Knoxville, TN, US, [online], https://tsapps.nist.gov/publication/get_pdf.cfm?pub_id=934846 (Accessed May 29, 2024)

Issues

If you have any questions about this publication or are having problems accessing it, please contact reflib@nist.gov.

Created September 7, 2022, Updated March 1, 2023