A general approach to fast online training of modern datasets on real neuromorphic systems without backpropagation
Published
2022
Author(s)
Sonia Buckley, Adam McCaughan
Abstract
We present parameter-multiplexed gradient descent (PMGD), a perturbative gradient descent framework designed to easily train emergent neuromorphic hardware platforms. We show its applicability to both analog and digital systems. We demonstrate how to use it to train networks with modern machine learning datasets, including Fashion-MNIST and CIFAR-10. Assuming realistic timescales and hardware parameters, our results indicate that PMGD could train a network on emerging hardware platforms orders of magnitude faster than the wall-clock time of training via backpropagation on a standard GPU/CPU.
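The paper's PMGD algorithm itself is not reproduced here, but the general perturbative idea it builds on — estimating gradients from measured cost changes under simultaneous random parameter perturbations, with no backpropagation through the model — can be illustrated with a minimal SPSA-style sketch on a toy quadratic problem. All names, constants, and the toy problem below are illustrative assumptions, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy problem (illustrative): fit y = X @ w_true under a quadratic cost.
X = rng.normal(size=(64, 8))
w_true = rng.normal(size=8)
y = X @ w_true

def cost(w):
    # Scalar cost measured "externally" -- the only signal the
    # perturbative update uses; no gradients of the model are taken.
    return np.mean((X @ w - y) ** 2)

w = np.zeros(8)
eps, lr = 1e-3, 0.05
for step in range(2000):
    # Perturb every parameter at once with a random +/- eps sign pattern,
    # then turn the single measured cost change into a gradient estimate.
    delta = rng.choice([-1.0, 1.0], size=w.shape)
    dC = cost(w + eps * delta) - cost(w)
    g_est = (dC / eps) * delta   # unbiased (to first order) SPSA-style estimate
    w -= lr * g_est

print(cost(w))
```

Because the update needs only scalar cost measurements, the same loop structure maps onto analog or digital hardware where internal gradients are inaccessible; the paper's contribution is making this class of update fast enough for modern datasets by multiplexing the perturbations.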
Proceedings Title
Proceedings of the International Conference on Neuromorphic Systems
Buckley, S. and McCaughan, A. (2022), A general approach to fast online training of modern datasets on real neuromorphic systems without backpropagation, Proceedings of the International Conference on Neuromorphic Systems, Knoxville, TN, US, [online], https://tsapps.nist.gov/publication/get_pdf.cfm?pub_id=934846 (Accessed October 8, 2025)