Multiplexed gradient descent: Fast online training of modern datasets on hardware neural networks without backpropagation

Published

2023

Author(s)

Adam McCaughan, Bakhrom Oripov, Natesh Ganesh, Sae Woo Nam, Andrew Dienstfrey, Sonia Buckley

Abstract

We show that model-free perturbative methods can be used to efficiently train modern neural network architectures in a way that can be applied directly to emerging neuromorphic hardware. These methods were investigated for training VLSI neural networks beginning in the 1990s, and more recently on memristive crossbars and photonic hardware, but those demonstrations were very limited in scale, using small datasets and only a few neurons. We describe a framework for applying these techniques to existing neuromorphic hardware at much larger scales, with an emphasis on simple, highly localized circuits that could be implemented on-chip if desired. The framework is also extensible to training existing hardware systems via a chip-in-the-loop technique.
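
The perturbative idea can be illustrated with a short simulation. The sketch below (plain NumPy, written for this page rather than taken from the paper) trains a toy linear model by perturbing every weight simultaneously with a random +/-1 pattern and correlating the measured change in cost against that pattern, which yields a gradient estimate without backpropagation. The model, cost function, and hyperparameters are illustrative assumptions, not the authors' hardware implementation.

    import numpy as np

    rng = np.random.default_rng(0)

    def cost(w, x, y):
        # Toy model: single linear layer, mean-squared error.
        return np.mean((x @ w - y) ** 2)

    def perturbative_step(w, x, y, amplitude=1e-3, lr=1e-2):
        # Perturb all weights at once with a random +/-1 pattern.
        delta = rng.choice([-1.0, 1.0], size=w.shape)
        c0 = cost(w, x, y)                      # baseline cost
        c1 = cost(w + amplitude * delta, x, y)  # cost under perturbation
        # Correlating the cost change with the pattern estimates the
        # gradient: E[((c1 - c0) / amplitude) * delta] ~ dC/dw.
        g_est = (c1 - c0) / amplitude * delta
        return w - lr * g_est

    # Usage: recover a random linear target from data alone.
    x = rng.normal(size=(64, 8))
    y = x @ rng.normal(size=(8, 1))
    w = np.zeros((8, 1))
    for _ in range(5000):
        w = perturbative_step(w, x, y)
    print(cost(w, x, y))  # decreases toward zero

Note that the correlate-and-update operation is local to each weight, which is the sort of simple, highly localized circuit the abstract envisions implementing on-chip.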

Keywords

machine learning, neuromorphic

Citation

McCaughan, A., Oripov, B., Ganesh, N., Nam, S., Dienstfrey, A. and Buckley, S. (2023), Multiplexed gradient descent: Fast online training of modern datasets on hardware neural networks without backpropagation, APL Machine Learning, [online], https://doi.org/10.1063/5.0157645, https://tsapps.nist.gov/publication/get_pdf.cfm?pub_id=936062 (Accessed April 28, 2024)
Created June 26, 2023, Updated October 19, 2023