

Low-Rank Gradient Descent for Memory-Efficient Training of Deep In-Memory Arrays

May 18, 2023
Author(s)
Siyuan Huang, Brian Hoskins, Matthew Daniels, Mark Stiles, Gina C. Adam
The movement of large quantities of data during the training of a Deep Neural Network presents immense challenges for machine learning workloads. To minimize this overhead, especially on the movement and calculation of gradient information, we introduce

A System for Validating Resistive Neural Network Prototypes

July 27, 2021
Author(s)
Brian Hoskins, Mitchell Fream, Matthew Daniels, Jonathan Goodwill, Advait Madhavan, Jabez J. McClelland, Osama Yousuf, Gina C. Adam, Wen Ma, Muqing Liu, Rasmus Madsen, Martin Lueker-Boden
Building prototypes of heterogeneous hardware systems based on emerging electronic, magnetic, and photonic devices is an increasingly important area of research. On the face of it, the novel implementation of these systems, especially for online learning

Streaming Batch Gradient Tracking for Neural Network Training

April 3, 2020
Author(s)
Siyuan Huang, Brian D. Hoskins, Matthew W. Daniels, Mark D. Stiles, Gina C. Adam
Faster and more energy efficient hardware accelerators are critical for machine learning on very large datasets. The energy cost of performing vector-matrix multiplication and repeatedly moving neural network models in and out of memory motivates a search