Experimental demonstration of a robust training method for strongly defective neuromorphic hardware
Published
Author(s)
William Borders, Advait Madhavan, Matthew Daniels, Vasileia Georgiou, Martin Lueker-Boden, Tiffany Santos, Patrick Braganca, Mark Stiles, Jabez J. McClelland, Brian Hoskins
Abstract
Neural networks are increasing in scale and sophistication, catalyzing the need for efficient hardware. An inevitability when transferring neural networks to hardware is that non-idealities impact performance. Hardware-aware training, where non-idealities are accounted for during training, is one way to recover performance, but at the cost of generality. In this work, we demonstrate a binary neural network consisting of an array of 20,000 magnetic tunnel junctions (MTJs) integrated on complementary metal-oxide-semiconductor (CMOS) chips. With 36 dies, we show that even a few defects can degrade the performance of neural networks. We demonstrate hardware-aware training and show that performance recovers to close to that of ideal networks. We then introduce a robust method, statistics-aware training, that compensates for defects regardless of their specific configuration. When evaluated on the MNIST dataset, statistics-aware solutions differ from software baselines by only 2 %.
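The abstract contrasts hardware-aware training, which bakes one chip's specific defect configuration into training, with statistics-aware training, which assumes only the statistics of the defects. Below is a minimal Python/NumPy sketch of the latter idea; the stuck-at defect model, the 5 % defect rate, the single binary-weight layer, the straight-through estimator, and the synthetic task are all illustrative assumptions, not details taken from the paper. The key point is that a fresh random defect configuration is drawn at every training step, so the learned weights cannot rely on any one configuration.

# Minimal sketch of statistics-aware training for a binary-weight layer.
# Assumptions (not from the paper): stuck-at defect model, 5 % defect rate,
# one linear layer, straight-through estimator, synthetic classification data.
import numpy as np

rng = np.random.default_rng(0)

def binarize(w):
    # Map real-valued shadow weights to {-1, +1}.
    return np.where(w >= 0, 1.0, -1.0)

def inject_defects(w_bin, defect_rate, rng):
    # Force a random subset of binary weights to a stuck value. A fresh
    # defect configuration is drawn on every call, so training sees many
    # configurations rather than one specific chip's defect map.
    mask = rng.random(w_bin.shape) < defect_rate
    stuck = np.where(rng.random(w_bin.shape) < 0.5, 1.0, -1.0)
    return np.where(mask, stuck, w_bin)

# Synthetic task: recover a random binary teacher classifier.
n_in, n_out, n_samples = 64, 10, 2048
x = rng.standard_normal((n_samples, n_in))
teacher = binarize(rng.standard_normal((n_in, n_out)))
y = np.argmax(x @ teacher, axis=1)

w = 0.1 * rng.standard_normal((n_in, n_out))   # real-valued shadow weights
lr, defect_rate = 0.01, 0.05

for step in range(500):
    # Forward pass through a defective binarized copy of the weights.
    w_defective = inject_defects(binarize(w), defect_rate, rng)
    logits = x @ w_defective
    # Softmax cross-entropy gradient with respect to the logits.
    p = np.exp(logits - logits.max(axis=1, keepdims=True))
    p /= p.sum(axis=1, keepdims=True)
    p[np.arange(n_samples), y] -= 1.0
    grad = x.T @ p / n_samples
    # Straight-through estimator: apply the gradient to the shadow weights.
    w -= lr * grad

# Evaluate on a defect configuration never seen during training.
w_test = inject_defects(binarize(w), defect_rate, rng)
acc = np.mean(np.argmax(x @ w_test, axis=1) == y)
print(f"accuracy on an unseen defect configuration: {acc:.3f}")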
Proceedings Title
Will not elect to publish in proceedings
Conference Title
The 35th Magnetic Recording Conference
Conference Dates
August 5-7, 2024
Conference Location
Berkeley, CA, US
Pub Type
Conferences
Keywords
Neuromorphic computing, magnetic tunnel junctions, neural network training, in-memory computing
Citation
Borders, W., Madhavan, A., Daniels, M., Georgiou, V., Lueker-Boden, M., Santos, T., Braganca, P., Stiles, M., McClelland, J. and Hoskins, B. (2023), Experimental demonstration of a robust training method for strongly defective neuromorphic hardware, The 35th Magnetic Recording Conference, Berkeley, CA, US (Accessed October 1, 2025)