
Measurement-driven neural-network training for integrated magnetic tunnel junction arrays



William Borders, Advait Madhavan, Matthew Daniels, Vasileia Georgiou, Martin Lueker-Boden, Tiffany Santos, Patrick Braganca, Mark Stiles, Jabez J. McClelland, Brian Hoskins


The increasing scale of neural networks needed to support more complex applications has led to an increasing requirement for area- and energy-efficient hardware. One route to meeting the budget for these applications is to circumvent the von Neumann bottleneck by performing computation in or near memory. However, an inevitability of transferring neural networks onto hardware is that nonidealities, such as device-to-device variations or poor device yield, impact performance. Methods such as hardware-aware training, where substrate nonidealities are incorporated during network training, are one way to recover performance at the cost of solution generality. In this work, we demonstrate inference on hardware-based neural networks consisting of 20 000 magnetic tunnel junction (MTJ) arrays integrated on CMOS chips in a form that closely resembles scalable and market-ready spin-transfer-torque magnetoresistive random access memory (STT-MRAM) technology. Using 36 dies, each containing an MTJ-CMOS crossbar array with its own nonidealities, we show that even a small number of defects in physically mapped networks significantly degrades the performance of networks trained without defects, and we show that, at the cost of generality, hardware-aware training accounting for the specific defects on each die can recover performance comparable to that of ideal networks. We then demonstrate a robust training method that extends hardware-aware training to statistics-aware training, producing network weights that perform well on most defective dies regardless of their specific defect locations. When evaluated on the 36 physical dies, statistics-aware trained solutions achieve a mean misclassification error on the MNIST dataset that differs from the software baseline by only 2%. This statistics-aware training method could be generalized to networks with many layers that are mapped to hardware suited for industry-ready applications.
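The distinction between hardware-aware and statistics-aware training can be illustrated with a minimal sketch. The snippet below is an illustration only, not the paper's implementation: it trains a toy linear classifier on synthetic data (a hypothetical stand-in for MNIST), and models a defective die as a binary mask that sticks a randomly chosen fraction of devices at zero conductance. Hardware-aware training would fix one measured mask for the whole run; statistics-aware training, as sketched here, resamples the mask each step so the weights must perform well across the distribution of defective dies.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic two-class data (hypothetical stand-in for MNIST).
X = rng.normal(size=(200, 16))
true_w = rng.normal(size=(16,))
y = (X @ true_w > 0).astype(float)

def sample_defect_mask(shape, p_stuck=0.05):
    """Simulate one die: each device is independently stuck at zero
    conductance with probability p_stuck."""
    return (rng.random(shape) >= p_stuck).astype(float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Statistics-aware training: draw a fresh defect mask every step, so the
# learned weights generalize over defect *statistics* rather than over the
# specific defect locations of a single measured die.
w = np.zeros(16)
lr = 0.5
for step in range(500):
    mask = sample_defect_mask(w.shape)
    p = sigmoid(X @ (w * mask))
    # Gradient of the cross-entropy loss w.r.t. w; the chain rule through
    # the mask means only "live" devices receive updates this step.
    grad = (X.T @ (p - y)) / len(y) * mask
    w -= lr * grad

# Evaluate the single weight solution on several unseen "dies",
# each with its own independent defect pattern.
errors = []
for _ in range(20):
    mask = sample_defect_mask(w.shape)
    pred = (sigmoid(X @ (w * mask)) > 0.5).astype(float)
    errors.append(np.mean(pred != y))
mean_err = float(np.mean(errors))
```

In this toy setting, the single statistics-aware solution keeps the mean misclassification error low across dies it never saw during training, which is the property the paper demonstrates at scale on 36 physical MTJ-CMOS dies.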
Physical Review Applied


Neuromorphic computing, magnetic tunnel junctions, machine learning


Borders, W., Madhavan, A., Daniels, M., Georgiou, V., Lueker-Boden, M., Santos, T., Braganca, P., Stiles, M., McClelland, J. and Hoskins, B. (2024), Measurement-driven neural-network training for integrated magnetic tunnel junction arrays, Physical Review Applied, [online] (Accessed May 22, 2024)



Created May 14, 2024, Updated May 15, 2024