Implementation of a Binary Neural Network on a Passive Array of Magnetic Tunnel Junctions
Published
Author(s)
Jonathan Goodwill, Nitin Prasad, Brian Hoskins, Matthew Daniels, Advait Madhavan, Lei Wan, Tiffany Santos, Michael Tran, Jordan Katine, Patrick Braganca, Mark Stiles, Jabez J. McClelland
Abstract
The increasing scale of neural networks and their growing application space have produced demand for more energy- and memory-efficient artificial-intelligence-specific hardware. Avenues to mitigate the main issue, the von Neumann bottleneck, include in-memory and near-memory architectures, as well as algorithmic approaches. Here we leverage the low power consumption and inherently binary operation of magnetic tunnel junctions (MTJs) to realize the first neural network hardware inference accelerator based on passive arrays of MTJs. In general, transferring a trained network model to hardware for inference is confronted by degradation in performance due to non-idealities in the substrate, parasitic resistance, device-to-device variations, and write errors. To quantify the effect of device non-idealities, we benchmark 300 unique weight-matrix solutions of a 2-layer perceptron that classifies the Wine dataset, evaluating both classification accuracy and write fidelity. Despite device imperfections, we achieve software-equivalent accuracy of up to 95.3 % with proper tuning of network parameters in 15 × 15 MTJ arrays having a range of device sizes. The success of this tuning process shows that new metrics are needed to characterize the performance and quality of networks reproduced in mixed-signal hardware.
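The setup described in the abstract can be illustrated with a minimal software sketch: a 2-layer perceptron whose weights are binarized to ±1 (mimicking the two resistance states of an MTJ), with an optional per-device flip probability standing in for write errors. The layer sizes (13 Wine features, 15 hidden units, 3 classes), the ReLU activation, and the write-error model are illustrative assumptions, not the paper's actual network or error model.

```python
import numpy as np

rng = np.random.default_rng(0)

def binarize(w):
    # Map real-valued trained weights to +/-1, as each MTJ stores
    # one binary weight in its two resistance states (illustrative).
    return np.where(w >= 0.0, 1.0, -1.0)

def mtj_forward(x, W1, W2, flip_prob=0.0):
    # Crossbar-style matrix-vector products with binarized weights.
    # flip_prob models write errors: each device flips its stored
    # state independently with this probability (assumed model).
    B1, B2 = binarize(W1), binarize(W2)
    if flip_prob > 0.0:
        B1 = B1 * np.where(rng.random(B1.shape) < flip_prob, -1.0, 1.0)
        B2 = B2 * np.where(rng.random(B2.shape) < flip_prob, -1.0, 1.0)
    h = np.maximum(x @ B1, 0.0)  # hidden layer with ReLU (assumed)
    return h @ B2                # class logits

# Hypothetical sizes: 13 Wine features -> 15 hidden -> 3 classes,
# so each weight matrix fits in a 15 x 15 device array.
W1 = rng.standard_normal((13, 15))
W2 = rng.standard_normal((15, 3))
x = rng.standard_normal((4, 13))      # a batch of 4 feature vectors
logits = mtj_forward(x, W1, W2, flip_prob=0.05)
```

Comparing the predictions of `mtj_forward` with and without `flip_prob` gives a simple way to see how write errors degrade accuracy relative to the ideal binarized network, in the spirit of the benchmarking the abstract describes.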
Goodwill, J., Prasad, N., Hoskins, B., Daniels, M., Madhavan, A., Wan, L., Santos, T., Tran, M., Katine, J., Braganca, P., Stiles, M. and McClelland, J. (2022), Implementation of a Binary Neural Network on a Passive Array of Magnetic Tunnel Junctions, Physical Review Applied, [online], https://doi.org/10.1103/PhysRevApplied.18.014039, https://tsapps.nist.gov/publication/get_pdf.cfm?pub_id=933173 (Accessed October 14, 2025)