Energy-efficient stochastic computing with superparamagnetic tunnel junctions
Matthew W. Daniels, Advait Madhavan, Philippe Talatchian, Alice Mizrahi, Mark D. Stiles
Stochastic computing has been limited by the inaccuracies introduced by correlations between the pseudorandom bitstreams used in the calculation. We hybridize a stochastic version of magnetic tunnel junctions with basic CMOS logic gates to create a SuperPARamagnetic-Tunnel-junction-Array Network (Spartan) that drastically reduces the impact of correlations. Superparamagnetic tunnel junctions generate low-energy, truly random bitstreams. We exploit this true randomness by using OR-gate-based neurons to provide both the summation and the nonlinear operation of a neuron with superior energy efficiency. Simulating Spartan on a commonly implemented convolutional neural network structure allows a systematic evaluation of accuracy and energy. The present iteration of Spartan achieves 97.2 % accuracy on MNIST while consuming only 70 nJ per inference, an order of magnitude less energy than previous stochastic implementations.
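The OR-gate neuron described above can be illustrated with a minimal software sketch. In unipolar stochastic computing, a value p in [0, 1] is encoded as a bitstream whose bits are 1 with probability p. Bitwise OR of independent streams with rates p_i yields a stream with rate 1 - prod(1 - p_i): for small inputs this approximates the sum of the p_i, and for larger inputs it saturates toward 1, so a single gate provides both accumulation and a nonlinearity. The sketch below assumes ideal, independent random streams (standing in for superparamagnetic tunnel junctions); the function names are illustrative, not from the paper.

```python
import random

def bernoulli_stream(p, n, rng):
    """Unipolar stochastic bitstream: each bit is 1 with probability p.
    Here a software RNG stands in for a truly random physical source."""
    return [1 if rng.random() < p else 0 for _ in range(n)]

def or_neuron(streams):
    """OR-gate neuron: bitwise OR across the input bitstreams.
    For independent streams with rates p_i, the output rate is
    1 - prod(1 - p_i): approximately the sum for small inputs,
    saturating toward 1 for large ones."""
    return [max(bits) for bits in zip(*streams)]

rng = random.Random(0)
n = 100_000
ps = [0.1, 0.2, 0.3]
streams = [bernoulli_stream(p, n, rng) for p in ps]
out = or_neuron(streams)

rate = sum(out) / n           # empirical output rate
expected = 1.0
for p in ps:
    expected *= (1.0 - p)
expected = 1.0 - expected     # 1 - 0.9*0.8*0.7 = 0.496
```

Because the streams are truly independent, the empirical rate converges to the analytic value 1 - prod(1 - p_i); correlated pseudorandom streams would bias this result, which is the accuracy problem the true randomness of superparamagnetic tunnel junctions is intended to avoid.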