One of the most promising new approaches to next-generation information processing is spintronics, in which information is carried by electron spin rather than charge. Among the spintronic devices under development, magnetic tunnel junctions are particularly well suited to implementing novel approaches to computing because they are multifunctional and compatible with standard integrated circuits. When operated near the thermal switching threshold, magnetic tunnel junctions exhibit complex stochastic behavior reminiscent of some aspects of brain activity. We are exploring the extent to which these novel functionalities can be applied to computing that emulates the brain.
Magnetic tunnel junctions (see Fig. 1) consist of two thin films of ferromagnetic material separated by a few atomic layers of an insulating material. The insulator is so thin that electrons can tunnel quantum mechanically through it. The rate at which electrons tunnel depends on the relative magnetic configuration of the two ferromagnetic layers: if the magnetizations are parallel, tunneling is easier than if they are antiparallel. The resulting difference in resistance makes it straightforward to read the state of the magnetic layers with electronic circuits. This ease of reading the magnetic state is only one important feature of these devices; the other is the ability to change the state by passing a current through the device, creating a spin torque. Practical electrical reading and current control of magnetic tunnel junctions are the key features enabling fast, dense, non-volatile memory integrated into complementary metal-oxide-semiconductor (CMOS) circuits in commercial applications today. They are also the basis of several proposed applications in novel computing schemes.
When the energy barrier that separates the parallel and antiparallel states in a magnetic tunnel junction becomes comparable to the ambient thermal energy, the junction spontaneously flips back and forth between these two states (see Fig. 2). Such junctions are said to be superparamagnetic. When current passes through the device in one direction, the junction spends more time in one state and flips less often; when current flows in the opposite direction, it spends more time in the other state and again flips less often. The flipping rate is fastest at an intermediate current close to zero, where the junction spends equal time in each state. We are measuring the properties of such devices, how they respond to external inputs like periodic voltages and noise, and how their time-varying resistance, coupled with spin torques, allows fluctuating devices to influence one another.
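The telegraph-like flipping described above can be sketched as a two-state simulation in which current-dependent escape rates bias the dwell times. This is a minimal illustrative model, not a description of a measured device: the Arrhenius-style rate expression and the parameters `rate0` and `i_c` are assumptions chosen only to reproduce the qualitative behavior (biased dwell times, and a flipping rate that peaks near zero current).

```python
import numpy as np

rng = np.random.default_rng(0)

def escape_rates(i, rate0=1e6, i_c=1.0):
    """Escape rates (Hz) out of the parallel (P) and antiparallel (AP)
    states. Spin torque from the normalized current i tilts the two
    effective barriers in opposite directions; rate0 and i_c are
    illustrative values, not measured device parameters."""
    out_of_p = rate0 * np.exp(-(1.0 - i / i_c))
    out_of_ap = rate0 * np.exp(-(1.0 + i / i_c))
    return out_of_p, out_of_ap

def simulate(i, t_total=1e-3):
    """Return (fraction of time in P, number of flips) over t_total seconds."""
    state, t, t_in_p, flips = 0, 0.0, 0.0, 0  # state 0 = P, 1 = AP
    while t < t_total:
        dwell = rng.exponential(1.0 / escape_rates(i)[state])
        if state == 0:
            t_in_p += min(dwell, t_total - t)  # clip the final dwell
        t += dwell
        if t < t_total:
            state = 1 - state
            flips += 1
    return t_in_p / t_total, flips
```

With these rates the total cycle time grows like cosh(i/i_c), so the simulated flipping rate is indeed maximal near zero current, while a finite bias current tilts the time balance toward one state.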
In the brain, some neurons emit spikes in seemingly random patterns, encoding information in the average rate. Neuronal spikes operate with energies very close to the thermal limit; that is, normal thermal fluctuations can randomly create extra events or cause events to be lost. This suggests that computers designed to operate close to the thermal limit could consume far less power, provided they are made resilient to the resulting thermal noise. Inspired by this, we are exploring how superparamagnetic tunnel junctions can be used in such noise-resilient computing schemes.
One approach we are taking is referred to as population coding. The flipping rate of a magnetic tunnel junction as a function of current looks very similar to the response of some neurons to external stimuli. This suggests that the value of an external stimulus can be efficiently encoded in the flipping rates of a population of magnetic tunnel junction neurons. We have shown how this kind of approach can lead to very robust and energy-efficient computing for tasks like guiding a robotic arm toward an observed target.
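The population-coding idea can be sketched numerically: many junctions share a common tuning curve relating stimulus to flipping rate, each contributes a noisy flip count, and averaging over the population recovers the stimulus. The sigmoidal tuning curve, the Poisson flip statistics, and all parameter values below are illustrative assumptions, not fitted device data.

```python
import numpy as np

rng = np.random.default_rng(1)

def tuning_rate(stimulus, gain=50.0, offset=10.0):
    """Mean flipping rate (flips/s) of one junction as a sigmoidal
    function of the stimulus, loosely mimicking a neural tuning curve.
    gain and offset are illustrative parameters."""
    return offset + gain / (1.0 + np.exp(-stimulus))

def encode(stimulus, n_devices=200, window=0.1):
    """Each device contributes a Poisson-distributed flip count
    observed over the time window (seconds)."""
    return rng.poisson(tuning_rate(stimulus) * window, size=n_devices)

def decode(counts, window=0.1, gain=50.0, offset=10.0):
    """Estimate the stimulus by inverting the mean tuning curve."""
    mean_rate = counts.mean() / window
    p = np.clip((mean_rate - offset) / gain, 1e-6, 1 - 1e-6)
    return float(np.log(p / (1 - p)))
```

Even though each individual junction is noisy, the population average is reliable, which is the robustness the text refers to: losing or corrupting a few devices shifts the decoded value only slightly.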
Another computational approach that can be implemented with superparamagnetic tunnel junctions is stochastic computing. In stochastic computing, all numbers lie between zero and one and are represented not as binary numbers but as random bitstreams generated with a probability corresponding to the number they represent. We have designed a low-energy bitstream generator based on superparamagnetic tunnel junctions. It not only consumes less energy than an equivalent generator built from CMOS circuitry alone; it also produces truly random bitstreams, whereas purely CMOS generators can produce only pseudorandom ones. We have shown how such bitstream generators can be combined with CMOS circuitry to carry out stochastic computing that identifies handwritten digits effectively and efficiently (see Fig. 3).
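A standard feature of stochastic computing, which makes it attractive for low-energy hardware, is that arithmetic becomes trivial on bitstreams: multiplying two numbers requires only a single AND gate. The sketch below illustrates this; a software pseudorandom generator stands in for the superparamagnetic junction (plus comparator) that would produce the bits in hardware.

```python
import numpy as np

rng = np.random.default_rng(2)

def bitstream(p, n=100_000):
    """Bitstream in which each bit is 1 with probability p. A
    pseudorandom generator stands in here for the hardware source."""
    return rng.random(n) < p

# Multiplication in stochastic computing: AND two independent
# bitstreams, and the fraction of 1s approximates the product.
a = bitstream(0.6)
b = bitstream(0.5)
estimate = (a & b).mean()  # close to 0.6 * 0.5 = 0.3
```

The precision of the result scales with the bitstream length, so accuracy can be traded directly for time and energy, a flexibility that conventional binary arithmetic does not offer.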
Some collections of neurons spike in a periodic pattern in response to certain stimuli. The coupling between such oscillations and sympathetic oscillations in other parts of the brain is thought to be an efficient way of processing sets of stimuli. This behavior suggests that it might be efficient to compute with sets of connected oscillators. Such sets form recurrent networks, that is, networks whose output depends not only on the input but also on the current state of the network. Such networks can be extremely efficient processors for time-dependent tasks, like speech recognition.
Appropriately shaped magnetic tunnel junctions oscillate when current flows through them. Because they can be scaled to nanometer dimensions, such oscillators could be among the most energy-efficient and compact available in an integrated circuit. We have demonstrated that such nanoscale oscillators can act as recurrent networks by using them in a demonstration of reservoir computing. With the development of efficient ways to couple groups of such oscillators, they could form the basis of extremely efficient computers for tasks like voice and video recognition.
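The reservoir-computing scheme mentioned above can be sketched in software with an echo-state-style model: a fixed random recurrent network (standing in for coupled oscillators) is driven by the input, and only a linear readout is trained. Everything below, including the network size, the spectral-radius scaling, and the delayed-recall task, is an illustrative assumption, not our hardware demonstration.

```python
import numpy as np

rng = np.random.default_rng(3)

# Fixed random recurrent network plus a trained linear readout.
N = 100
W = rng.normal(0.0, 1.0, (N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # spectral radius 0.9
w_in = rng.normal(0.0, 1.0, N)

def run_reservoir(u):
    """Drive the reservoir with the input sequence u; return all states."""
    x = np.zeros(N)
    states = []
    for u_t in u:
        x = np.tanh(W @ x + w_in * u_t)
        states.append(x.copy())
    return np.array(states)

# Only the readout is trained (ridge regression); the task is to recall
# the input from 3 steps earlier, a standard short-term-memory test that
# probes the state-dependence described in the text.
u = rng.uniform(-1.0, 1.0, 2000)
X = run_reservoir(u)
y = np.roll(u, 3)                # target: delayed copy of the input
X_tr, y_tr = X[100:], y[100:]    # discard the initial transient
w_out = np.linalg.solve(X_tr.T @ X_tr + 1e-6 * np.eye(N), X_tr.T @ y_tr)
pred = X_tr @ w_out
```

Because the recurrent weights are never trained, the heavy hardware (the oscillator network) can stay fixed and imperfect; only the cheap linear readout must be learned, which is what makes the scheme attractive for nanodevice implementations.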