
Layer ensemble averaging for fault tolerance in memristive neural networks

Published

Author(s)

Osama Yousuf, Brian Hoskins, Karthick Ramu, Mitchell Fream, William Borders, Advait Madhavan, Matthew Daniels, Andrew Dienstfrey, Jabez McClelland, Martin Lueker-Boden, Gina Adam

Abstract

Advancements in continual learning with artificial neural networks have been fueled in large part by scaling network dimensionalities. As this scaling continues, conventional computing systems are becoming increasingly inefficient due to the von Neumann bottleneck, and thus alternative in-memory computation architectures have become an important area of research. Emerging memory technologies such as memristors are a promising candidate to circumvent this problem, but device non-idealities hinder the performance of neural networks based on memristive crossbars compared to their software counterparts, especially in the context of continual learning inference at the edge. This work proposes and experimentally demonstrates layer ensemble averaging – a technique to map pre-trained neural network solutions from software to defective hardware crossbars of emerging memory devices. The approach is investigated in the context of a continual learning problem with a custom 20,000-device hardware prototyping platform, and its effectiveness is studied in simulation as well as in experiment using a defective resistive random-access memory (ReRAM) crossbar chip. Results highlight that, by trading off the number of devices required for layer mapping, layer ensemble averaging can reliably boost defective memristive network performance up to the software baseline. For the investigated continual learning problem, the average multi-task classification accuracy improves from 61% (lower than a linear solver) to 72% (within 1% of the software baseline) with hardware layer ensembles of size 3.
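The core idea of layer ensemble averaging, as described in the abstract, is to program the same pre-trained layer weights onto several independently defective crossbars and average their outputs, so that devices stuck in one copy are compensated by working devices in the others. The following is a minimal NumPy sketch of that idea, not the authors' implementation: the stuck-at-zero defect model, the defect rate, and all function names are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def map_to_defective_crossbar(w, stuck_fraction=0.2):
    """Toy defect model (assumption, not the paper's): a random
    fraction of devices is stuck at zero conductance."""
    w_hw = w.copy()
    w_hw[rng.random(w.shape) < stuck_fraction] = 0.0
    return w_hw

def ensemble_layer_output(x, w, ensemble_size=3, stuck_fraction=0.2):
    """Layer ensemble averaging: run the input through several
    independently defective copies of the layer and average."""
    outs = [x @ map_to_defective_crossbar(w, stuck_fraction)
            for _ in range(ensemble_size)]
    return np.mean(outs, axis=0)

# Toy comparison: one defective copy vs. an ensemble of size 3,
# measured against the ideal (software) layer output.
w = rng.standard_normal((64, 32))
x = rng.standard_normal((100, 64))
ideal = x @ w

err_single = np.linalg.norm(x @ map_to_defective_crossbar(w) - ideal)
err_ensemble = np.linalg.norm(ensemble_layer_output(x, w) - ideal)
print(f"single copy error: {err_single:.1f}, ensemble error: {err_ensemble:.1f}")
```

Averaging independent defect patterns reduces the random component of the error (roughly by the square root of the ensemble size), which is the device-count-for-accuracy trade-off the abstract refers to; a real mapping would also have to handle conductance quantization and programming noise.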
Citation

Nature Communications, Volume 16, Issue 1

Yousuf, O., Hoskins, B., Ramu, K., Fream, M., Borders, W., Madhavan, A., Daniels, M., Dienstfrey, A., McClelland, J., Lueker-Boden, M. and Adam, G. (2025), Layer ensemble averaging for fault tolerance in memristive neural networks, Nature Communications, [online], https://doi.org/10.1038/s41467-025-56319-6, https://tsapps.nist.gov/publication/get_pdf.cfm?pub_id=957710 (Accessed March 23, 2025)

Issues

If you have any questions about this publication or are having problems accessing it, please contact reflib@nist.gov.

Created February 1, 2025, Updated February 5, 2025