
Gaussian Variant of Freivalds' Algorithm for Efficient and Reliable Matrix Product Verification

Published

Author(s)

Michael V. Mascagni, Hao Ji, Yaohang Li

Abstract

In this article, we consider the general problem of checking the correctness of matrix multiplication. Given three n×n matrices A, B, and C, the goal is to verify that A×B = C without carrying out the computationally costly matrix multiplication and comparing the product A×B with C term by term. This is especially important when some or all of these matrices are very large, and when the computing environment is prone to soft errors. Here we extend Freivalds' algorithm to a Gaussian Variant of Freivalds' Algorithm (GVFA) by projecting the product A×B as well as C onto a Gaussian random vector and then comparing the resulting vectors. The computational complexity of GVFA is consistent with that of Freivalds' algorithm, which is O(n²). However, unlike Freivalds' algorithm, whose probability of a false positive is 2⁻ᵏ, where k is the number of iterations, our theoretical analysis shows that when A×B ≠ C, GVFA produces a false positive only on a set of inputs of measure zero with exact arithmetic. When we introduce round-off error and floating-point arithmetic into our analysis, we can show that the larger this error, the higher the probability that GVFA avoids false positives. Moreover, by iterating GVFA k times, the probability of a false positive decreases as pᵏ, where p is a very small value depending on the nature of the fault in the result matrix and the floating-point precision of the arithmetic system. Unlike with deterministic algorithms, there are no fault patterns that are completely undetectable by GVFA. Thus GVFA can be used to provide efficient fault tolerance in numerical linear algebra, and it can be efficiently implemented on modern computing architectures.
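As a rough illustration of the verification step described in the abstract, the following is a minimal NumPy sketch (not the authors' implementation): it projects both A×B and C onto a Gaussian random vector using only matrix-vector products, so each trial costs O(n²). The tolerance parameter and the relative comparison via numpy.allclose are illustrative assumptions, since the paper's floating-point analysis is more detailed.

```python
import numpy as np

def gvfa_verify(A, B, C, k=1, rtol=1e-8, rng=None):
    """Sketch of a Gaussian variant of Freivalds' check: does A @ B equal C?

    Each iteration uses three matrix-vector products (O(n^2) work) instead
    of a full O(n^3) matrix multiplication. `rtol` is an illustrative
    tolerance, not a value taken from the paper.
    """
    rng = np.random.default_rng() if rng is None else rng
    n = A.shape[0]
    for _ in range(k):
        w = rng.standard_normal(n)   # Gaussian random projection vector
        lhs = A @ (B @ w)            # project A*B without forming A @ B
        rhs = C @ w                  # project C
        if not np.allclose(lhs, rhs, rtol=rtol):
            return False             # discrepancy detected: A @ B != C
    return True                      # no discrepancy found in k trials
```

In this sketch, repeating the check k times with independent Gaussian vectors plays the role of the k iterations discussed above, driving the false-positive probability down as pᵏ.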
Citation
SIAM Journal on Computing
Volume
abs/1705.10449

Keywords

Fault-tolerance, Algorithmic Resilience, Gaussian Variant of Freivalds' Algorithm, Matrix Multiplication, Gaussian Random Vector, Failure Probability

Citation

Mascagni, M., Ji, H. and Li, Y. (2017), Gaussian Variant of Freivalds' Algorithm for Efficient and Reliable Matrix Product Verification, SIAM Journal on Computing, [online], https://tsapps.nist.gov/publication/get_pdf.cfm?pub_id=918706 (Accessed November 3, 2024)

Issues

If you have any questions about this publication or are having problems accessing it, please contact reflib@nist.gov.

Created May 29, 2017, Updated September 25, 2020