ω

(definition)

Definition: A theoretical measure of the execution of an algorithm, usually the time or memory needed, given the problem size n, which is usually the number of items. Informally, f(n) = ω(g(n)) means that g(n) becomes insignificant relative to f(n) as n goes to infinity.
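
For concreteness, a small worked illustration in LaTeX (the pair f(n) = n² and g(n) = n is a hypothetical choice, not part of this entry): the ratio g(n)/f(n) tends to zero, which is the sense in which g(n) becomes insignificant relative to f(n).

    % Illustrative pair: f(n) = n^2, g(n) = n. Their ratio vanishes.
    \[
      \lim_{n \to \infty} \frac{g(n)}{f(n)}
      = \lim_{n \to \infty} \frac{n}{n^{2}}
      = \lim_{n \to \infty} \frac{1}{n}
      = 0,
      \qquad \text{so } n^{2} = \omega(n).
    \]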

Formal Definition: f(n) = ω(g(n)) means that for any positive constant c, there exists a constant k such that 0 ≤ cg(n) < f(n) for all n ≥ k. The value of k must not depend on n, but may depend on c.
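
To see how the constants in the formal definition play out, here is a short check of the same hypothetical pair f(n) = n² and g(n) = n; the witness k = c + 1 is one convenient choice, and it depends on c but not on n, as the definition requires.

    % For any positive constant c, the choice k = c + 1 works:
    \[
      n \ge k = c + 1 \;\Longrightarrow\; n > c
      \;\Longrightarrow\; 0 \le c\,g(n) = c\,n < n \cdot n = n^{2} = f(n),
    \]
    \[
      \text{hence } n^{2} = \omega(n).
    \]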

See also Ω(n), little-o notation, big-O notation.

Note: This is the lower-case Greek letter omega, as distinct from the upper-case Ω of big-Ω notation.

Author: PEB



Entry modified 29 November 2004.

Cite this as:
Paul E. Black, "ω", in Dictionary of Algorithms and Data Structures [online], Paul E. Black, ed. 29 November 2004. (accessed TODAY) Available from: https://www.nist.gov/dads/HTML/omega.html