**little-o notation** (definition)

**Definition:**
A theoretical measure of the execution of an *algorithm*, usually the time or memory needed, given the problem size n, which is usually the number of items. Informally, saying some equation f(n) = o(g(n)) means f(n) becomes insignificant relative to g(n) as n approaches infinity. The notation is read, "f of n is little oh of g of n".

**Formal Definition:** f(n) = o(g(n)) means for all c > 0 there exists some k > 0 such that 0 ≤ f(n) < cg(n) for all n ≥ k. The value of k must not depend on n, but may depend on c.
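An equivalent way to view the definition, when g(n) is positive, is that the ratio f(n)/g(n) tends to 0 as n grows. A minimal numeric sketch, using the example f(n) = 3n + 4 and g(n) = n² from the note below (the function names here are illustrative, not part of the entry):

```python
# Numeric illustration (not a proof): for f(n) = 3n + 4 and g(n) = n^2,
# the ratio f(n)/g(n) shrinks toward 0 as n grows, matching the limit
# characterization lim_{n -> inf} f(n)/g(n) = 0 of f(n) = o(g(n)).
def ratio(n):
    return (3 * n + 4) / (n * n)

for n in (10, 100, 1000, 10000):
    print(n, ratio(n))
```

Each tenfold increase in n shrinks the ratio by roughly a factor of ten, as expected for a linear function measured against a quadratic one.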

**Generalization** (I am a kind of ...)

*big-O notation*.

**See also**
*ω(n)*.

**Note:**
As an example, 3n + 4 is o(n²), since for any c > 0 we can choose k > (3 + √(9 + 16c))/(2c). However, 3n + 4 is not o(n). o(g(n)) is an upper bound, but not an asymptotically tight bound.
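The choice of k in the example comes from solving cn² − 3n − 4 > 0 for n with the quadratic formula. A small sketch checking that bound for several values of c (the helper names are illustrative):

```python
import math

def f(n):
    """The example function f(n) = 3n + 4."""
    return 3 * n + 4

def threshold(c):
    """k from solving c*n^2 - 3n - 4 > 0: the positive root of the quadratic."""
    return (3 + math.sqrt(9 + 16 * c)) / (2 * c)

# For each c, verify 3n + 4 < c * n^2 for a run of n strictly above k.
for c in (10.0, 1.0, 0.1, 0.01):
    k = threshold(c)
    n0 = math.floor(k) + 1  # first integer strictly greater than k
    for n in range(n0, n0 + 1000):
        assert f(n) < c * n * n
```

Note that as c shrinks, k grows: the inequality still holds eventually, just from a later starting point, which is exactly what "for all c > 0 there exists some k" requires.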

**Note:** this is a lowercase "o", not the lowercase Greek letter omicron. See the note at *big-O notation*.

Author: PEB

Little o is a Landau Symbol.

Entry modified 6 September 2019.

Cite this as:

Paul E. Black, "little-o notation", in
*Dictionary of Algorithms and Data Structures* [online], Paul E. Black, ed. 6 September 2019. (accessed TODAY)
Available from: https://www.nist.gov/dads/HTML/littleOnotation.html