big oh

In the study of computational complexity, functions can be ranked according to how fast they grow.

Consider the polynomials, such as x, x², x³, ... Here the variable x is in the base.

Faster growing than the polynomials are the exponential functions, like eˣ. Here the variable is in the exponent. If you remember your calculus, the nth derivative of an nth-degree polynomial is always a constant, while the nth derivative of eˣ is always eˣ, regardless of how large n is.
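That calculus fact is easy to check by machine. Here is a minimal Python sketch (my own illustration) that stores a polynomial as its coefficient list and differentiates it repeatedly:

```python
# A polynomial is stored as its coefficient list [c0, c1, c2, ...],
# where c_k is the coefficient of x^k.  One differentiation pass maps
# c_k x^k to k*c_k x^(k-1), dropping the degree by one.
def differentiate(coeffs):
    return [k * c for k, c in enumerate(coeffs)][1:]

p = [1, -2, 0, 4]            # 4x^3 - 2x + 1, a 3rd-degree polynomial
for _ in range(3):
    p = differentiate(p)
print(p)                     # the 3rd derivative: the constant [24]
```

By contrast, no number of differentiations flattens eˣ, which is why it eventually outruns every polynomial.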

Faster growing than the exponential functions are those of the form xˣ. Here the variable is in both the base and the exponent. A close relative is the factorial function x!, which gets very big, very fast; by Stirling's approximation it grows roughly like (x/e)ˣ.

At the other extreme, the logarithmic functions grow more slowly than any polynomial. And functions of the form log(log x) grow more slowly still, and so on.
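To see the whole hierarchy at once, here is a small Python sketch (illustrative only) that tabulates one representative of each growth class at a few sample points:

```python
import math

# One representative per growth class, slowest to fastest:
# log x, a polynomial, an exponential, the factorial, and x^x.
for x in [2, 5, 10, 20]:
    print(f"x={x:>2}  log x={math.log(x):6.2f}  x^3={x**3:>6}  "
          f"e^x={math.exp(x):12.4g}  x!={math.factorial(x):.4g}  "
          f"x^x={x**x:.4g}")
```

Already at x = 20 the ordering log x < x³ < eˣ < x! < xˣ spans more than twenty orders of magnitude.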

To measure how fast a function grows, we use a tool called big-O notation: we write f(x) ∈ O(g(x)), read as "f is big Oh of g".


Let f(x) and g(x) be defined on some subset of ℝ.
Then f(x) ∈ O(g(x)) as x → ∞ iff ∃x₀ > 0, ∃M > 0,
such that |f(x)| ≤ M|g(x)| for all x > x₀.
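The definition can be probed numerically. The following Python sketch (the function name and sampling scheme are my own) checks the inequality |f(x)| ≤ M|g(x)| at sample points beyond x₀. A finite check cannot prove membership, but it can refute a bad guess for M and x₀:

```python
# Test |f(x)| <= M*|g(x)| on a grid of sample points x > x0.
def looks_big_o(f, g, M, x0, samples=1000, x_max=1e6):
    step = (x_max - x0) / samples
    return all(abs(f(x0 + k * step)) <= M * abs(g(x0 + k * step))
               for k in range(1, samples + 1))

# 3x + 7 appears to be O(x) with M = 10, x0 = 1 ...
print(looks_big_o(lambda x: 3*x + 7, lambda x: x, 10, 1))    # True
# ... while x^2 quickly escapes even a generous constant:
print(looks_big_o(lambda x: x*x, lambda x: x, 1000, 1))      # False
```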

Example: Let f(x) = ax + b and g(x) = x.

By the triangle inequality, |ax + b| ≤ |ax| + |b| = |a|x + |b| for all x > 0. Select x₀ = 1. Then for all x > 1,

|ax + b| ≤ |a|x + |b|

≤ |a|x + |b|x

= (|a| + |b|)x.

So the definition is satisfied with x₀ = 1 and M = |a| + |b|, and therefore ax + b ∈ O(x).
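For concreteness, the bound in this worked example can be spot-checked in Python with sample values a = -5 and b = 3 (chosen arbitrarily):

```python
a, b = -5, 3
M = abs(a) + abs(b)      # the constant from the derivation: M = 8
# |ax + b| <= (|a| + |b|)x should hold for every x > x0 = 1:
assert all(abs(a * x + b) <= M * x for x in range(2, 100_000))
print("bound holds with M =", M)
```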
