1. Overview

In this tutorial, we’ll study the difference between the lower and the tight bounds for algorithmic complexity.

2. Bachmann–Landau Notation

Asymptotic notation, also called Bachmann–Landau notation, is a common way to express the time or space complexity of an algorithm. Its basic idea is that the behavior of a function or an algorithm f(x), as its input x tends to some limit, becomes predictable, and so does its relationship with other asymptotic functions.

When we study algorithms, not functions, we simply replace x with n. The latter then corresponds to the size of the input that’s provided to the algorithm. In that case, we generally study the asymptotic behavior of the algorithm as n approaches +\infty, in relation to some other algorithms or functions.
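For example, consider the function f(n) = 3n^2 + 5n + 7. As n approaches +\infty, the quadratic term dominates the others:

\lim_{n \to +\infty} \frac{3n^2 + 5n + 7}{n^2} = 3

Since this ratio tends to a finite, positive constant, f(n) behaves like n^2 up to a constant factor. This is exactly the kind of relationship between functions that asymptotic notation captures.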

3. Size of the Input and Bounds

The algorithm’s input normally consists of a list, a database, a variable, a graph, or some other type of data structure. In relation to its size, we want to know whether there’s a maximum or a minimum limit on the time the algorithm takes to execute on a given input. We call these limits:

  • lower bound, the minimum guaranteed time
  • upper bound, the maximum guaranteed time

Simple programs with simple inputs have computational times and corresponding limits that are easy to study. For them, we can learn their actual computational time through experimentation and thus form an idea of their lower and upper bounds. For more complex programs, however, this isn’t necessarily the case.
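As a minimal sketch of such an experiment in Java, where the sumArray method is just a hypothetical program under study, we can time a simple pass over an array for growing input sizes:

    public class TimingExperiment {

        // Hypothetical program under study: one pass over the input
        static long sumArray(long[] input) {
            long sum = 0;
            for (long value : input) {
                sum += value;
            }
            return sum;
        }

        public static void main(String[] args) {
            for (int n = 1_000_000; n <= 8_000_000; n *= 2) {
                long[] input = new long[n];
                long start = System.nanoTime();
                sumArray(input);
                long elapsed = System.nanoTime() - start;
                System.out.printf("n = %,d -> elapsed = %,d ns%n", n, elapsed);
            }
        }
    }

If doubling n roughly doubles the measured time, we can conjecture a linear bound. Such measurements are only empirical evidence, though: they’re noisy and sensitive to warm-up and hardware, so they suggest bounds rather than prove them.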

Of course, the simplest possible program of all is the empty program, which runs in no time regardless of the size of the input. For an empty program, the least and the most time required, as a function of the input’s size, coincide perfectly. For all non-trivial cases, though, this perfect correspondence isn’t guaranteed, and we therefore have to distinguish between the minimum and the maximum guaranteed computational times.

4. Big-O Notation

The Big-O notation defines the upper bound of an algorithm. If an algorithm has an upper bound of \textbf{O(f(n))}, this means that it’s guaranteed to execute in at most some constant times \textbf{f(n)}, even in the worst-case scenario.
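More formally, an algorithm whose running time is T(n) is upper-bounded by O(f(n)) if there exist a constant c > 0 and a threshold n_0 such that:

T(n) \leq c \times f(n) \text{ for all } n \geq n_0

In other words, beyond some input size, the running time never exceeds a constant multiple of f(n).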

As an example, the time complexity of merge sort is O(n \log n). Heap sort also has a complexity of O(n \log n), which makes the two algorithms equivalent in the worst case. Linear search, by contrast, has a computational complexity of O(n), which makes it asymptotically faster than the previous two.
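As a minimal Java sketch of the last case, here’s linear search over an array. In the worst case, when the target is absent or sits in the last position, the loop performs n comparisons, which matches the O(n) bound:

    // Returns the index of target in array, or -1 if it's absent.
    // Worst case: n comparisons, hence the O(n) upper bound.
    static int linearSearch(int[] array, int target) {
        for (int i = 0; i < array.length; i++) {
            if (array[i] == target) {
                return i; // best case: target found at index 0
            }
        }
        return -1;
    }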

Notice, however, that if an algorithm has an upper bound of \textbf{O(f(n))}, there’s no guarantee that \textbf{f(n)} is the tightest possible upper bound on its computational time. In fact, there may be an approximation that’s even better than the one we currently use.

For example, if a certain algorithm is upper-bounded by O(n^{10}), we can also say that it’s upper-bounded by O(n^{100}). The former, understandably, is a much better approximation than the latter, but the statement is still valid.

5. Big Omega Notation

At the opposite end, we find the lower bound, which we indicate with the Big Omega symbol. We say that an algorithm has minimal computational time \boldsymbol{\Omega}\textbf{(f(n))} if its execution requires at least some constant times \textbf{f(n)} operations.
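Symmetrically to the definition of Big-O, an algorithm with running time T(n) is lower-bounded by \Omega(f(n)) if there exist a constant c > 0 and a threshold n_0 such that:

T(n) \geq c \times f(n) \text{ for all } n \geq n_0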

For example, all algorithms have a trivial minimum computational time of \Omega(0). Also, linear search has a best-case scenario of \Omega(1), which happens if the element we’re searching for is the first in the list: in the linearSearch sketch above, a single comparison suffices.

Instead, parsing all characters in a string of size n requires \Omega(n) operations. This is because the termination condition is reached only when the last character has been read. Parsing also requires at most n operations, but more on that next.
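To make this concrete, here’s a minimal Java sketch, with a hypothetical countVowels method standing in for any parse that must visit every character. The loop performs exactly n iterations for a string of size n, no more and no fewer:

    // Counts vowels by visiting each of the n characters exactly once:
    // at least n operations (Omega(n)) and at most n operations (O(n)).
    static int countVowels(String input) {
        int count = 0;
        for (int i = 0; i < input.length(); i++) {
            if ("aeiou".indexOf(input.charAt(i)) >= 0) {
                count++;
            }
        }
        return count;
    }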

6. Big Theta Notation

Finally, on the basis of the definitions for O and \Omega, we can also define the Big Theta notation \Theta. An algorithm has computational complexity \Theta(f(n)) if and only if its lower bound complexity is \Omega(f(n)) and its upper bound is O(f(n)). In formal notation, we can then say that:

T(n) \in \Theta(f(n)) \leftrightarrow (T(n) \in \Omega(f(n)) \wedge T(n) \in O(f(n)))
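Equivalently, combining the two definitions above, T(n) is \Theta(f(n)) if there exist constants c_1, c_2 > 0 and a threshold n_0 such that:

c_1 \times f(n) \leq T(n) \leq c_2 \times f(n) \text{ for all } n \geq n_0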

We noted in the previous section that parsing all characters in a string of size n is lower-bounded by \Omega(n) and upper-bounded by O(n). As a consequence, we can say that this algorithm is tightly bound. Therefore, we can simply say that parsing a string has a computational complexity of \Theta(n).

7. Conclusion

In this article, we studied the difference between the \Omega notation for lower bounds and the \Theta notation for tight bounds.

A tight bound implies that both the lower and the upper bound for the computational complexity of an algorithm are the same.
