Big Oh And Relations


In an earlier post we saw that the function $f(n)=n$ dominates $g(n)=n-\ln(n)$, but that's no surprise. What does come as a surprise is that $g$ also dominates $f$. So even though $f(n)>g(n)$ for every value of $n$ greater than $1$, we still say that $g$ dominates $f$.

That feels very unexpected, and as a result you might question the value of the concept of "dominates", so let's pause and take a moment to get a handle on what's going on.

Firstly, the fact that $b^n$ dominates $n^k$ for any $b>1$ and $k>0$ feels right and reasonable. After all, we talk about "Exponential Growth" and know that it's fast. Really fast. So the fact that exponential growth dominates polynomial growth is expected.

And it's also reasonable that polynomial growth never dominates exponential growth. Stands to reason. There's a hierarchy, and exponential growth is "higher up" than polynomial growth.

But what about $f$ and $g$ above? How can each "dominate" the other in this technical sense? We've looked at the technical definition, and according to it $n-\ln(n)$ dominates $n$, which in turn dominates $n-\ln(n)$. How can we get a handle on that?
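One way to get a feel for this is numerically. Here is a minimal sketch (the function names are mine, and it assumes the usual definition: $f$ dominates $g$ when $g(n)\le C\cdot f(n)$ for some constant $C$ and all large enough $n$). Both ratios stay bounded, which is exactly what mutual domination requires:

```python
import math

def f(n):
    return n

def g(n):
    return n - math.log(n)

# "f dominates g" means that for some constant C, g(n) <= C * f(n)
# for all sufficiently large n.  Here BOTH ratios stay bounded -- in
# fact both tend to 1, because ln(n) grows so much more slowly than n
# -- so each function dominates the other.
for n in [10, 100, 10_000, 1_000_000]:
    print(n, g(n) / f(n), f(n) / g(n))
```

So while $f(n)>g(n)$ pointwise, neither function ever pulls more than a constant factor ahead of the other, and that is all "dominates" asks about.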

In short, we are asking about the relationships between functions.

Equivalent functions

From our previous post we have the idea of a "Mathematical Relation", and in particular, we have the concept of an "Equivalence Relation." Let's define a particular relation between functions. We'll say that given two functions, $f$ and $g$, then the pair $(f,g)$ is in our relation if $f$ dominates $g$ and $g$ dominates $f$.

We'll call our relation $R$. As an example:

  • The pair $(n^2,n^2+\ln(n))$ is in $R$;
  • The pair $(n^2,n^2-\ln(n))$ is in $R$.
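A rough numerical check, not a proof, can make this concrete. In this sketch the helpers `dominates` and `in_R`, the sample points, and the constant are all my own invention, standing in for the real limit-based definition:

```python
import math

def dominates(f, g, ns=(10, 100, 10_000), C=10):
    # Rough numerical stand-in (not a proof!) for "f dominates g":
    # check that g(n) <= C * f(n) at each sample point.
    return all(g(n) <= C * f(n) for n in ns)

def in_R(f, g):
    # (f, g) is in R when f dominates g AND g dominates f.
    return dominates(f, g) and dominates(g, f)

sq       = lambda n: n * n
sq_plus  = lambda n: n * n + math.log(n)
sq_minus = lambda n: n * n - math.log(n)

print(in_R(sq, sq_plus))           # (n^2, n^2 + ln(n)) is in R
print(in_R(sq, sq_minus))          # (n^2, n^2 - ln(n)) is in R
print(in_R(sq, lambda n: n**3))    # but (n^2, n^3) is not
```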

Properties of $R$

The first thing to note is that $R$ is Reflexive. For any function $f$, $f$ dominates $f$, so $(f,f)$ is in $R$.

The second thing to note is that it is symmetric. If $(f,g)$ is in our relation then $f$ dominates $g$ and $g$ dominates $f$. Therefore $g$ dominates $f$ and $f$ dominates $g$, and so $(g,f)$ is in $R$.

Finally, we observe that $R$ is transitive. If $(f,g)$ is in $R$, and $(g,h)$ is in $R$, then $(f,h)$ is in $R$. We'll leave the confirmation of this as an exercise for the mythical interested reader.

So $R$ is reflexive, symmetric, and transitive, and hence is an "Equivalence Relation". If the pair $(f,g)$ is in $R$, then for the purposes of $R$ they are equivalent.

But what is the purpose of $R$?

Equivalence Classes of Functions

Since we have an Equivalence Relation between functions, we can now talk about the Equivalence Classes. So choose some function, say $n^2$, and we can talk about all the other functions that are equivalent to it under our relation $R$. This will include:

  • $n^2+100n$;
  • $n^2+\ln(n)$;
  • $6n^2-3\ln(n)$;

and many more. As you explore them you find that the thing they all have in common is the term $n^2$ (with caveats, see the aside below), possibly with a constant multiple (well, always with a constant multiple if you remember that $n^2$ is actually $1$ times $n^2$), and then some collection of more slowly growing functions added and subtracted at will.

It will, by now, come as no surprise to you that it's more complicated than this. Consider the function $n^2(\cos(n)+2)$, for example. This dominates and is dominated by $n^2$, and yet it does not have a "leading term of $n^2$".

Even so, we can observe that within the equivalence class that contains $n^2$, every other function requires more symbols to define it, so with the concept of "simplest possible" we can justify using $n^2$ as the canonical representative.

But we don't really have to, provided we understand that all these functions are equivalent under this relation. Even so, writing $n^2$ is succinct.

My thanks to Philipp Reinhard for the example above.

So we can ask for a canonical representative of an Equivalence Class, and the obvious choice would seem to be this: express the function as a sum of terms, choose the most rapidly growing term, and ignore any constant coefficient.

So for each Equivalence Class we have a canonical representative that is, in some sense, as simple as possible. (Cue Kolmogorov Complexity Theory).
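As a sketch of that recipe (the term table and function names here are invented for illustration, and the comparison is done crudely at one large sample point rather than by taking limits), given a function presented as a sum of (coefficient, term) pairs we can pick out the most rapidly growing term and discard its coefficient:

```python
import math

# A toy table of growth terms; the comparison below is purely numeric.
TERMS = {
    "1":     lambda n: 1,
    "ln(n)": lambda n: math.log(n),
    "n":     lambda n: n,
    "n^2":   lambda n: n * n,
}

def canonical(summands, n=10**9):
    # summands is a list of (coefficient, term name) pairs.
    # Pick the term whose contribution is largest at a big sample
    # point, then throw the coefficient away.
    biggest = max(summands, key=lambda s: abs(s[0]) * TERMS[s[1]](n))
    return biggest[1]

print(canonical([(6, "n^2"), (-3, "ln(n)")]))   # 6n^2 - 3ln(n)  ->  n^2
print(canonical([(1, "n^2"), (100, "n")]))      # n^2 + 100n     ->  n^2
```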

The Hierarchy

Now let's consider the relationships between the Equivalence Classes.

Consider two Equivalence Classes, $E_1$ and $E_2$. For the sorts of functions that arise in algorithm analysis we can show that either the functions from $E_1$ dominate those from $E_2$, or vice versa. You can go through the details, but what we find is that, thinking of the Equivalence Classes as things in their own right, the relative dominance of their constituent functions is inherited by the Equivalence Classes, becoming a concept of dominance between them. As a result we have a total ordering on the Equivalence Classes.
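That total ordering is just what lets us sort functions by growth rate. The sketch below uses a crude numerical comparison at one large sample point, with thresholds of my own choosing, as a stand-in for the real limit-based comparison of classes:

```python
import math
from functools import cmp_to_key

def growth_cmp(f, g, n=10**6):
    # Crude numerical stand-in for comparing Equivalence Classes:
    # look at the ratio f(n)/g(n) at one large sample point.
    r = f(n) / g(n)
    if r > 10:
        return 1      # f's class sits above g's in the hierarchy
    if r < 0.1:
        return -1     # g's class sits above f's
    return 0          # same class: each dominates the other

reps = [lambda n: n**3, math.log, lambda n: n, lambda n: n * n]
ordered = sorted(reps, key=cmp_to_key(growth_cmp))
print([f(100) for f in ordered])   # values at n=100, slowest class first
```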

And so we come to the heart of the Big-Oh concept.

Putting it into action

Consider the problem SQR of squaring a given number. The "size" of the number is the number of digits (or bits; the two differ only by a constant factor, which won't matter here), and we have various algorithms for finding the square. For our example we will consider the naive multiplication algorithm that we learned in school, and we can ask: just how much work is that?

Getting technical, we would have to give a precise definition of what we mean by a "step" in the algorithm, but speaking somewhat loosely we can see that if we have $d$ digits, the central section of the multiplication will have $d^2$ entries, and then we have to do some additions in each of up to $2d-1$ columns, and so on. We can therefore see that the number of "steps" is some constant $k$ times $d^2$ plus some lower-order terms, so the answer is something like:

  • $kd^2$ plus other stuff that's smaller.

But we can ignore all the other stuff!

This is a function in some equivalence class, and we can use the canonical representative of that class, which is $d^2$. Let's call the true "time-required" function $M(d)$. It's really quite complicated, but we now know that $d^2$ dominates $M$, and $M$ dominates $d^2$. So we may as well say that the time-complexity of this algorithm is $d^2$.

And that's what we do. Writing $n$ for the size of the input, as is conventional, the "Big-Oh" complexity of this algorithm is $n^2$, and the algorithm is said to be $O(n^2)$.
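To see that quadratic count emerge, here's a toy implementation of my own, with an invented notion of "step": one unit per single-digit multiply, one per add into a column, and one per column when resolving carries. For a $d$-digit number it performs exactly $2d^2 + 2d$ steps:

```python
def schoolbook_multiply(x, y):
    # Multiply two non-negative integers digit by digit, counting the
    # elementary "steps": one per single-digit multiply, one per add
    # into a column, and one per column when resolving the carries.
    xd = [int(c) for c in str(x)][::-1]   # digits, least significant first
    yd = [int(c) for c in str(y)][::-1]
    steps = 0
    cols = [0] * (len(xd) + len(yd))
    for i, a in enumerate(xd):
        for j, b in enumerate(yd):
            cols[i + j] += a * b
            steps += 2                    # one multiply, one add
    result, carry = 0, 0
    for k, c in enumerate(cols):
        carry, digit = divmod(c + carry, 10)
        result += digit * 10 ** k
        steps += 1                        # one step per carry resolved
    return result, steps

n = 123456789                             # a 9-digit number
square, steps = schoolbook_multiply(n, n)
print(square == n * n, steps)             # steps = 2*9^2 + 2*9 = 180
```

Only the $2d^2$ term matters: the $2d$ carry steps, and the constant $2$, both vanish into the equivalence class whose canonical representative is $d^2$.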

(I'm getting set to write up some details of how we use these ideas as we analyse specific algorithms.)