Temporary Work Space

 Ignore this page, it's temporary working, to be integrated into other pages, or deleted.

So really, every function that dominates $f(n)=n$ grows at least as fast as it does, and in a very real sense that is the best we can do. In particular, the linear function is dominated by every one of those other functions! In other words, of all the upper bounds (in this sense), the linear function is the "smallest."

(We are using the option given to us of multiplying by a suitable constant; after scaling, the dominating function is always at least as large.)

In words, for this (hypothetical) algorithm, the time taken is bounded above by a linear function, and the linear function is the "best" such upper bound.

That is why we say that this algorithm takes linear time.
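As a sketch of the idea (this example is not from the original text), we can model a hypothetical linear-time algorithm by counting its basic operations and checking that, for a suitable constant $c$, the count never exceeds $c \cdot n$:

```python
def linear_scan_steps(n):
    """Count the basic operations of a hypothetical linear scan over n items."""
    steps = 0
    for _ in range(n):
        steps += 1  # one basic operation per item
    return steps

# Any constant c >= 1 witnesses the linear bound for this simple model.
c = 2
for n in [1, 10, 100, 1000]:
    assert linear_scan_steps(n) <= c * n
```

Of course, a finite check like this only illustrates the bound; the claim "takes linear time" is about all sufficiently large $n$.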

In summary

So what have we ended up with?

This can be summarised as "sketch how long we think the algorithm takes." Given a problem, given an algorithm for that problem, and given an analysis of that algorithm, we can look at its worst-case behaviour as a function of the size of the instance.

Look at the long-term behaviour of that function.

What we want is to find the slowest growing function that dominates (in the technical sense) the algorithm's behaviour.
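The technical sense of "dominates" used here is: $f$ dominates $g$ when there exist constants $c$ and $n_0$ with $g(n) \le c \cdot f(n)$ for all $n \ge n_0$. A small numeric check (an illustration, not a proof, since it only samples finitely many $n$) might look like this:

```python
def seems_dominated(g, f, c, n0, n_max=10_000):
    """Check g(n) <= c * f(n) for n0 <= n < n_max (evidence, not proof)."""
    return all(g(n) <= c * f(n) for n in range(n0, n_max))

# n is dominated by n^2 (take c = 1, n0 = 1) ...
print(seems_dominated(lambda n: n, lambda n: n * n, c=1, n0=1))    # True
# ... but n^2 is not dominated by n: even c = 100 fails once n > 100.
print(seems_dominated(lambda n: n * n, lambda n: n, c=100, n0=1))  # False
```

This asymmetry is exactly why, among the functions dominating a linear-time algorithm, the linear one is the "smallest."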

That is the time-complexity of our analysis of that algorithm.

Usually we are less precise, less rigorous, less pedantic, and we say that it's "the time-complexity of the algorithm," because usually our analysis is "close enough".

So when we say that an algorithm is $O(n^2)$ what we mean is this:

• In the long run the time taken is bounded by $c\cdot{n^2}$ for some constant $c$;
• We don't know of any slower-growing function that is certain to be an upper bound for the running time.
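The two bullet points can be illustrated with a standard quadratic-time algorithm (this example is my own, using selection sort as a stand-in): its comparison count is bounded by $c \cdot n^2$, while any fixed linear bound $c \cdot n$ eventually fails.

```python
def selection_sort_comparisons(n):
    """Comparisons made by selection sort on n items: always n*(n-1)/2."""
    items = list(range(n, 0, -1))  # the count is the same for any input
    comparisons = 0
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            comparisons += 1
            if items[j] < items[i]:
                items[i], items[j] = items[j], items[i]
    return comparisons

c = 1
for n in [10, 100, 500]:
    assert selection_sort_comparisons(n) <= c * n * n  # the quadratic bound holds
# A linear bound with c = 100 already fails at n = 500 (124750 > 50000).
assert selection_sort_comparisons(500) > 100 * 500
```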

And that, in a nutshell, is an introduction to the "Big Oh" notation.