
Big-oh is not a total order

by mskala

Anyone who's studied computer science should be familiar with the Landau asymptotic notation, or "big-oh" notation. But how well do you really know it?

Given two functions it seems intuitive that at least one ought to be big-oh of the other; in some cases maybe they are both big-oh of each other, in which case they are theta of each other and in a certain sense can be said to be equivalent. It sure feels like a total order, with a place for everything and everything in its place, and it is a total order on all the kinds of functions we normally care about. I've certainly marked down plenty of students for incorrectly claiming that n log n can't be compared either way to things like n and n^2.

But today I had occasion to wonder if it's really true that O() induces a total order on all functions for which it is applicable. I looked on the Net and couldn't find anyone discussing this issue; I did find this archived final exam in which someone asked it of students as a true/false question and apparently was looking for the answer "true." Unfortunately, the correct answer is "false."

Counterexample below the cut. If you've got a computer science background you might want to think about it a bit yourself before going further. It shouldn't really be hard if you look carefully at the definition.

f(n) = 1 + n + n(-1)^n

g(n) = 1 + n - n(-1)^n

f(n) ≠ O(g(n))

g(n) ≠ O(f(n))

These two functions alternate between being equal to 1 and being equal to 2n+1. The function f is 1 for odd n and 2n+1 for even n, and the function g is the reverse. The gap between them grows arbitrarily large as n grows, and its direction alternates. No matter what nonnegative constants you multiply the functions by, each will exceed the other half the time when n gets large enough.
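Here's a quick numeric check, if you want to see it happen (a minimal Python sketch; the code and names are mine):

    def f(n): return 1 + n + n * (-1)**n
    def g(n): return 1 + n - n * (-1)**n

    for n in range(1, 9):
        print(n, f(n), g(n))
    # prints 1 1 3 / 2 5 1 / 3 1 7 / 4 9 1 / ...: whatever constant you
    # multiply g by, f exceeds the result at all large even n, and
    # vice versa at all large odd n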

Do note that each of my functions is defined for all positive integer inputs, and for positive integer inputs the function values are always positive integers. I'm not cheating with weird functions here. If you wanted them to have positive real values for all positive real inputs then you might have to play with the definitions a bit, because my definitions above will end up producing imaginary numbers for some non-integer values of n, but it's easy to see how you could use a scaled sine wave to interpolate the definitions above with something that would be everywhere continuous, positive, and real. And many people's definitions of O() only care about positive integers anyway.
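For instance, one such interpolation (a sketch, and cos is just a shifted sine) is 1 + x + x cos(pi x):

    import math

    def f_cont(x):
        # 1 + x + x*cos(pi*x): agrees with f at integers, because
        # cos(pi*n) = (-1)^n, and stays >= 1 for all x >= 0, because
        # x*(1 + cos(pi*x)) >= 0 there
        return 1 + x + x * math.cos(math.pi * x)

    for x in (1, 1.5, 2, 2.5, 3):
        print(x, f_cont(x))   # 1.0, 2.5, 5.0, 3.5, 1.0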

This issue isn't directly discussed by Cormen, Leiserson, Rivest, and Stein (Introduction to Algorithms, Third Edition; it's a standard text for this kind of thing and the one I chose for a course I recently taught.) Their Problem 3-5 seems to be aimed at the question of total ordering, however, because it talks about an alternative definition for big-omega that helps force functions into a total order. I also checked Knuth (The Art of Computer Programming, volume 1) and was a little surprised to find that he doesn't seem to discuss the question of whether big-oh is a total order at all.

So there you have it. I'm not sure it's really all that big a deal - anyone with a reason to care strongly about whether big-oh is a total order should be able to figure it out for themselves easily enough once they think about it - but I, at least, had never had a reason to think about it seriously until today. My students are probably lucky I didn't, because I might have put it on the COMP 3170 final.

The question I was actually interested in before I got onto this one was the slightly more complicated question of how many distinct theta-classes of functions exist. It's easy to see that there are as many as there are real numbers, but when we start throwing in things like logs, can we make them more numerous than that? The answer seems to be that we can construct a distinct class for every function we allow as input, but don't cite me on that because I haven't finished formalizing the proof. It might make a good final exam question for a course a bit more advanced than 3170.

15 comments

*
Doesn't big-O notation just care about least upper bounds provided by elementary functions? Your f and g are both O(n).

I think the intuition built into big-O notation assumes that computational complexity isn't simply positive, but increasing.
T. Rev - 2011-07-21 15:09
*
They are both O(n), but the important question is whether they are O() of each other. For any given function there are an unlimited number of bigger functions - for instance, n is O(n) but is also O(n log n), O(n^2), and O(n^3). The point that interests me is the question of whether O() can always fit functions into a straightforward sequence where any two are always either in the same class, or one in a bigger class than the other. The fact it's not a total order says there are some incomparable pairs, where we can't correctly say that one is bigger, but they also aren't the same size.

It's true that we ordinarily apply O() to functions that are asymptotically monotonic - either always increasing or always decreasing once n gets large enough - and I suspect that that may force it to be a total order. So far I haven't found a counterexample using only monotonic functions. But the most common official definition does not require monotonic functions.
Matt - 2011-07-21 16:20
*
Ah, I see you write "elementary" functions. Well, my examples are just sums and products of polynomials and exponentials; and if you don't like powers of -1 you could get the same effect with sin and cos. For sure you could define "elementary function" narrowly enough to exclude these, but the result is going to be more restrictive than I usually expect to allow inside O(). Limiting what can be inside O() in SOME way is probably the right thing to do if we want to force it to be a total order, though.
Matt - 2011-07-21 16:24
*
A counterexample using monotonic functions would be something like...

F(n) = product_{1 to n} f(i)
G(n) = product_{1 to n} g(i)

I haven't checked the exact formula there, but the point is that the ratio F(n)/G(n) should alternate between O(sqrt(n)) and O(1/sqrt(n)).
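Here's a rough numeric check of that (a Python sketch, my code; I've only eyeballed the constants):

    import math

    def f(n): return 1 + n + n * (-1)**n
    def g(n): return 1 + n - n * (-1)**n

    ratio = 1.0   # running product of f(i)/g(i), i.e. F(n)/G(n)
    for n in range(1, 1001):
        ratio *= f(n) / g(n)
        if n in (99, 100, 999, 1000):
            print(n, ratio)

Since f(i) and g(i) are never less than 1, both products are monotone nondecreasing; yet the printed ratio keeps drifting up at even n (roughly like sqrt(n)) and down at odd n (roughly like 1/sqrt(n)), so neither F nor G would be O() of the other.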

If I am reading the final exam you linked correctly, it doesn't assert that O() induces a total ordering, it asserts that it induces an equivalence relation. Your f and g don't constitute a counterexample because they are incomparable.
T. Rev - 2011-07-21 16:55
*
Your F() and G() look good.

As for the exam, I'm looking at question 5.(f) and I think you're looking at question 6.(a).
Matt - 2011-07-21 17:05
*
Oop, yes, you're absolutely correct. Well, looks like we showed that guy.
T. Rev - 2011-07-21 17:07
*
Tougher question: can you construct a pair of C^infinity functions, such that all their derivatives are monotonic, that are not comparable? How big can you make a function class before it stops being totally ordered?
T. Rev - 2011-07-21 19:57
*
I would think that it should be possible. Start with the sine/cosine thing, which I think comes out to f(x)=1+x+(x cos (pi x)); g(x)=1+x-(x cos (pi x)). Note that cos (pi x) is just a C^infinity interpolation of the integer function (-1)^n.

Then although I'm not certain of the details, I would think it should be possible to make a continuous version of your "product" trick above: something like exp(integral_0^x log f(t) dt). The log might not be necessary or even desirable, and multiplying or adding constants just inside the exp might also be necessary to sync f() with g(). The point would be that like a plain old exponential, this mess would have all its derivatives monotonically increasing, but it still alternates relatively fast and slow parts and we could arrange two of them with those alternations out of phase.
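Here's a rough numeric sketch of that construction (Python; trapezoid integration, and the sample points and step count are arbitrary choices of mine):

    import math

    def f(x): return 1 + x + x * math.cos(math.pi * x)
    def g(x): return 1 + x - x * math.cos(math.pi * x)

    def ln_H(fn, x, steps=20000):
        # trapezoidal estimate of integral_0^x ln(fn(t)) dt; fine here
        # because fn(t) >= 1 for t >= 0, so the log is always defined
        h = x / steps
        s = 0.5 * (math.log(fn(0.0)) + math.log(fn(x)))
        for k in range(1, steps):
            s += math.log(fn(k * h))
        return s * h

    for x in (10.5, 11.5, 20.5, 21.5, 100.5, 101.5):
        print(x, ln_H(f, x) - ln_H(g, x))

If the idea is right, the printed gap should keep swinging between positive and negative values of growing size, which is exactly the incomparability condition for the exponentiated functions, in log form.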
Matt - 2011-07-21 21:30
*
It seems though that if one restricts the relation to functions which are eventually monotone non-decreasing (i.e. there exists M s.t. a>b>M implies f(a) >= f(b)), then we end up with a total ordering. That pretty much any case a programmer would consider lies within this restricted space would hint at why Dr. Knuth might have omitted mention of total ordering from TAOCP. That said, I love counterexamples and degenerate constructions, so... (though this one left me a little less satisfied than the 1/q function, the subset of R from which 14 different sets can be constructed by taking closures and complements, or Villadsen's perforated C*-algebras--though that last one's kind of a high bar, since I used it to set the initial direction for my thesis research...)
kiwano - 2011-07-22 09:20
*
What about T. Rev's F() and G() above, though? I think they (or some close variation of them) are monotonic nondecreasing but incomparable.
Matt - 2011-07-22 11:18
*
F() and G() are monotonic--at least over Z!--but their derivatives aren't. It seems as if getting a pair of noncomparable functions requires some kind of 'wiggliness'. This is easier to see if we look at their logs:

ln(F(n)) = sum_{1 to n} ln(f(i))
ln(G(n)) = sum_{1 to n} ln(g(i))

We can pass to the continuous case by using the continuous (and wiggly) sin-based versions of f and g:

ln(F(x)) = integral_1^x ln(f(t)) dt
ln(G(x)) = integral_1^x ln(g(t)) dt

Taking derivatives we get F'(x)/F(x) = ln(f(x)) and G'(x)/G(x) = ln(g(x)), from which it is easy to show that not all derivatives of F and G are monotonic.
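That relation is easy to check symbolically; a small sympy sketch (variable names mine):

    from sympy import symbols, cos, pi, log, Integral

    x, t = symbols('x t', positive=True)
    f = 1 + t + t * cos(pi * t)          # continuous version of f
    lnF = Integral(log(f), (t, 1, x))    # ln F(x) = integral_1^x ln f(t) dt

    # d/dx ln F(x) = F'(x)/F(x), and by the fundamental theorem of
    # calculus it comes out to ln(f(x)) rather than f(x)
    print(lnF.diff(x))   # -> log(x*cos(pi*x) + x + 1)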

On the other hand, we know that O() does give a total ordering on the usual suspect functions--polynomials, e^x, ln(x), x^c--and they all share the property that all their derivatives are monotonic. So it seems like 'all derivatives monotonic' might be a good starting criterion for further investigation.
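For instance (a quick sympy sketch of that property, my code):

    from sympy import symbols, diff, exp, log

    x = symbols('x', positive=True)
    for expr in (x**3, exp(x), log(x)):
        derivs = [expr]
        for _ in range(3):
            derivs.append(diff(derivs[-1], x))
        print(derivs)
    # log(x) gives [log(x), 1/x, -1/x**2, 2/x**3]: every derivative is
    # monotonic on x > 0, though they alternate increasing/decreasing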
T. Rev - 2011-07-23 01:44
*
Just as a footnote, note that monotonic increasing isn't the right criterion, since e.g. the first derivative of ln(x) is (monotonic) decreasing!
T. Rev - 2011-07-23 01:46
*
OK, I think I have it figured out. For simplicity let's assume f and g are always positive.

If f(x) != O(g), that means for all constants M and real x_0, there exists x > x_0 such that f(x)/g(x) > M, and likewise if g(x) != O(f) we have a similar condition on g(x)/f(x). So both ratios f/g and g/f get arbitrarily large.

Let's define L(x) = ln(f(x)) - ln(g(x)). From the above we can say that if f and g are incomparable, then for any constant M > 0 and real x_0, there exist x and x' > x_0 such that L(x) > M and L(x') < -M. Less formally, it means that as x -> infinity, L(x) keeps oscillating between positive and negative values of arbitrarily large magnitude. From that, we can conclude that if f and g are incomparable, at least one of them is 'wiggly'.
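With the discrete f and g from the post, L is easy to tabulate (a Python sketch, my code):

    import math

    def f(n): return 1 + n + n * (-1)**n
    def g(n): return 1 + n - n * (-1)**n

    for n in range(1, 9):
        # L(n) = ln f(n) - ln g(n) = +/- ln(2n+1): the sign alternates
        # and the magnitude grows without bound
        print(n, math.log(f(n)) - math.log(g(n)))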
T. Rev - 2011-07-23 03:03
*
I think kiwano proposed "monotonic increasing" as a sufficient criterion, not as a necessary one; it seems intuitive to me that if F() and G() work then you could take 1/F() and 1/G() and get a pair that's monotonic in the opposite direction and also works, so if monotonic increasing were sufficient then monotonic decreasing would also be sufficient. Your description of what must be true for a pair to be incomparable seems to be correct, but I'm not sure that there's much difference between it and just saying that neither function is O() of the other; it seems like just a restatement of the definition.

I'm not too interested in requiring continuity; the functions we put inside O() in computer science are generally functions of integers anyway. I'd be more interested in finding as large a function family as possible for which it's a total order. Pretty often we use polynomials, which clearly work (cannot be incomparable). Pretty often we use exponentials c^n where c is a *positive real* constant. Those seem to work as well; but allowing negative real bases like c=-1, or complex numbers (which would be how you'd get to sin and cos), clearly doesn't work. Also, we often use logs, including iterated logs. Something like (sqrt(n) log n)/log log n might typically be seen.

If we take the closure under addition, subtraction, exp, log, and function composition, but only allow real values (a function just becomes undefined at any point where you'd have to take the log of a negative number, but such functions are still allowed as long as they're defined for all sufficiently large n), is that family of functions totally ordered by O()? I think it might be, and it's big enough to cover pretty much all the things I normally expect to see inside O() in computer science.
Matt - 2011-07-23 09:22
*
"Monotonic increasing" clearly isn't a sufficient condition, since F and G are monotonic increasing and incomparable in both the continuous and discrete cases.

As far as restating the definitions, try looking at it this way:

A pair of functions f and g are incomparable, if and only if f/g is incomparable to a constant, if and only if f/g has the 'wiggly' property of attaining arbitrarily large and small values as x -> infinity. This follows trivially from the definition, but it suggests that looking at pairs of functions is unnecessarily complicated, and continuity is also irrelevant: it's enough to consider the functions that are incomparable to a constant, and those functions are easy to characterize.
T. Rev - 2011-07-23 15:07

