Anyone who's studied computer science should be familiar with the Landau asymptotic notation, or "big-oh" notation. But how well do you really know it?
Given two functions, it seems intuitive that at least one ought to be big-oh of the other; in some cases maybe they are both big-oh of each other, in which case they are theta of each other and in a certain sense can be said to be equivalent. It sure feels like a total order, with a place for everything and everything in its place, and it is a total order on all the kinds of functions we normally care about. I've certainly marked down plenty of students for incorrectly claiming that n log n can't be compared either way to things like n and n².

But today I had occasion to wonder whether it's really true that O() induces a total order on all functions for which it is applicable. I looked on the Net and couldn't find anyone discussing this issue; I did find this archived final exam in which someone asked it of students as a true/false question and apparently was looking for the answer "true." Unfortunately, the correct answer is "false."
Counterexample below the cut. If you've got a computer science background you might want to think about it a bit yourself before going further. It shouldn't really be hard if you look carefully at the definition.
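For reference, here's the usual definition, stated for positive-valued functions (some authors wrap things in absolute values, but for functions like the ones below it comes to the same thing):

f(n) = O(g(n)) iff there exist a constant c > 0 and a threshold n₀ such that f(n) ≤ c·g(n) for all n ≥ n₀

The key point is that a single constant c has to work for all sufficiently large n.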
f(n) = 1 + n + n(-1)ⁿ
g(n) = 1 + n - n(-1)ⁿ
f(n) ≠ O(g(n))
g(n) ≠ O(f(n))
These two functions alternate between being equal to 1 and being equal to 2n+1. The function f is 1 for odd n and 2n+1 for even n, and the function g is the reverse. The gap between them grows arbitrarily large as n grows, and its direction alternates. No matter what positive constant you multiply g by, f exceeds the result for all sufficiently large even n; and no matter what positive constant you multiply f by, g exceeds the result for all sufficiently large odd n. So neither function is big-oh of the other.
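If you want to see the alternation concretely, here's a throwaway Python sketch (just a sanity check, obviously not a proof):

# f and g from above: each is 2n+1 on one parity and 1 on the other.
def f(n):
    return 1 + n + n * (-1) ** n  # 2n+1 for even n, 1 for odd n

def g(n):
    return 1 + n - n * (-1) ** n  # 1 for even n, 2n+1 for odd n

for n in range(1, 9):
    print(n, f(n), g(n))

# f/g is unbounded over even n, and g/f is unbounded over odd n, so no
# constant multiple of either function eventually dominates the other.
print(max(f(n) / g(n) for n in range(2, 1001, 2)))  # 2001.0
print(max(g(n) / f(n) for n in range(1, 1001, 2)))  # 1999.0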
Do note that each of my functions is defined for all positive integer inputs, and for positive integer inputs the function values are always positive integers. I'm not cheating with weird functions here. If you wanted them to have positive real values for all positive real inputs then you might have to play with the definitions a bit, because my definitions above will end up producing imaginary numbers for some non-integer values of n, but it's easy to see how you could use a scaled sine wave to interpolate the definitions above with something that would be everywhere continuous, positive, and real. And many people's definitions of O() only care about positive integers anyway.
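For instance (one concrete choice of scaled sine wave, shifted into a cosine): since cos(πn) = (-1)ⁿ at every integer n, the functions

f(x) = 1 + x + x cos(πx)
g(x) = 1 + x - x cos(πx)

agree with the definitions above at the positive integers, and they're continuous, real, and at least 1 for all x > 0, because |x cos(πx)| ≤ x.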
This issue isn't directly discussed by Cormen, Leiserson, Rivest, and Stein (Introduction to Algorithms, Third Edition; it's a standard text for this kind of thing and the one I chose for a course I recently taught). Their Problem 3-5 seems to be aimed at the question of total ordering, however, because it talks about an alternative definition for big-omega that helps force functions into a total order. I also checked Knuth (The Art of Computer Programming, Volume 1) and was a little surprised to find that he doesn't seem to discuss the question of whether big-oh is a total order at all.
So there you have it. I'm not sure it's really all that big a deal - anyone with a reason to care strongly about whether big-oh is a total order should be able to figure it out for themselves easily enough once they think about it - but I, at least, had never had a reason to think about it seriously until today. My students are probably lucky I didn't, because I might have put it on the COMP 3170 final.
The question I was actually interested in before I got onto this one was the slightly more complicated question of how many distinct theta-classes of functions exist. It's easy to see that there are at least as many as there are real numbers (the functions nᵃ for real a > 0 all land in distinct classes, since nᵃ/nᵇ → 0 whenever a < b), but when we start throwing in things like logs, can we make them more numerous than that? The answer seems to be that we can construct a distinct class for every function we allow as input, but don't cite me on that because I haven't finished formalizing the proof. It might make a good final exam question for a course a bit more advanced than 3170.