Ignore differences that are only constant factors - e.g., treat n and n/2 as the same order of magnitude.
Similarly with 2n^2 and 1000n^2.
In general, a polynomial of the form a_0 n^k + a_1 n^(k-1) + ... + a_k is O(n^k).
Definition: We say that g(n) is O(f(n)) if there exist two constants C and k such that |g(n)| <= C |f(n)| for all n > k.
Equivalently, say g(n) is O(f(n)) if
there is a constant C such that for all sufficiently large n, | g(n) / f(n) | <= C.
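For example, g(n) = 2n^2 + 1000n is O(n^2): take C = 3 and k = 1000; for all n > 1000 we have 1000n < n^2, so 2n^2 + 1000n < 3n^2.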
Most common are
O(1) - for any constant
O(log n), O(n), O(n log n), O(n^2), ..., O(2^n)
Usually use these to measure time and space complexity of algorithms.
Insertion of a new first element in an array of size n is O(n), since we must bump all the other elements up by one place.
Insertion of a new last element in an array of size n is O(1).
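A minimal sketch of both cases (hypothetical code, not the on-line Sorter example; assumes the array still has spare capacity past index count):

```java
// Hypothetical sketch (names made up, not the on-line Sorter code).
// Assumes the array still has spare capacity past index count.
public class InsertDemo {
    // O(n): shift a[0..count-1] up one slot, then place x at index 0.
    static void insertFirst(int[] a, int count, int x) {
        for (int i = count; i > 0; i--) {
            a[i] = a[i - 1];        // n moves in the worst case
        }
        a[0] = x;
    }

    // O(1): place x just past the last occupied slot.
    static void insertLast(int[] a, int count, int x) {
        a[count] = x;               // one move, independent of n
    }
}
```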
Saw that increasing the array size by 1 at a time to build up to size n takes time 1 + 2 + ... + (n-1) = n(n-1)/2, which is O(n^2).
Saw that increasing the array size to n by doubling each time takes time 1 + 2 + 4 + ... + n/2 = n-1, which is O(n).
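An illustrative sketch (not from the course examples) that just counts the element copies made by each growth strategy:

```java
// Illustrative sketch: count the element copies made by each growth
// strategy when building up to capacity n.
public class GrowthDemo {
    public static void main(String[] args) {
        int n = 1024;

        // Grow by one: copying into each new array costs 'size' copies,
        // for a total of 1 + 2 + ... + (n-1) = n(n-1)/2.
        long byOne = 0;
        for (int size = 1; size < n; size++) {
            byOne += size;
        }

        // Doubling: the copies cost 1 + 2 + 4 + ... + n/2 = n - 1 in total.
        long doubling = 0;
        for (int size = 1; size < n; size *= 2) {
            doubling += size;
        }

        System.out.println("grow by one: " + byOne);    // 523776
        System.out.println("doubling:    " + doubling); // 1023
    }
}
```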
Make table of values to show difference.
Suppose we have operations with time complexity O(log n), O(n), O(n log n), O(n^2), and O(2^n).
And suppose each works on a problem of size n in time t. How much time to do a problem 10, 100, or 1000 times larger?
|size|10n|100n|1,000n|
|O(log n)|>t|>t|>t|
|O(n)|10t|100t|1,000t|
|O(n log n)|>10t|>100t|>1,000t|
|O(n^2)|100t|10,000t|1,000,000t|
|O(2^n)|t^10|t^100|t^1000|
*Note that the last line depends on the fact that the constant is 1 (so that t = 2^n); otherwise the times are somewhat different.
Suppose we get a new machine that allows a certain speed-up. How much larger a problem can be solved in the same time? If the original machine allowed solution of a problem of size k in time t, then:
|speed-up|1|10|100|1,000|
|O(log n)|k|k^10|k^100|k^1000|
|O(n)|k|10k|100k|1,000k|
|O(n log n)|k|<10k|<100k|<1,000k|
|O(n^2)|k|>3k|10k|>31k|
|O(2^n)|k|~k+3|~k+7|~k+10|
We will use big Oh notation to help us measure complexity of algorithms.
Only deal with searches here; come back to do sorts later.
Code for all the searches is on-line in the Sorter program example.
Sequential (linear) search: if the list has n elements, then n compares in the worst case.
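A minimal sequential-search sketch (the course's actual code is in the on-line Sorter example; this version is only illustrative):

```java
// Illustrative sketch of sequential search (the real code is in the
// on-line Sorter example).
public class SequentialSearch {
    static int search(int[] a, int target) {
        for (int i = 0; i < a.length; i++) {
            if (a[i] == target) {
                return i;       // found after i + 1 compares
            }
        }
        return -1;              // absent: all n compares were made
    }
}
```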
Binary search: with each recursive call do at most two compares.
What is maximum number of recursive calls?
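Since each call halves the portion of the list still under consideration, there are at most about log2(n) + 1 recursive calls, hence roughly 2 log2(n) compares in the worst case. An illustrative recursive sketch (again, not the on-line Sorter code):

```java
// Illustrative recursive sketch of binary search on a sorted array.
// Each call makes at most two element compares and discards half of
// the remaining range, so there are at most about log2(n) + 1 calls.
public class BinarySearch {
    static int search(int[] a, int target, int low, int high) {
        if (low > high) {
            return -1;                                // empty range: not found
        }
        int mid = (low + high) / 2;
        if (a[mid] == target) {                       // compare 1
            return mid;
        } else if (target < a[mid]) {                 // compare 2
            return search(a, target, low, mid - 1);   // keep left half
        } else {
            return search(a, target, mid + 1, high);  // keep right half
        }
    }
}
```

Call it as search(a, target, 0, a.length - 1).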
Concrete comparison of the worst-case number of comparisons:
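A rough table (the binary-search counts assume at most two compares per recursive call and about log2(n) + 1 calls):
|n|sequential search|binary search|
|10|10|~8|
|100|100|~14|
|1,000|1,000|~20|
|1,000,000|1,000,000|~40|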