Review WordFreq example from text.
Ignore differences which are only constant factors - e.g., treat n and n/2 as the same order of magnitude.
Similarly with 2n^2 and 1000n^2.
In general, if we have a polynomial of the form a_0 n^k + a_1 n^(k-1) + ... + a_k, we say it is O(n^k).
Definition: We say that g(n) is O(f(n)) if there exist two constants C and k such that |g(n)| <= C |f(n)| for all n > k.
Equivalently, say g(n) is O(f(n)) if there is a constant C such that for all sufficiently large n, |g(n) / f(n)| <= C.
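As an illustrative check of the definition (a worked example, not from the text), take g(n) = 3n^2 + 5n + 2 and f(n) = n^2. For n >= 1 each lower-order term is at most n^2, so

```latex
3n^2 + 5n + 2 \;\le\; 3n^2 + 5n^2 + 2n^2 \;=\; 10n^2 \qquad \text{for all } n \ge 1,
```

so the definition holds with C = 10 and k = 1, and 3n^2 + 5n + 2 is O(n^2), matching the polynomial rule above.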
Most common are
O(1) - for any constant
O(log n), O(n), O(n log n), O(n^2), ..., O(2^n)
Usually use these to measure time and space complexity of algorithms.
Insertion of new first element in an array of size n is O(n) since must bump all other elts up by one place.
Insertion of new last element in an array of size n is O(1).
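A minimal Java sketch of the two insertions (hypothetical class and field names; it assumes the array still has a spare slot at the end):

```java
// Illustrative sketch only (not the text's code); elts[0..size-1] are in use
// and the array is assumed to have room for at least one more element.
public class InsertDemo {
    private int[] elts;
    private int size;

    public InsertDemo(int capacity) {
        elts = new int[capacity];
        size = 0;
    }

    // O(n): every existing element must be bumped up one place.
    public void insertFirst(int value) {
        for (int i = size; i > 0; i--) {
            elts[i] = elts[i - 1];      // shift one slot toward the end
        }
        elts[0] = value;
        size++;
    }

    // O(1): no other element moves.
    public void insertLast(int value) {
        elts[size] = value;
        size++;
    }
}
```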
Saw that increasing the array size by 1 at a time to build up to n takes time n*(n-1)/2 (each time the array grows, all of its current elements get copied), which is O(n^2).
Saw that increasing the array size to n by doubling each time takes time n-1, which is O(n).
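A short Java sketch (illustrative only, hypothetical names) that just counts how many element copies each growth strategy performs:

```java
// Counts how many element copies each growth strategy performs
// while building an array up to n slots. Illustrative sketch only.
public class GrowthDemo {
    // Grow by one slot at a time: copy 1 + 2 + ... + (n-1) = n(n-1)/2 elements.
    static long copiesGrowByOne(int n) {
        long copies = 0;
        for (int size = 1; size < n; size++) {
            copies += size;          // reallocating copies the current contents
        }
        return copies;
    }

    // Double the capacity each time: copy 1 + 2 + 4 + ... = about n-1 elements.
    static long copiesGrowByDoubling(int n) {
        long copies = 0;
        for (int size = 1; size < n; size *= 2) {
            copies += size;          // copy the current contents once per doubling
        }
        return copies;
    }

    public static void main(String[] args) {
        int n = 1024;
        System.out.println("grow by one:      " + copiesGrowByOne(n));      // 523776
        System.out.println("grow by doubling: " + copiesGrowByDoubling(n)); // 1023
    }
}
```

For n = 1024 the two strategies copy 523,776 versus 1,023 elements, the n(n-1)/2 versus n-1 difference above.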
Make table of values to show difference.
Suppose we have operations with time complexity O(log n), O(n), O(n log n), O(n^2), and O(2^n).
And suppose all work on a problem of size n in time t. How much time to do a problem 10, 100, or 1000 times larger?

| size | 10n | 100n | 1000n |
| --- | --- | --- | --- |
| O(log n) | t + c | t + 2c | t + 3c |
| O(n) | 10t | 100t | 1,000t |
| O(n log n) | >10t | >100t | >1,000t |
| O(n^2) | 100t | 10,000t | 1,000,000t |
| O(2^n) | ~t^10 | ~t^100 | ~t^1000 |

*Here c is the small fixed extra cost of about log_2(10) ~ 3.3 more steps each time the size goes up by a factor of 10, and ">" means "slightly more than". Note that the last line depends on the fact that the constant is 1; otherwise the times are somewhat different.
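A quick check of where two of these rows come from (illustrative arithmetic, taking the constants to be 1, so that t = n^2 and t = 2^n respectively):

```latex
(10n)^2 = 100\,n^2 = 100\,t,
\qquad\qquad
2^{10n} = \left(2^{n}\right)^{10} = t^{10}.
```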
Suppose we get a new machine that allows a certain speed-up. How much larger a problem can be solved? If the original machine allowed solution of a problem of size k in time t, then:

| speed-up | 1x | 10x | 100x | 1000x |
| --- | --- | --- | --- | --- |
| O(log n) | k | k^10 | k^100 | k^1000 |
| O(n) | k | 10k | 100k | 1,000k |
| O(n log n) | k | <10k | <100k | <1,000k |
| O(n^2) | k | 3k+ | 10k | 30k+ |
| O(2^n) | k | k+3 | k+7 | k+10 |
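A sketch of where two of these rows come from, writing n' for the new problem size (a symbol introduced here, not in the notes) and again taking the constants to be 1: a 100x faster machine can afford 100 times as many steps in time t, so

```latex
O(n^2):\quad (n')^2 = 100\,k^2 \;\Rightarrow\; n' = 10k,
\qquad
O(2^n):\quad 2^{n'} = 100\cdot 2^{k} \;\Rightarrow\; n' = k + \log_2 100 \approx k + 7.
```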
We will use big-Oh notation to help us measure the complexity of algorithms.
We only deal with searches here, and come back to do sorts later.
Code for all the searches is on-line in the Sort program example.
Linear search: if the list has n elements, then it takes n compares in the worst case.
Binary search: with each recursive call we do at most two compares.
What is the maximum number of recursive calls? Each call cuts the remaining range roughly in half, so there are at most about log_2(n) calls, and hence roughly 2 log_2(n) compares in the worst case.
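The actual search code is in the on-line Sort program example; the following is only an illustrative sketch (hypothetical class name) with a counter added so the worst case can be measured. With this particular coding, searching a sorted array for a value larger than everything in it is a worst case and reproduces the counts in the table below.

```java
// Illustrative sketch only -- the course's search code is in the on-line
// Sort program example. This version counts compares to show the worst case.
public class BinarySearchDemo {
    static long compares = 0;

    // Search elts[low..high] for target; return its index, or -1 if absent.
    static int search(int[] elts, int target, int low, int high) {
        if (low > high) {
            return -1;                        // empty range: no compares
        }
        int mid = (low + high) / 2;
        compares++;
        if (target == elts[mid]) {
            return mid;
        }
        compares++;
        if (target < elts[mid]) {
            return search(elts, target, low, mid - 1);
        } else {
            return search(elts, target, mid + 1, high);
        }
    }

    public static void main(String[] args) {
        int[] sizes = {10, 100, 1000, 1_000_000};
        for (int n : sizes) {
            int[] elts = new int[n];
            for (int i = 0; i < n; i++) {
                elts[i] = i;                  // sorted list 0, 1, ..., n-1
            }
            compares = 0;
            search(elts, n, 0, n - 1);        // n is bigger than every element
            System.out.println("n = " + n + ": " + compares + " compares");
        }
    }
}
```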
Concrete comparison of the worst cases (number of comparisons):

| Search \ # elts | 10 | 100 | 1000 | 1,000,000 |
| --- | --- | --- | --- | --- |
| linear | 10 | 100 | 1000 | 1,000,000 |
| binary | 8 | 14 | 20 | 40 |