Ignore differences that are constant factors - e.g., treat n and n/2 as the same order of magnitude.

Similarly, treat 2n^{2} and 1000n^{2} as the same order of magnitude.

In general, if we have a polynomial of the form a_{0}n^{k} + a_{1}n^{k-1} + ... + a_{k}, we say it is O(n^{k}).

__Definition__: We say that g(n) is O(f(n)) if there exist two constants C
and k such that |g(n)| <= C |f(n)| for all n > k.

Equivalently, we say g(n) is O(f(n)) if there is a constant C such that for all sufficiently large n, |g(n) / f(n)| <= C.
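
For example, g(n) = 2n^{2} + 1000n is O(n^{2}): take C = 3 and k = 1000, since for all n > 1000 we have 1000n < n^{2}, and so 2n^{2} + 1000n <= 3n^{2}.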

The most common are:

O(1) - for any constant

O(log n), O(n), O(n log n), O(n^{2}), ..., O(2^{n})

We usually use these to measure the time and space complexity of algorithms.

Insertion of a new first element in an array of size n is O(n), since we must bump all the other elements up by one place.

Insertion of a new last element in an array of size n is O(1).
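
A minimal sketch of the two insertions (hypothetical `insertFirst`/`insertLast` helpers, not the on-line Sorter code; both assume the array still has room, i.e., size < a.length):

```java
// Insert elt at the front of a[0..size-1]: O(n),
// since every existing element must be bumped up one place.
static void insertFirst(int[] a, int size, int elt) {
    for (int i = size; i > 0; i--) {
        a[i] = a[i - 1];    // shift one place toward the back
    }
    a[0] = elt;
}

// Insert elt at the back of a[0..size-1]: O(1), a single write.
static void insertLast(int[] a, int size, int elt) {
    a[size] = elt;
}
```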

We saw that increasing the array size by 1 at a time to build up to n takes time 1 + 2 + ... + (n-1) = n(n-1)/2, which is O(n^{2}).

We saw that increasing the array size to n by doubling each time takes time 1 + 2 + 4 + ... + n/2 = n - 1, which is O(n).
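
A sketch that just counts the element copies under each growth strategy (hypothetical method names, not from the Sorter example):

```java
// Copies needed to reach capacity n growing by 1 each time:
// 1 + 2 + ... + (n-1) = n(n-1)/2, which is O(n^2).
static long copiesGrowingByOne(int n) {
    long copies = 0;
    for (int cap = 1; cap < n; cap++) {
        copies += cap;          // copy all cap existing elements over
    }
    return copies;
}

// Copies needed to reach capacity n by doubling (n a power of 2):
// 1 + 2 + 4 + ... + n/2 = n - 1, which is O(n).
static long copiesDoubling(int n) {
    long copies = 0;
    for (int cap = 1; cap < n; cap *= 2) {
        copies += cap;          // copy all cap existing elements over
    }
    return copies;
}
```

For example, copiesGrowingByOne(1000) returns 499,500, while copiesDoubling(1024) returns only 1,023.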

A table of values shows the difference.

Suppose we have operations with time complexity O(log n), O(n), O(n log n), O(n^{2}), and O(2^{n}), and suppose each can work on a problem of size n in time t. How much time does it take to solve a problem 10, 100, or 1000 times larger?

complexity \ size | 10n | 100n | 1000n |
---|---|---|---|
O(log n) | ~t+3 | ~t+7 | ~t+10 |
O(n) | 10t | 100t | 1,000t |
O(n log n) | >10t | >100t | >1,000t |
O(n^{2}) | 100t | 10,000t | 1,000,000t |
O(2^{n}) | ~t^{10} | ~t^{100} | ~t^{1000} |

*Note that the first and last rows depend on the constant being 1 (e.g., if t = 2^{n}, then a problem of size 10n takes 2^{10n} = (2^{n})^{10} = t^{10}); otherwise the times are somewhat different.*
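
A quick numerical check of the non-exponential rows (a minimal sketch assuming a base size n = 1000 and constant 1; the 2^{n} row is omitted since 2^{10n} would overflow a double):

```java
import java.util.function.DoubleUnaryOperator;

public class Scaling {
    public static void main(String[] args) {
        double n = 1000;                            // assumed base problem size
        String[] names = { "log n", "n", "n log n", "n^2" };
        DoubleUnaryOperator[] fs = {
            x -> Math.log(x) / Math.log(2),         // log n (base 2)
            x -> x,                                 // n
            x -> x * Math.log(x) / Math.log(2),     // n log n
            x -> x * x                              // n^2
        };
        for (int i = 0; i < fs.length; i++) {
            double t = fs[i].applyAsDouble(n);      // time for size n
            System.out.printf("%-8s 10n: %8.1ft   100n: %10.1ft   1000n: %12.1ft%n",
                    names[i],
                    fs[i].applyAsDouble(10 * n) / t,
                    fs[i].applyAsDouble(100 * n) / t,
                    fs[i].applyAsDouble(1000 * n) / t);
        }
    }
}
```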

Suppose we get a new machine that allows a certain speed-up. How much larger a problem can we solve? If the original machine allowed the solution of a problem of size k in time t, then:

complexity \ speed-up | 1x | 10x | 100x | 1000x |
---|---|---|---|---|
O(log n) | k | k^{10} | k^{100} | k^{1000} |
O(n) | k | 10k | 100k | 1,000k |
O(n log n) | k | <10k | <100k | <1,000k |
O(n^{2}) | k | >3k | 10k | >30k |
O(2^{n}) | k | k+3 | k+7 | k+10 |

We will use big Oh notation to help us measure the complexity of algorithms.

We only deal with searches here; we will come back to sorts.

Code for all of the searches is on-line in the Sorter program example.

__Linear search__: If the list has n elements, then linear search makes n compares in the worst case.

- On average n/2 compares if the element is in the list.
- n compares if the element is not in the list.
- O(n) compares in all of these cases.
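
A minimal linear search sketch (my own version; the on-line Sorter code may differ):

```java
// Linear search: scan left to right.
// Worst case (elt last or absent): n compares, O(n).
static int linearSearch(int[] list, int elt) {
    for (int i = 0; i < list.length; i++) {
        if (list[i] == elt) {
            return i;       // found after i + 1 compares
        }
    }
    return -1;              // not in the list: n compares
}
```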

__Binary search__ (the list must be sorted):

- If the middle element is the search element, then we are done.
- If the middle element is smaller than the search element, then do a binary search of the bigger elements.
- If the middle element is larger than the search element, then do a binary search of the smaller elements.

With each recursive call we do at most two compares, as in the sketch below.
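
A sketch of the recursion (again my own version, not necessarily the Sorter code); note the at-most-two compares per call:

```java
// Binary search of the sorted range list[low..high].
static int binarySearch(int[] list, int elt, int low, int high) {
    if (low > high) {
        return -1;                      // empty range: elt is not in the list
    }
    int mid = low + (high - low) / 2;   // avoids overflow of low + high
    if (list[mid] == elt) {             // compare 1: middle elt is search elt
        return mid;
    } else if (list[mid] < elt) {       // compare 2: search the bigger elts
        return binarySearch(list, elt, mid + 1, high);
    } else {                            // otherwise search the smaller elts
        return binarySearch(list, elt, low, mid - 1);
    }
}
```

The initial call is binarySearch(list, elt, 0, list.length - 1).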

What is the maximum number of recursive calls?

- Each time we make a recursive call, we divide the size of the array to be searched in half.
- How many times can we divide a number in half before only 1 element is left?
- If we start with 2^{k} elements, then 2^{k} => 2^{k-1} => 2^{k-2} => 2^{k-3} => ... => 2^{0} = 1; we divide by 2 k times.
- In general, we can divide n by 2 at most `log n` times to get down to 1.

*In this course, we write* `log n` *for* log_{2} n.
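
A one-loop sketch of that count (hypothetical helper; returns floor(log n)):

```java
// How many times can n be halved (integer division) before reaching 1?
// The answer is floor(log n), e.g. halvings(1024) == 10.
static int halvings(int n) {
    int count = 0;
    while (n > 1) {
        n /= 2;
        count++;
    }
    return count;
}
```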

A concrete comparison of the worst-case number of comparisons:

Search \ # elts | 10 | 100 | 1000 | 1,000,000 |
---|---|---|---|---|
linear | 10 | 100 | 1000 | 1,000,000 |
binary | 8 | 14 | 20 | 40 |

The binary row is 2 compares per call times at most ceil(log n) calls: e.g., for 1,000,000 elements, 2 * 20 = 40.