Sec. 3.4 Asymptotic Analysis 67

it is in Ω(h(n)). Note that we drop the word "in" for Θ notation, because there is a strict equality for two equations with the same Θ. In other words, if f(n) is Θ(g(n)), then g(n) is Θ(f(n)).

Because the sequential search algorithm is both in O(n) and in Ω(n) in the average case, we say it is Θ(n) in the average case.

Given an algebraic equation describing the time requirement for an algorithm, the upper and lower bounds always meet. That is because in some sense we have a perfect analysis for the algorithm, embodied by the running-time equation. For many algorithms (or their instantiations as programs), it is easy to come up with the equation that defines their runtime behavior. Most algorithms presented in this book are well understood and we can almost always give a Θ analysis for them. However, Chapter 17 discusses a whole class of algorithms for which we have no Θ analysis, just some unsatisfying big-Oh and Ω analyses. Exercise 3.14 presents a short, simple program fragment for which nobody currently knows the true upper or lower bounds.

While some textbooks and programmers will casually say that an algorithm is "order of" or "big-Oh" of some cost function, it is generally better to use Θ notation rather than big-Oh notation whenever we have sufficient knowledge about an algorithm to be sure that the upper and lower bounds indeed match. Throughout this book, Θ notation will be used in preference to big-Oh notation whenever our state of knowledge makes that possible.
Limitations on our ability to analyze certain algorithms may require use of big-Oh or Ω notations. In rare occasions when the discussion is explicitly about the upper or lower bound of a problem or algorithm, the corresponding notation will be used in preference to Θ notation.

3.4.4 Simplifying Rules

Once you determine the running-time equation for an algorithm, it really is a simple matter to derive the big-Oh, Ω, and Θ expressions from the equation. You do not need to resort to the formal definitions of asymptotic analysis. Instead, you can use the following rules to determine the simplest form.

1. If f(n) is in O(g(n)) and g(n) is in O(h(n)), then f(n) is in O(h(n)).
2. If f(n) is in O(kg(n)) for any constant k > 0, then f(n) is in O(g(n)).
3. If f1(n) is in O(g1(n)) and f2(n) is in O(g2(n)), then f1(n) + f2(n) is in O(max(g1(n), g2(n))).
4. If f1(n) is in O(g1(n)) and f2(n) is in O(g2(n)), then f1(n)f2(n) is in O(g1(n)g2(n)).

The first rule says that if some function g(n) is an upper bound for your cost function, then any upper bound for g(n) is also an upper bound for your cost function. A similar property holds true for Ω notation: If g(n) is a lower bound for your
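To see how these rules work on a concrete running-time equation, consider the following sketch. The function below is a hypothetical program fragment (not one from the book): a single linear pass followed by a pair of nested loops, so its cost equation has the form T(n) = c1·n + c2·n², for constants c1 and c2. Rule (3) combines the two parts into O(max(n, n²)) = O(n²), and rule (2) lets us discard the constant c2. The same reasoning applied to Ω gives a matching lower bound, so T(n) is Θ(n²).

```python
def combined_work(data):
    """Hypothetical fragment whose cost equation is T(n) = c1*n + c2*n^2.

    - The first loop performs c1*n basic operations.
    - The nested loops perform c2*n^2 basic operations.
    - Rule (3): T(n) is in O(max(n, n^2)) = O(n^2).
    - Rule (2): the constant c2 is dropped, leaving O(n^2).
    - The same argument for Omega yields Theta(n^2).
    """
    n = len(data)
    total = 0
    for x in data:              # linear part: c1 * n steps
        total += x
    for i in range(n):          # quadratic part: c2 * n^2 steps
        for j in range(n):
            total += data[i] * data[j]
    return total
```

For example, `combined_work([1, 2, 3])` adds the elements (6) and then every pairwise product, which sums to (1+2+3)² = 36, returning 42. Doubling the input size roughly quadruples the work done by the nested loops, which is exactly the Θ(n²) behavior the rules predict.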
