of its correct position once it has been placed there, and each swap operation places at least one integer in its correct position. Thus, this code fragment has cost Θ(n). However, it requires more time to run than the first code fragment. On my computer the second version takes nearly twice as long to run as the first, but it only requires half the space.
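The two code fragments themselves appear on the preceding page. As a minimal sketch of the in-place idea being analyzed here, assuming the array holds a permutation of the values 0 through n-1 (the function name sortPermutation and the demonstration in main are illustrative, not the book's code):

#include <cstdio>
#include <utility>  // std::swap

// Repeatedly swap A[i] toward its home index. Each swap places at
// least one value in its final position, and a value is never moved
// again once correct, so at most n-1 swaps occur: Theta(n) time with
// no second array, which is the space saving noted above.
void sortPermutation(int A[], int n) {
  for (int i = 0; i < n; i++)
    while (A[i] != i)             // value A[i] belongs at index A[i]
      std::swap(A[i], A[A[i]]);   // one value reaches its home slot
}

int main() {
  int A[] = {3, 0, 4, 1, 2};
  sortPermutation(A, 5);
  for (int x : A) std::printf("%d ", x);  // prints: 0 1 2 3 4
  std::printf("\n");
}

The extra reads and writes inside the while loop are what make an in-place version like this slower in practice than one that writes each value directly into a second array, even though both run in Θ(n) time.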
A second principle for the relationship between a program's space and time requirements applies to programs that process information stored on disk, as discussed in Chapter 8 and thereafter. Strangely enough, the disk-based space/time tradeoff principle is almost the reverse of the space/time tradeoff principle for programs using main memory.

The disk-based space/time tradeoff principle states that the smaller you can make your disk storage requirements, the faster your program will run. This is because the time to read information from disk is enormous compared to computation time, so almost any amount of additional computation needed to unpack the data is going to be less than the disk-reading time saved by reducing the storage requirements. Naturally this principle does not hold true in all cases, but it is good to keep in mind when designing programs that process information stored on disk.

3.10 Speeding Up Your Programs

In practice, there is not such a big difference in running time between an algorithm whose growth rate is Θ(n) and another whose growth rate is Θ(n log n). There is, however, an enormous difference in running time between algorithms with growth rates of Θ(n log n) and Θ(n²). As you shall see during the course of your study of common data structures and algorithms, it is not unusual that a problem whose obvious solution requires Θ(n²) time also has a solution that requires Θ(n log n) time. Examples include sorting and searching, two of the most important computer problems.
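To make that gap concrete, here is an illustrative problem (chosen for this sketch, not taken from the text): deciding whether an array contains a duplicate value. The obvious solution compares every pair of elements, while sorting first makes any duplicates adjacent:

#include <algorithm>
#include <cstdio>
#include <vector>

// Obvious solution: compare every pair (Theta(n^2) comparisons).
bool hasDuplicateSlow(const std::vector<int>& a) {
  for (size_t i = 0; i < a.size(); i++)
    for (size_t j = i + 1; j < a.size(); j++)
      if (a[i] == a[j]) return true;
  return false;
}

// Better solution: sort a copy, then any duplicates are adjacent.
// Theta(n log n) time, dominated by the sort.
bool hasDuplicateFast(std::vector<int> a) {  // by value: sorts a copy
  std::sort(a.begin(), a.end());
  for (size_t i = 1; i < a.size(); i++)
    if (a[i - 1] == a[i]) return true;
  return false;
}

int main() {
  std::vector<int> v = {5, 3, 9, 3, 7};
  std::printf("%d %d\n", hasDuplicateSlow(v), hasDuplicateFast(v));  // prints: 1 1
}

For n = 1,000,000, the pairwise version performs on the order of 10^12 comparisons, while the sort-based version needs roughly n log n ≈ 2 × 10^7 basic operations (taking logs base 2), a factor of about 50,000. A conversion of that kind can collapse a running time of weeks into one of minutes, as in the following example.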
Example 3.18 The following is a true story. A few years ago, one of my graduate students had a big problem. His thesis work involved several intricate operations on a large database. He was now working on the final step. "Dr. Shaffer," he said, "I am running this program and it seems to be taking a long time." After examining the algorithm we realized that its running time was Θ(n²), and that it would likely take one to two weeks to complete. Even if we could keep the computer running uninterrupted for that long, he was hoping to complete his thesis and graduate before then. Fortunately, we realized that there was a fairly easy way to convert the algorithm so that its running time was Θ(n log n). By the next day he