Algorithm Design

Chapter 6 Dynamic Programming

28. they need to pay a fixed price P for delivery in addition to the cost of the gas ordered. However, it costs c to store a gallon of gas for an extra day, so ordering too much ahead increases the storage cost.

They are planning to close for a week in the winter, and they want their tank to be empty by the time they close. Luckily, based on years of experience, they have accurate projections for how much gas they will need each day until this point in time. Assume that there are n days left until they close, and they need g_i gallons of gas for each of the days i = 1, ..., n. Assume that the tank is empty at the end of day 0. Give an algorithm to decide on which days they should place orders, and how much to order so as to minimize their total cost.
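One standard way to attack this exercise (a sketch of our own, not necessarily the book's intended solution) rests on an exchange argument: in an optimal plan, an order is placed only when the tank is empty, so each order covers a consecutive block of days, and the cheapest segmentation into blocks can be found by dynamic programming in O(n^2) time. A Python sketch, with the hypothetical function name `min_gas_cost` and inputs named after the problem statement:

```python
def min_gas_cost(g, P, c):
    """Segment days into consecutive blocks, one order per block.

    g[0..n-1]: gallons needed on days 1..n; P: fixed delivery price
    per order; c: cost to store one gallon for one extra day.
    Returns (min total cost, list of (order_day, gallons)).
    """
    n = len(g)
    INF = float("inf")
    # best[i] = min cost to cover days i..n-1 starting with an empty tank
    best = [INF] * (n + 1)
    choice = [None] * (n + 1)
    best[n] = 0.0
    for i in range(n - 1, -1, -1):
        hold = 0.0  # storage cost if the order on day i covers days i..j:
        for j in range(i, n):  # gas for day j sits in the tank (j - i) days
            hold += c * (j - i) * g[j]
            cand = P + hold + best[j + 1]
            if cand < best[i]:
                best[i] = cand
                choice[i] = j
    # Recover which days to order on, and how much (1-indexed days).
    orders, i = [], 0
    while i < n:
        j = choice[i]
        orders.append((i + 1, sum(g[i:j + 1])))
        i = j + 1
    return best[0], orders
```

For instance, with `g = [1, 1]`, `P = 10`, `c = 1`, a single order of 2 gallons on day 1 costs 10 + 1 = 11, beating two separate orders at cost 20.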

29. Recall the scheduling problem from Section 4.2 in which we sought to minimize the maximum lateness. There are n jobs, each with a deadline d_i and a required processing time t_i, and all jobs are available to be scheduled starting at time s. For a job i to be done, it needs to be assigned a period from s_i ≥ s to f_i = s_i + t_i, and different jobs should be assigned nonoverlapping intervals. As usual, such an assignment of times will be called a schedule.

In this problem, we consider the same setup, but want to optimize a different objective. In particular, we consider the case in which each job must either be done by its deadline or not at all. We'll say that a subset J of the jobs is schedulable if there is a schedule for the jobs in J so that each of them finishes by its deadline. Your problem is to select a schedulable subset of maximum possible size and give a schedule for this subset that allows each job to finish by its deadline.

(a) Prove that there is an optimal solution J (i.e., a schedulable set of maximum size) in which the jobs in J are scheduled in increasing order of their deadlines.

(b) Assume that all deadlines d_i and required times t_i are integers. Give an algorithm to find an optimal solution. Your algorithm should run in time polynomial in the number of jobs n, and the maximum deadline D = max_i d_i.
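Part (b) can be met with the following dynamic program, which leans on part (a): sort the jobs by deadline, and let opt(i, t) be the maximum number of jobs among the first i whose work can all be completed by time t (each by its own deadline). If job i is taken, part (a) lets us run it last, finishing at min(t, d_i). This gives an O(n·D) table, polynomial in n and D as the exercise allows. A sketch, with the hypothetical name `max_schedulable`:

```python
def max_schedulable(jobs, s=0):
    """Return (max #jobs, one optimal subset in scheduled order).

    jobs: list of (t_i, d_i) pairs -- processing time and deadline.
    s: common release time. Runs in O(n * D) time, D = max deadline.
    """
    # By part (a), some optimal subset runs in nondecreasing deadline
    # order, so sort by deadline and do a DP over (prefix, finish time).
    order = sorted(jobs, key=lambda job: job[1])
    n, D = len(order), max(d for _, d in order)
    # opt[i][t] = max #jobs among the first i, all work finishing by t
    opt = [[0] * (D + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        ti, di = order[i - 1]
        for t in range(D + 1):
            opt[i][t] = opt[i - 1][t]          # option 1: skip job i
            finish = min(t, di)                # option 2: run job i last
            if finish - ti >= s:
                opt[i][t] = max(opt[i][t], 1 + opt[i - 1][finish - ti])
    # Trace back one optimal subset.
    chosen, t = [], D
    for i in range(n, 0, -1):
        ti, di = order[i - 1]
        finish = min(t, di)
        if finish - ti >= s and opt[i][t] == 1 + opt[i - 1][finish - ti]:
            chosen.append(order[i - 1])
            t = finish - ti
    chosen.reverse()
    return opt[n][D], chosen
```

For example, with jobs (t, d) = (1, 1), (1, 2), (3, 3), two jobs are achievable: run (1, 1) in [0, 1] and (1, 2) in [1, 2]; adding the third would need 5 units of work by time 3.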

Let G = (V, E) be a graph with n nodes in which each pair of nodes is joined by an edge. There is a positive weight w_ij on each edge (i, j), and we will assume these weights satisfy the triangle inequality w_ik ≤ w_ij + w_jk. For a subset V′ ⊆ V, we will use G[V′] to denote the subgraph (with edge weights) induced on the nodes in V′.

We are given a set X ⊆ V of k terminals that must be connected by edges. We say that a Steiner tree on X is a set Z so that X ⊆ Z ⊆ V, together with a spanning subtree T of G[Z]. The weight of the Steiner tree is the weight of the tree T.

Show that there is a function f(·) and a polynomial function p(·) so that the problem of finding a minimum-weight Steiner tree on X can be solved in time O(f(k) · p(n)).
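One construction with exactly this form is the Dreyfus–Wagner dynamic program (Dreyfus and Wagner, 1971), which indexes subproblems by a subset of the terminals plus one extra node; here f(k) grows like 3^k. A hedged sketch, simplified by the exercise's assumption that G is complete and metric (so shortest-path distances are the edge weights themselves); names are our own:

```python
def steiner_tree_weight(w, X):
    """Dreyfus-Wagner dynamic program over subsets of terminals.

    w: n x n symmetric weight matrix with w[v][v] = 0, assumed to
       satisfy the triangle inequality (so shortest-path distances
       equal the direct edge weights, which keeps the sketch short).
    X: list of k terminal vertices. Returns the min Steiner tree weight.
    Time O(3^k * n + 2^k * n^2): f(k) * p(n), as the exercise asks.
    """
    n, k = len(w), len(X)
    INF = float("inf")
    full = (1 << k) - 1
    # S[D][v] = min weight of a tree connecting terminal mask D plus v
    S = [[INF] * n for _ in range(full + 1)]
    for i, x in enumerate(X):
        for v in range(n):
            S[1 << i][v] = w[x][v]
    for D in range(1, full + 1):
        if D & (D - 1) == 0:
            continue                      # singletons are the base case
        # Step 1: merge two subtrees that meet at node v.
        sub = (D - 1) & D
        while sub:
            if sub < D ^ sub:             # visit each unordered split once
                for v in range(n):
                    c = S[sub][v] + S[D ^ sub][v]
                    if c < S[D][v]:
                        S[D][v] = c
            sub = (sub - 1) & D
        # Step 2: attach node v to a subtree rooted at u via edge (u, v).
        # One relaxation pass suffices because the weights are metric.
        for v in range(n):
            for u in range(n):
                c = S[D][u] + w[u][v]
                if c < S[D][v]:
                    S[D][v] = c
    return min(S[full][x] for x in X)
```

On a metric star with three terminals at distance 1 from a hub and pairwise distance 2, the routine finds the weight-3 tree through the non-terminal hub, which is exactly the role the Steiner set Z plays in the problem statement.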

Notes and Further Reading<br />

Richard Bellman is credited with pioneering the systematic study of dynamic programming (Bellman 1957); the algorithm in this chapter for segmented least squares is based on Bellman's work from this early period (Bellman 1961). Dynamic programming has since grown into a technique that is widely used across computer science, operations research, control theory, and a number of other areas. Much of the recent work on this topic has been concerned with stochastic dynamic programming: Whereas our problem formulations tended to tacitly assume that all input is known at the outset, many problems in scheduling, production and inventory planning, and other domains involve uncertainty, and dynamic programming algorithms for these problems encode this uncertainty using a probabilistic formulation. The book by Ross (1983) provides an introduction to stochastic dynamic programming.

Many extensions and variations of the Knapsack Problem have been studied in the area of combinatorial optimization. As we discussed in the chapter, the pseudo-polynomial bound arising from dynamic programming can become prohibitive when the input numbers get large; in these cases, dynamic programming is often combined with other heuristics to solve large instances of Knapsack Problems in practice. The book by Martello and Toth (1990) is devoted to computational approaches to versions of the Knapsack Problem.

Dynamic programming emerged as a basic technique in computational biology in the early 1970s, in a flurry of activity on the problem of sequence comparison. Sankoff (2000) gives an interesting historical account of the early work in this period. The books by Waterman (1995) and Gusfield (1997) provide extensive coverage of sequence alignment algorithms (as well as many related algorithms in computational biology); Mathews and Zuker (2004) discuss further approaches to the problem of RNA secondary structure prediction. The space-efficient algorithm for sequence alignment is due to Hirschberg (1975).

The algorithm for the Shortest-Path Problem described in this chapter is based originally on the work of Bellman (1958) and Ford (1956). Many optimizations, motivated both by theoretical and experimental considerations,
