Chapter 3: Optimal Trees and Branchings - UKP
Efficient Graph Algorithms<br />
III <strong>Optimal</strong> <strong>Trees</strong> <strong>and</strong> <strong>Branchings</strong><br />
Efficient Graph Algorithms | Wolfgang Stille | WS 2011/2012 | <strong>Chapter</strong> III - <strong>Optimal</strong> <strong>Trees</strong> <strong>and</strong> <strong>Branchings</strong> | 1
<strong>Optimal</strong> <strong>Trees</strong> <strong>and</strong> <strong>Branchings</strong><br />
MST Problem Description<br />
The MST problem arises from the area of network design: connecting nodes at<br />
minimal cost / distance / ...<br />
Given: A connected graph G = (V, E) with positive edge costs c_e > 0 for all e ∈ E.<br />
Sought: An edge set T ⊆ E such that (V, T) is connected and ∑_{e ∈ T} c_e is minimal.<br />
If (V, T) is a tree, it is called a minimum spanning tree (MST).<br />
Observation 1<br />
An optimal solution for the above problem is always a (minimum spanning)<br />
tree.<br />
What about...<br />
...calculating all spanning trees and taking the one of minimum weight?<br />
<strong>Optimal</strong> <strong>Trees</strong> <strong>and</strong> <strong>Branchings</strong><br />
MAX-FOREST<br />
Total enumeration does not make any sense, because:<br />
Cayley: there are n^{n−2} spanning trees in K_n.<br />
Assume we are able to enumerate 10^6 trees per second:<br />
with n = 30 this would take 30^{28}/10^6 seconds ≈ 7.25 · 10^{27} years.<br />
MAX-FOREST: find a forest of maximal weight (with positive edge costs, a<br />
forest of minimal weight is trivially the empty set, so minimization is not useful here).<br />
Definition 2<br />
Problem classes P and Q are called equivalent if there is a linear<br />
transformation between P and Q, i.e., there are linear-time functions f, g<br />
with:<br />
f transforms instances x ∈ P into instances y ∈ Q,<br />
g transforms solutions z of instance f(x) into solutions of instance x ∈ P,<br />
z optimal in f(x) ⟺ g(z) optimal in x.<br />
<strong>Optimal</strong> <strong>Trees</strong> <strong>and</strong> <strong>Branchings</strong><br />
MST ≃ MAX-FOREST<br />
Lemma 3<br />
MST <strong>and</strong> MAX-FOREST are equivalent.<br />
Proof. Let G be connected and weighted, and let A be an algorithm computing<br />
a MAX-FOREST.<br />
Set M := max{ |c_e| : e ∈ E } + 1 and c′_e := M − c_e for all e ∈ E.<br />
Use algorithm A to compute a MAX-FOREST F of G with respect to c′_e.<br />
Since G is connected and c′_e > 0 for all e ∈ E, F is a tree.<br />
Every spanning tree has exactly n − 1 edges, so maximizing<br />
∑_{e ∈ F} (M − c_e) = (n − 1)M − ∑_{e ∈ F} c_e amounts to minimizing ∑_{e ∈ F} c_e.<br />
Hence F is an MST for G with the original weights c_e.<br />
<strong>Optimal</strong> <strong>Trees</strong> <strong>and</strong> <strong>Branchings</strong><br />
MST ↔ MAX-FOREST<br />
Proof (cont’d). Let now A ′ be an algorithm that computes an MST. The<br />
goal is to find a MAX-FOREST.<br />
Regard the complete graph K_n and set M := n · (max{ |c_e| : e ∈ E } + 1) and<br />
c′_e := −c_e for all e ∈ E with c_e > 0, and c′_e := M otherwise<br />
(in particular for all edges of K_n that are not in E).<br />
A′ computes an MST T for K_n with weights c′_e.<br />
Obviously, T \ { e ∈ T : c′_e = M } is a MAX-FOREST in G w.r.t. the original<br />
weights c_e.<br />
Alternatively, we might compute the components of G (negative edges<br />
are removed immediately), complement the edge weights <strong>and</strong> apply<br />
algorithm A ′ on the components.<br />
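Both directions of the proof can be sketched in Python. This is an illustrative sketch, not from the slides: `max_forest` stands in for algorithm A (here a Kruskal-style greedy over positive-weight edges), and `mst_via_max_forest` performs the complementation c′_e = M − c_e from the first direction.

```python
def max_forest(n, edges):
    # Greedy maximum-weight forest: take edges in decreasing weight order,
    # keep an edge iff it joins two different components.
    parent = list(range(n))
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x
    forest = []
    for u, v, w in sorted(edges, key=lambda e: -e[2]):
        if w <= 0:
            break  # non-positive edges cannot increase the forest weight
        ru, rv = find(u), find(v)
        if ru != rv:
            parent[ru] = rv
            forest.append((u, v, w))
    return forest

def mst_via_max_forest(n, edges):
    # Complement the weights: c'_e = M - c_e with M > max |c_e|, so a
    # maximum-weight forest w.r.t. c' is a spanning tree minimal w.r.t. c.
    M = max(abs(w) for _, _, w in edges) + 1
    complemented = [(u, v, M - w) for u, v, w in edges]
    return [(u, v) for u, v, _ in max_forest(n, complemented)]
```

On a 4-cycle with weights 1, 2, 3, 4 this drops exactly the heaviest edge.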
<strong>Optimal</strong> <strong>Trees</strong> <strong>and</strong> <strong>Branchings</strong><br />
A General Greedy Approach<br />
We give a general algorithm that labels edges either blue or red . At any time<br />
during the procedure, the following invariant is satisfied:<br />
Among all MSTs, there is one containing<br />
all blue edges <strong>and</strong> no red edge.<br />
During the procedure, we use the following labeling rules:<br />
blue rule (cut property):<br />
Choose a cut that does not contain any blue edge. Choose the shortest<br />
unlabeled edge in this cut <strong>and</strong> label it blue .<br />
red rule (cycle property):<br />
Choose a cycle that does not contain any red edge. Choose the longest<br />
unlabeled edge in this cycle <strong>and</strong> label it red .<br />
<strong>Optimal</strong> <strong>Trees</strong> <strong>and</strong> <strong>Branchings</strong><br />
MST(G, c)<br />
MST(G, c)<br />
1. All edges are unlabeled.<br />
2. Apply rules blue <strong>and</strong> red until all edges are labeled.<br />
3. The blue edges form an MST.<br />
Proposition 4<br />
MST(G, c) computes an MST.<br />
Proof.<br />
First, we show that the above invariant holds throughout. Trivially, it holds at the<br />
beginning. Thus, it remains to show that applying the blue and red rules does<br />
not violate it.<br />
<strong>Optimal</strong> <strong>Trees</strong> <strong>and</strong> <strong>Branchings</strong><br />
Correctness of MST(G, c) (1)<br />
Correctness of the blue rule<br />
Let T be an MST satisfying the invariant, and let e ∈ E be labeled blue in<br />
Step 2. If e ∈ T, there is nothing to show. So let e ∉ T and let δ(W) be the<br />
associated cut. Then there is a path in T connecting the two nodes incident<br />
with e, and at least one of its edges e′ is in δ(W).<br />
[Figure: the cut δ(W) with e crossing it and the tree edge e′ in the same cut.]<br />
As the invariant holds, no edge of T is labeled red, so e′ is unlabeled.<br />
According to the blue rule, c_{e′} ≥ c_e. Replacing e′ by e in T yields another<br />
MST satisfying the invariant; in particular, c_{e′} = c_e.<br />
<strong>Optimal</strong> <strong>Trees</strong> <strong>and</strong> <strong>Branchings</strong><br />
Correctness of MST(G, c) (2)<br />
Correctness of the red rule<br />
Let e and T be as above. If e ∉ T, the invariant is trivially satisfied. Therefore,<br />
assume e ∈ T. If we remove e from T, T is no longer connected. Thus, the cycle<br />
used in the red rule must contain another edge e′ with endnodes in both<br />
subtrees.<br />
[Figure: the cycle with e ∈ T and the reconnecting edge e′.]<br />
e′ has not been labeled yet, and according to the red rule, c_{e′} ≤ c_e holds.<br />
As above, replacing e by e′ yields a new MST satisfying the invariant.<br />
<strong>Optimal</strong> <strong>Trees</strong> <strong>and</strong> <strong>Branchings</strong><br />
Correctness of MST(G, c) (3)<br />
Correctness of MST(G, c)<br />
It remains to show that all edges get labeled during MST(G, c):<br />
Let e = (u, v) be unlabeled. According to the invariant, the blue edges form a<br />
forest (possibly containing isolated nodes).<br />
(Case 1) u and v are part of the same blue tree: apply the red rule to the cycle<br />
consisting of e and the path connecting u and v within that tree.<br />
(Case 2) u and v are part of different blue trees: the nodes of the tree<br />
containing u (or of the tree containing v) induce a cut δ(W) without blue<br />
edges. Some edge of this cut (not necessarily e) is labeled blue .<br />
Note that, in the above algorithm, the labeling is non-deterministic. In order<br />
to obtain an efficient implementation, we have to specify the application of the<br />
labeling rules more precisely.<br />
<strong>Optimal</strong> <strong>Trees</strong> <strong>and</strong> <strong>Branchings</strong><br />
Matroids<br />
Definition 5 (Matroid)<br />
A matroid is a pair (E, U), where U is a subset system of E which satisfies<br />
the following properties:<br />
1. ∅ ∈ U<br />
2. A ⊆ B, B ∈ U =⇒ A ∈ U<br />
3. A, B ∈ U, |A| < |B| =⇒ ∃ x ∈ B \ A with A ∪ {x} ∈ U (exchange property)<br />
The elements of E are called elements of the matroid, elements of U are<br />
called independent subsets of (E, U).<br />
Theorem 6<br />
Let (E, U) be a matroid. Then, the canonical Greedy algorithm is optimal w.r.t.<br />
any weighting function w : E → R.<br />
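A sketch of the canonical greedy from Theorem 6, under the assumption that the matroid is given by an independence oracle; the function names are illustrative. For a maximum-weight independent set, elements of non-positive weight are skipped.

```python
def matroid_greedy(elements, weight, independent):
    # Canonical greedy: scan elements by decreasing weight and keep each
    # element whose addition preserves independence. Optimal on matroids.
    solution = set()
    for x in sorted(elements, key=weight, reverse=True):
        if weight(x) > 0 and independent(solution | {x}):
            solution.add(x)
    return solution
```

For the graphic matroid (next slide), `independent` would test that an edge set is acyclic, which recovers Kruskal's algorithm.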
<strong>Optimal</strong> <strong>Trees</strong> <strong>and</strong> <strong>Branchings</strong><br />
MST <strong>and</strong> Matroids<br />
Theorem 7<br />
The subset system of MST is a matroid.<br />
Proof.<br />
Properties 1 and 2 are clear. It remains to show the exchange property.<br />
Regard two acyclic edge sets A, B ⊆ E with |A| < |B|. Let V_1, ..., V_k ⊆ V be the<br />
connected components of G = (V, A).<br />
As |A| < |B| ≤ n − 1, A is not a spanning tree. Therefore, k ≥ 2.<br />
As A is acyclic, it has exactly |V_i| − 1 edges within every component V_i;<br />
as B is acyclic, it has at most |V_i| − 1 edges within a component V_i.<br />
Therefore, B contains at most ∑_{i=1}^{k} (|V_i| − 1) = |A| edges within the<br />
components.<br />
As |A| < |B|, there is an edge e ∈ B connecting two different components<br />
V_i, V_j. e cannot create a cycle in A. It follows that A ∪ {e} ∈ U.<br />
<strong>Optimal</strong> <strong>Trees</strong> <strong>and</strong> <strong>Branchings</strong><br />
Pairwise different edge costs<br />
For simplicity, we assume pairwise different edge costs <strong>and</strong> eliminate this<br />
assumption later.<br />
Cut Property: An edge e is definitely part of every MST if it is the edge of<br />
minimum cost in the cut induced by a non-trivial set S. Therefore, it may<br />
be safely inserted.<br />
Cycle Property: Let C be an arbitrary cycle in G, and let e be the edge of<br />
maximum cost in C. Then no MST contains e. Therefore, it may be<br />
safely removed.<br />
Proof by means of an exchange argument: every edge e′ ∈ T ∩ δ(S) that does<br />
not have minimum cost within the cut δ(S) may be replaced by an edge<br />
e ∈ δ(S) with c_e < c_{e′}, yielding a spanning tree of smaller cost.<br />
The argument for the cycle property is analogous.<br />
<strong>Optimal</strong> <strong>Trees</strong> <strong>and</strong> <strong>Branchings</strong><br />
Pairwise different edge costs (cont’d)<br />
We have assumed pairwise different edge costs for all edges in E.<br />
How is it possible to conclude that the algorithms are still correct<br />
even if some edge costs are identical?<br />
Idea: modify edge costs by very small amounts δ such that all edge<br />
costs are pairwise different.<br />
The modification must be small enough in order to not change the<br />
relative ordering of edges that originally had different costs.<br />
The modification is only used as a tie-breaker for identical edge costs.<br />
If a tree T is more expensive than a tree T ′ with respect to its original<br />
costs then it is more expensive for the modified costs <strong>and</strong> vice versa.<br />
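In an implementation the perturbation need not be carried out numerically: sorting edges by the pair (cost, index) acts exactly like adding i · δ for an infinitesimal δ > 0, breaking ties among equal costs while never reordering originally different costs. A hypothetical helper:

```python
def perturbed_order(costs):
    # Sort edge indices by (cost, index): ties in cost are broken by index,
    # while edges of different cost keep their relative order -- the same
    # effect as adding i * delta for an infinitesimally small delta > 0.
    return sorted(range(len(costs)), key=lambda i: (costs[i], i))
```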
<strong>Optimal</strong> <strong>Trees</strong> <strong>and</strong> <strong>Branchings</strong><br />
Duality of blue rule <strong>and</strong> red rule<br />
Observation 8<br />
The blue rule is the dual of the red rule <strong>and</strong> vice versa.<br />
Declarative Programming:<br />
minimize ∑_{e ∈ E : e is blue} c_e s.t. {e ∈ E : e is blue} is a tree<br />
maximize ∑_{e ∈ E : e is red} c_e s.t. {e ∈ E : e is not red} is a tree<br />
Note: These are not dual Linear Programs.<br />
<strong>Optimal</strong> <strong>Trees</strong> <strong>and</strong> <strong>Branchings</strong><br />
Blue rule or red rule?<br />
An MST contains n − 1 edges; usually this is a very small subset of the<br />
edges from E.<br />
Regarding a complete graph K_n, there are (n choose 2) ∈ O(n^2) edges.<br />
The blue rule labels exactly n − 1 edges blue ⇒ O(n) labelings.<br />
The red rule labels up to (n choose 2) − (n − 1) edges red ⇒ O(n^2) labelings.<br />
⇒ There are no efficient algorithms based on the red rule.<br />
<strong>Optimal</strong> <strong>Trees</strong> <strong>and</strong> <strong>Branchings</strong><br />
Borůvka’s Algorithm<br />
Borůvka’s algorithm is believed to be the first MST algorithm (1926). It uses<br />
exclusively the blue rule, but slightly modified.<br />
Borůvka(G, c)<br />
1. Initialize n trivial blue trees consisting of one node each.<br />
2. As long as there is more than one blue tree, for each of the blue trees<br />
choose the minimally incident edge <strong>and</strong> label it blue .<br />
3. The blue edges form an MST.<br />
Note that the above algorithm only works correctly with pairwise different edge<br />
weights c_e ; otherwise, cycles might occur. Specifying a fixed ordering of<br />
edges of equal weight (as a tie-breaker, see above) resolves this.<br />
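A minimal Python sketch of Borůvka's algorithm, assuming a connected graph with pairwise different edge weights; the representation (edge list of `(u, v, w)` triples over nodes `0..n-1`) is chosen for illustration, not fixed by the slides.

```python
def boruvka(n, edges):
    # edges: list of (u, v, w) with pairwise different weights w.
    parent = list(range(n))
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x
    mst, components = [], n
    while components > 1:
        # Step 2: for each blue tree, find its minimally incident edge.
        cheapest = {}
        for u, v, w in edges:
            ru, rv = find(u), find(v)
            if ru == rv:
                continue
            for r in (ru, rv):
                if r not in cheapest or w < cheapest[r][2]:
                    cheapest[r] = (u, v, w)
        for u, v, w in cheapest.values():
            ru, rv = find(u), find(v)
            if ru != rv:          # skip if both endpoints were merged already
                parent[ru] = rv
                mst.append((u, v, w))
                components -= 1
    return mst
```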
<strong>Optimal</strong> <strong>Trees</strong> <strong>and</strong> <strong>Branchings</strong><br />
Borůvka Example<br />
[Figure: three iterations of Borůvka's algorithm on an example graph.]<br />
Blue edges are tree edges. A directed edge indicates that the target<br />
component was chosen by the source component during an iteration.<br />
<strong>Optimal</strong> <strong>Trees</strong> <strong>and</strong> <strong>Branchings</strong><br />
Correctness of Borůvka’s Algorithm<br />
Regard an arbitrary edge e = (v, w), that is inserted into T during an iteration of Borůvka’s<br />
algorithm.<br />
Let S be the node set that is reachable from v via edges of T shortly before e is inserted.<br />
Then v ∈ S and w ∈ V \ S (otherwise a cycle would emerge by adding e).<br />
No edge from the cut between S and V \ S has been labeled so far; otherwise, it would<br />
have been inserted without generating a cycle.<br />
As e is the cheapest edge in the cut δ(S), it is inserted correctly due to the cut property.<br />
The output T does not contain any cycles, because in Step 2 cycles are explicitly avoided.<br />
(V, T) is connected: if it were not, then, as G is connected, there would be at least one edge<br />
in G between two different trees of (V, T), and the algorithm would have inserted the one<br />
of minimum cost.<br />
<strong>Optimal</strong> <strong>Trees</strong> <strong>and</strong> <strong>Branchings</strong><br />
Kruskal’s Algorithm (1956)<br />
Idea: Start with an empty edge set and successively add edges in order of<br />
increasing cost, in such a way that no cycles emerge.<br />
Kruskal(G, c)<br />
1. Initialize n blue trees consisting of one node each.<br />
2. Sort E = {e 1 , e 2 , ... , e m } increasingly: c e1 ≤ c e2 ≤ · · · ≤ c em .<br />
3. FOR i = 1 TO m DO<br />
IF the endnodes u <strong>and</strong> v of e i = (u, v) are in the same tree<br />
THEN label e i red<br />
ELSE label e i blue<br />
4. The blue edges form an MST T .<br />
Alternative Step 3:<br />
FOR i = 1 TO m DO<br />
IF T ∪ {e i } does not contain a cycle THEN T := T ∪ {e i }<br />
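The steps above can be sketched in Python, with the cycle test realized by a simple union-find that merges the smaller component into the larger one; the edge-list representation is an illustrative assumption.

```python
def kruskal(n, edges):
    # edges: list of (u, v, w); returns a minimum spanning forest.
    parent = list(range(n))
    size = [1] * n
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x
    def union(a, b):
        ra, rb = find(a), find(b)
        if ra == rb:
            return False          # u and v already in the same blue tree
        if size[ra] < size[rb]:
            ra, rb = rb, ra
        parent[rb] = ra           # merge smaller component into larger
        size[ra] += size[rb]
        return True
    mst = []
    for u, v, w in sorted(edges, key=lambda e: e[2]):  # Step 2: sort by cost
        if union(u, v):           # no cycle emerges: label the edge blue
            mst.append((u, v, w))
    return mst
```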
<strong>Optimal</strong> <strong>Trees</strong> <strong>and</strong> <strong>Branchings</strong><br />
Kruskal Example<br />
[Figure: two stages of Kruskal's algorithm on the example graph.]<br />
<strong>Optimal</strong> <strong>Trees</strong> <strong>and</strong> <strong>Branchings</strong><br />
Correctness of Kruskal’s Algorithm<br />
Regard an arbitrary edge e = (v, w), that is inserted into T during an iteration of Kruskal’s<br />
algorithm.<br />
Let S be the node set that is reachable from v via edges of T shortly before e is inserted.<br />
Then v ∈ S and w ∈ V \ S (otherwise a cycle would emerge by adding e).<br />
No edge from the cut between S and V \ S has been labeled so far; otherwise, it would<br />
have been inserted without generating a cycle.<br />
As e is the cheapest edge in the cut δ(S), it is inserted correctly due to the cut property.<br />
The output T does not contain any cycles, because in every iteration cycles are explicitly<br />
avoided.<br />
(V, T) is connected: if it were not, then, as G is connected, there would be at least one edge<br />
in G between two connected components of (V, T), and the algorithm would have inserted<br />
the one of minimum cost.<br />
<strong>Optimal</strong> <strong>Trees</strong> <strong>and</strong> <strong>Branchings</strong><br />
Reverse Kruskal<br />
Reverse-Kruskal(G, c)<br />
1. Label all edges in E blue , i.e., set T := E.<br />
2. Sort E = {e_1, e_2, ..., e_m} decreasingly: c_{e_1} ≥ c_{e_2} ≥ · · · ≥ c_{e_m}.<br />
3. FOR i = 1 TO m DO<br />
IF T \ {e_i} is connected THEN label e_i red and set T := T \ {e_i}.<br />
4. The blue edges form an MST T.<br />
The above procedure is also called Dual Kruskal due to duality of the blue<br />
<strong>and</strong> red rule.<br />
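Reverse Kruskal can be sketched directly from the steps above; the connectivity test by depth-first search is deliberately naive (O(m(m + n)) in total), matching the slide's point that red-rule algorithms are not efficient.

```python
def reverse_kruskal(n, edges):
    # edges: list of (u, v, w) of a connected graph. Start with all edges
    # blue; scan by decreasing cost and drop an edge whenever the remaining
    # edges still connect the graph.
    def connected(edge_set):
        adj = {v: [] for v in range(n)}
        for u, v, _ in edge_set:
            adj[u].append(v)
            adj[v].append(u)
        seen, stack = {0}, [0]          # DFS from node 0
        while stack:
            x = stack.pop()
            for y in adj[x]:
                if y not in seen:
                    seen.add(y)
                    stack.append(y)
        return len(seen) == n
    tree = set(edges)
    for e in sorted(edges, key=lambda e: -e[2]):  # Step 2: decreasing cost
        if connected(tree - {e}):
            tree.remove(e)       # e is the most expensive edge on a cycle
    return tree
```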
<strong>Optimal</strong> <strong>Trees</strong> <strong>and</strong> <strong>Branchings</strong><br />
Correctness of Reverse Kruskal<br />
Regard an arbitrary edge e = (v, w) that is removed during the algorithm.<br />
At the time of removal, e is part of a cycle C.<br />
Amongst all edges of C, e is the first one considered, that is, e is the<br />
most expensive edge on C.<br />
Due to the cycle property, the removal is justified.<br />
The output T of the algorithm is connected: at no point is an edge<br />
removed from T that would destroy connectivity (only edges on cycles are<br />
removed).<br />
In the end, T does not contain any cycle, because the most expensive edge<br />
of such a cycle would have been removed when it was considered.<br />
<strong>Optimal</strong> <strong>Trees</strong> <strong>and</strong> <strong>Branchings</strong><br />
Implementation of Kruskal’s Algorithm (1)<br />
Kruskal’s Algorithm has an obvious invariant:<br />
the edge set T reflects a forest in G at any time of the algorithm.<br />
Obviously, the time-critical operation is the test whether a cycle emerges in T ∪ {e_i} upon<br />
the insertion of an edge e_i into T.<br />
A cycle emerges if and only if, for an edge e_i = (u, v), the two nodes u and v are part of<br />
the same connected component of T.<br />
We already know techniques to efficiently compute the connected components of a graph:<br />
BFS and DFS.<br />
These have linear complexity O(m + n).<br />
If we had to compute the CCs from scratch in every iteration of Kruskal's algorithm, this<br />
would amount to a total complexity of O(m^2).<br />
In order to avoid this, we would like to make use of a data structure that supports the<br />
operations of the algorithm efficiently.<br />
<strong>Optimal</strong> <strong>Trees</strong> <strong>and</strong> <strong>Branchings</strong><br />
Implementation of Kruskal’s Algorithm (2)<br />
We would like to make use of a data structure that supports the following<br />
operations efficiently:<br />
Given a set of nodes V of fixed size and an edge set T that grows<br />
stepwise: in every step, an edge is inserted into T; at no time is an edge<br />
removed from T.<br />
As T grows, we would like to know the connected components in<br />
every iteration. That is, for every node v ∈ V, we would like to determine its<br />
connected component efficiently.<br />
If we identify u and v to be part of two separate connected components, we<br />
would like to merge these components efficiently upon insertion of the edge<br />
e = (u, v).<br />
<strong>Optimal</strong> <strong>Trees</strong> <strong>and</strong> <strong>Branchings</strong><br />
Complexity of MST Algorithms<br />
Sorting edges according to their weights: O(m log m), e.g., by Quicksort.<br />
Testing for cycles amounts to the test whether two nodes are in different<br />
connected components, or not.<br />
Ideas:<br />
Test: for every node, get efficiently the connected component it belongs to.<br />
Merge: always move the component of smaller size into the component of<br />
larger size.<br />
Hence, whenever a node is moved, its new component has at least twice the<br />
size of its old one, so every node is moved between components at most<br />
log n times.<br />
Thus, we have O(n log n) in total for the merges, which is also O(m log m)<br />
(see below).<br />
Can be done more efficiently by advanced Disjoint-Set / Union-Find<br />
data structures (in nearly linear time).<br />
<strong>Optimal</strong> <strong>Trees</strong> <strong>and</strong> <strong>Branchings</strong><br />
Complexity of MST Algorithms (cont’d)<br />
As m ≥ n − 1 holds in connected graphs, we can estimate n by m.<br />
As m < n^2, it is log m < log n^2 = 2 log n = O(log n).<br />
Therefore, the total cost is O(m log m) = O(m log n).<br />
<strong>Optimal</strong> <strong>Trees</strong> <strong>and</strong> <strong>Branchings</strong><br />
Prim’s Algorithm<br />
Jarník (1930), Prim (1957), Dijkstra (1959)<br />
Idea: Start from an arbitrary node. In every step, grow the tree by inserting<br />
the node that can be connected at minimal cost.<br />
Prim(G, c, s)<br />
1. Initialize n blue trivial trees (single nodes). Choose start node s.<br />
2. WHILE trivial blue trees exist DO<br />
choose an edge of minimal weight from the cut that is induced by<br />
the nodes of the tree that contains s. Label it blue .<br />
3. The blue edges form an MST T .<br />
<strong>Optimal</strong> <strong>Trees</strong> <strong>and</strong> <strong>Branchings</strong><br />
Correctness of Prim’s Algorithm<br />
During the WHILE loop, let S be the set of nodes of the tree containing s.<br />
In every iteration, the algorithm adds the edge e and a node w to the tree<br />
T such that c_e is minimal among all edges e = (v, w) with v ∈ S and w ∉ S.<br />
According to the cut property, e is part of an MST.<br />
It is easy to see that, eventually, a spanning tree is generated.<br />
<strong>Optimal</strong> <strong>Trees</strong> <strong>and</strong> <strong>Branchings</strong><br />
Implementation of Prim’s Algorithm<br />
In every iteration, it is crucial to efficiently decide which node to insert<br />
into the tree. (similar to Dijkstra’s Algorithm)<br />
Amongst all nodes that are not part of the tree, we search for the one that<br />
can be connected to the tree by an edge of minimum cost.<br />
Therefore, it is convenient to store, for every node that is not part of the<br />
tree, the best possible cost of connecting it.<br />
We save this value as a key in every node. If the node cannot (yet) be<br />
connected to the tree by an edge, we set the key to +∞ (e.g. MAX_INT,<br />
etc.).<br />
In every iteration, we have to find a node that is not part of the tree with<br />
minimum key. This node is inserted.<br />
Then, the key values have to be updated.<br />
This task is supported by a specific data structure, a priority queue.<br />
<strong>Optimal</strong> <strong>Trees</strong> <strong>and</strong> <strong>Branchings</strong><br />
Priority queues<br />
A priority queue Q is an abstract data structure managing a set of objects of type T .<br />
Every element in Q has a key of an ordered number type NT (N, Z, ...)<br />
A priority queue supports the following operations:<br />
void create-pq(): create an empty priority queue Q<br />
void insert(T & x, NT y): insert the element x with key value y = key[x] into Q.<br />
T& find-min(): find the element x with the lowest key value <strong>and</strong> return reference to it.<br />
void delete-min(): remove the element with smallest key value from Q.<br />
void decrease-key(T x, NT y): decrease the key value of x to the new value y.<br />
(Precondition: key[x] > y)<br />
bool empty(): returns TRUE if Q is empty.<br />
Remark: Analogously, the data structure may deliver the element with the largest key value.<br />
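Assuming the minimum variant of the operations above, a sketch on top of Python's heapq module, which has no native decrease-key; pushing a fresh entry and skipping stale ones on extraction (lazy deletion) emulates it.

```python
import heapq

class PriorityQueue:
    # Minimal priority queue on Python's heapq with lazy deletion:
    # decrease-key pushes a fresh entry; stale entries are skipped later.
    def __init__(self):                    # create-pq
        self._heap = []
        self._key = {}
    def insert(self, x, y):
        self._key[x] = y
        heapq.heappush(self._heap, (y, x))
    def decrease_key(self, x, y):
        assert self._key[x] > y            # precondition: key[x] > y
        self.insert(x, y)                  # old entry stays behind as garbage
    def _prune(self):
        # Drop entries whose key is stale or whose element was deleted.
        while self._heap and self._key.get(self._heap[0][1]) != self._heap[0][0]:
            heapq.heappop(self._heap)
    def find_min(self):                    # assumes the queue is non-empty
        self._prune()
        return self._heap[0][1]
    def delete_min(self):
        x = self.find_min()
        del self._key[x]
    def empty(self):
        self._prune()
        return not self._heap
```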
<strong>Optimal</strong> <strong>Trees</strong> <strong>and</strong> <strong>Branchings</strong><br />
Implementation of Prim’s Algorithm<br />
Prim’s Algorithm using a priority queue Q<br />
Prim (G = (V , E), c, s)<br />
(1) FORALL v ∈ V DO<br />
(2) key[v] = ∞;<br />
(3) pred[v] = null; // predecessor of v in the MST<br />
(4) key[s] = 0;<br />
(5) PriorityQueue Q.create-pq();<br />
(6) FORALL v ∈ V DO<br />
(7) Q.insert (v, key[v]);<br />
(8) WHILE !Q.empty() DO<br />
(9) v = Q.find-min(); Q.delete-min();<br />
(10) FORALL u ∈ Adj[v] DO<br />
(11) IF u ∈ Q and c({v, u}) < key[u] THEN<br />
(12) pred[u] = v;<br />
(13) key[u] = c({v, u});<br />
(14) Q.decrease-key(u, key[u]);<br />
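The pseudocode translates almost line by line to Python; since heapq lacks decrease-key, this sketch re-pushes a node with its new key and skips stale heap entries, which keeps the O(m log n) bound. The adjacency-dict representation is an illustrative assumption.

```python
import heapq

def prim(adj, s):
    # adj: dict mapping node -> list of (neighbor, cost); returns pred map.
    key = {v: float('inf') for v in adj}   # lines (1)-(2)
    pred = {v: None for v in adj}          # line (3): predecessor in the MST
    key[s] = 0                             # line (4)
    in_tree = set()
    heap = [(0, s)]
    while heap:                            # line (8)
        k, v = heapq.heappop(heap)         # line (9): find-min + delete-min
        if v in in_tree or k > key[v]:
            continue                       # stale entry: a better key exists
        in_tree.add(v)
        for u, c in adj[v]:                # lines (10)-(14)
            if u not in in_tree and c < key[u]:
                pred[u] = v                # best known connection of u
                key[u] = c
                heapq.heappush(heap, (c, u))  # lazy decrease-key
    return pred
```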
<strong>Optimal</strong> <strong>Trees</strong> <strong>and</strong> <strong>Branchings</strong><br />
Complexity of Prim’s Algorithm<br />
lines (1)-(3): Initialization in O(n)<br />
lines (4) <strong>and</strong> (5) in O(1)<br />
lines (6) <strong>and</strong> (7): n insert-operations<br />
line (8): exactly n passes through the loop<br />
line (9): in total n find-min and delete-min operations<br />
lines (10)-(14): every edge is touched O(1) times, that is O(m) work, plus at most m decrease-key operations.<br />
line (11): the test if u ∈ Q can be done in O(1) with a boolean auxiliary array (must be<br />
updated upon insertion / deletion).<br />
If we implement the priority queue as a binary heap, find-min runs in O(1),<br />
and insert, delete-min, and decrease-key in O(log n) each.<br />
In total, this amounts to a complexity of O(m log n) .<br />
<strong>Optimal</strong> <strong>Trees</strong> <strong>and</strong> <strong>Branchings</strong><br />
Round Robin Algorithm<br />
The Round Robin algorithm is one of the most efficient approaches to the computation<br />
of MSTs in sparse graphs. It is based on the blue rule and quite similar<br />
to Borůvka’s algorithm.<br />
Round-Robin(G, c)<br />
1. Initialize n blue trees consisting of one node each.<br />
2. As long as there are fewer than n − 1 blue edges, choose a blue tree T′,<br />
compute the edge e′ of minimum cost in δ(T′), and color it blue.<br />
3. The blue edges form an MST.<br />
It can be implemented to run in O(m log log n): in every iteration, we choose<br />
the blue tree with the least number of nodes.<br />
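The scheme above can be sketched in Python with a union-find structure and a size-keyed heap, so that a smallest blue tree is processed in every iteration. This is only a simple sketch; the refined data structures behind the O(m log log n) bound are not reproduced, and all names are illustrative.

```python
import heapq

def round_robin_mst(n, edges):
    """Round-robin sketch: repeatedly pick a smallest blue tree and color
    its cheapest outgoing edge blue.

    n:     number of nodes, labeled 0..n-1 (graph assumed connected).
    edges: list of (u, v, cost) tuples.
    Returns the set of blue (MST) edge indices.
    """
    parent = list(range(n))
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]   # path halving
            x = parent[x]
        return x

    inc = [[] for _ in range(n)]            # per-component incident edge indices
    for i, (u, v, c) in enumerate(edges):
        inc[u].append(i)
        inc[v].append(i)

    size = [1] * n
    heap = [(1, v) for v in range(n)]       # (component size, representative)
    heapq.heapify(heap)
    blue = set()
    while len(blue) < n - 1:
        sz, r = heapq.heappop(heap)
        if find(r) != r or size[r] != sz:
            continue                        # stale component entry
        # minimum-cost edge in delta(T'): exactly one endpoint inside the tree
        best = min((i for i in inc[r]
                    if find(edges[i][0]) != find(edges[i][1])),
                   key=lambda i: edges[i][2])
        blue.add(best)
        u, v, _ = edges[best]
        a, b = find(u), find(v)
        if size[a] < size[b]:
            a, b = b, a
        parent[b] = a                       # merge smaller tree into larger
        inc[a] += inc[b]
        size[a] += size[b]
        heapq.heappush(heap, (size[a], a))
    return blue
```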
MST Runtimes<br />
[Figure: runtime comparison chart of MST algorithms.]<br />
MST Algorithms Complexity<br />
Deterministic comparison based MST algorithms<br />
Algorithm: Complexity<br />
Jarník, Prim, Dijkstra, Kruskal, Borůvka: O(m log n)<br />
Cheriton-Tarjan (1976), Yao (1975): O(m log log n)<br />
Fredman-Tarjan (1987): O(m β(m, n))<br />
Gabow-Galil-Spencer-Tarjan (1986): O(m log β(m, n))<br />
Chazelle (2000): O(m α(m, n))<br />
The holy grail would be O(m).<br />
Notable approaches<br />
Dixon-Rauch-Tarjan (1992): O(m) verification<br />
Karger-Klein-Tarjan (1995): O(m) randomized<br />
On planar graphs, O(m) is possible.<br />
Parallel algorithms achieve O(log n) with a linear number of processors.<br />
An Application: 1-trees<br />
MSTs can be used to compute lower bounds on the optimal<br />
length of TSP tours. An edge set T is a Hamilton tour if and only if:<br />
The degree of the start node 1 w.r.t. T is 2.<br />
The edges of T not incident with node 1 form a spanning tree on the nodes {2, 3, ..., n}.<br />
The degree of the nodes {2, 3, ..., n} w.r.t. T is 2.<br />
Dropping the last condition, and computing a minimum-cost edge set satisfying the<br />
first two conditions, we obviously get a lower bound on the length of a shortest<br />
tour. Such edge sets are also called 1-trees.<br />
Onetree(G, c)<br />
1. Compute an MST T for the node set {2, 3, ..., n}. Let c_T be its cost.<br />
2. Let e_1 be the shortest and e_2 the second shortest edge in G incident<br />
with node 1.<br />
3. T ∪ {e_1, e_2} is an optimal 1-tree of value c_T + c(e_1) + c(e_2).<br />
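The Onetree procedure can be sketched as follows, using a simple O(n²) Prim scan for the MST step. Node 0 plays the role of node 1 above; the function name and the cost-matrix input format are assumptions for illustration.

```python
def one_tree_bound(n, cost):
    """Compute the 1-tree lower bound for a complete TSP instance.

    n:    number of nodes 0..n-1.
    cost: symmetric cost matrix, cost[i][j] for i != j.
    Returns c_T + c(e_1) + c(e_2): the cost of an MST on nodes 1..n-1
    plus the two cheapest edges incident with node 0.
    """
    INF = float('inf')
    key = [INF] * n
    in_tree = [False] * n
    key[1] = 0
    c_T = 0
    # Prim's algorithm restricted to the nodes {1, ..., n-1}
    for _ in range(n - 1):
        v = min((u for u in range(1, n) if not in_tree[u]),
                key=lambda u: key[u])
        in_tree[v] = True
        c_T += key[v]
        for u in range(1, n):
            if not in_tree[u] and cost[v][u] < key[u]:
                key[u] = cost[v][u]
    # the two cheapest edges incident with node 0
    e1, e2 = sorted(cost[0][u] for u in range(1, n))[:2]
    return c_T + e1 + e2
```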
<strong>Optimal</strong> <strong>Trees</strong> <strong>and</strong> <strong>Branchings</strong> – Part II<br />
The directed case: Maximum <strong>Branchings</strong><br />
Clearly, a minimum branching would not make any sense with positive edge<br />
weights, as the optimal branching would simply be empty.<br />
Maximum Branching<br />
Given a digraph D = (V , A) with edge weights c_e for each e ∈ A, we search<br />
for a branching B ⊆ A of D that maximizes ∑_{e∈B} c_e.<br />
Recall: A branching B ⊆ A is an edge set such that the directed graph (V , B)<br />
is acyclic and |δ⁻(v)| ≤ 1 for all v ∈ V.<br />
Idea 1<br />
Sort edges by decreasing weight and use Kruskal’s algorithm:<br />
add edges of maximum weight, always preserving the branching property<br />
(no cycles, |δ⁻(v)| ≤ 1 for all v ∈ V).<br />
Maximum <strong>Branchings</strong><br />
[Figure: three copies of the path 0 → 1 → 2 → 3 with edge weights 2 and 3,<br />
showing that greedily adding maximum-weight edges yields a suboptimal branching.]<br />
Bad idea!<br />
Delayed Decision<br />
Idea 2: Reduce Complexity by Delayed Decision<br />
Fundamental technique for the solution of complex problems:<br />
Postpone critical decisions until you are able to make them irreversibly, by<br />
transforming the original problem into easier problems by applying smart<br />
reduction techniques, and<br />
generating a solution to the original problem from the (greedy) solutions<br />
of the reduced problems.<br />
Edmonds’ Branching Algorithm<br />
Some terminology<br />
s : A → V determines the start node of an edge e ∈ A.<br />
t : A → V determines the end node of an edge e ∈ A.<br />
c : A → R determines the weight of an edge e ∈ A.<br />
u →_B v means: there is a directed u-v-path in B.<br />
Critical Graphs<br />
[Figure: example digraph with edge weights, used below to illustrate critical<br />
subgraphs and their cycles.]<br />
Critical Graphs (2)<br />
Definition 9 (Critical edge / subgraph)<br />
Let D = (V , A) be a directed graph with edge weights c e = c(e), ∀e ∈ A.<br />
(a) An edge e ∈ A is called critical, if c(e) > 0 <strong>and</strong><br />
t(e ′ ) = t(e) ⇒ c(e ′ ) ≤ c(e), for all e ′ ∈ A.<br />
(b) A subgraph H ⊆ D is called critical, if it consists exclusively of critical<br />
edges, <strong>and</strong> every node is an end node of at most one of these edges,<br />
<strong>and</strong> H is maximal by inclusion w.r.t. this property.<br />
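The definition suggests a direct computation of a critical subgraph: for every node, keep one maximum-weight positive incoming edge. A minimal sketch (the function name and the `(s, t, c)` edge format are illustrative):

```python
def critical_graph(edges):
    """Compute a critical subgraph H per Definition 9.

    edges: list of (s, t, c) arcs of the digraph.
    Returns the chosen arcs: for every node, one maximum-weight incoming
    arc of positive weight (ties broken by first occurrence).
    """
    best = {}                       # end node -> its critical incoming arc
    for e in edges:
        s, t, c = e
        if c > 0 and (t not in best or c > best[t][2]):
            best[t] = e
    return list(best.values())
```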
Lemma 10<br />
An acyclic critical graph H is a maximum branching.<br />
Proof. Obviously, H is a branching. Let B be an arbitrary branching. For<br />
every node v ∈ V , we have c(B ∩ {e | t(e) = v}) ≤ c(H ∩ {e | t(e) = v}).<br />
Summing up over v ∈ V yields c(B) ≤ c(H).<br />
Critical Graphs (3)<br />
Lemma 11<br />
Let H be a critical graph. Every node of H is part of at most one cycle.<br />
Proof. Assume some node lies on two distinct cycles. Then there must be a<br />
node u with |δ⁻(u)| > 1. This contradicts the definition of a critical graph.<br />
Lemma 12<br />
Let B be a branching, and u, v, w three nodes. If u →_B v and w →_B v, then<br />
either u →_B w or w →_B u.<br />
Proof. W.l.o.g., let u →_B v be the shorter path. Then w →_B v must contain all<br />
edges of this path; otherwise, one node would be the end node of two edges.<br />
Therefore, w →_B u →_B v.<br />
Critical Graphs (4)<br />
Definition 13<br />
Let B be a branching, and e ∉ B. e is called feasible w.r.t. B if<br />
B′ := B ∪ {e} \ {f | f ∈ B and t(f ) = t(e)} is a branching as well.<br />
Lemma 14<br />
Let B be a branching, and e ∈ A \ B. Then e is feasible w.r.t. B if and only if<br />
there is no path from t(e) to s(e) in B.<br />
Proof. B′ fails to be a branching if and only if the insertion of e generates a<br />
cycle, which happens if and only if there is a path from t(e) to s(e) in B.<br />
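Lemma 14 turns feasibility into a plain reachability test, which can be sketched as follows (function name and edge format are illustrative):

```python
def is_feasible(branching, e):
    """Test feasibility of arc e w.r.t. a branching via Lemma 14:
    e = (s, t, c) is feasible iff the branching contains no directed
    t-s-path.

    branching: list of (s, t, c) arcs forming a branching.
    """
    succ = {}
    for u, v, _ in branching:
        succ.setdefault(u, []).append(v)
    s, t, _ = e
    # depth-first search from t(e); e is feasible iff s(e) is unreachable
    frontier, seen = [t], {t}
    while frontier:
        u = frontier.pop()
        for v in succ.get(u, []):
            if v == s:
                return False
            if v not in seen:
                seen.add(v)
                frontier.append(v)
    return True
```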
Lemma 15<br />
Let B be a branching, and C ⊆ A a cycle with the following property: no edge<br />
of C \ B is feasible w.r.t. B. Then |C \ B| = 1.<br />
Critical Graphs (5)<br />
Proof.<br />
Clearly, |C \ B| > 0, because a branching does not contain any cycle.<br />
For contradiction, we assume that |C \ B| = k ≥ 2.<br />
Let C \ B = {e_1, e_2, ..., e_k}, where the e_i occur in this order within C.<br />
As no e_i is feasible, there are paths t(e_i) →_B s(e_i) (cf. Lemma 14).<br />
Along C, we also have t(e_{i−1}) →_B s(e_i), since the segment of C between<br />
them contains only edges of B. From Lemma 12, it follows that<br />
either t(e_{i−1}) →_B t(e_i), or t(e_i) →_B t(e_{i−1}).<br />
Suppose t(e_{i−1}) →_B t(e_i) →_B s(e_i). The path t(e_{i−1}) →_B s(e_i) is unique and<br />
completely contained in C (by assumption).<br />
Therefore, t(e_{i−1}) →_{B∩C} t(e_i) →_{B∩C} s(e_i). This is impossible, as another<br />
edge e_j ≠ e_i would have to exist with end node t(e_j) = t(e_i) ∈ C, since<br />
(s(e_i), t(e_i)) is not part of that path.<br />
For this reason, t(e_i) →_B t(e_{i−1}) for i = 2, ..., k, and t(e_1) →_B t(e_k).<br />
Hence, B contains a cycle t(e_k) →_B t(e_{k−1}) →_B ... →_B t(e_1) →_B t(e_k),<br />
a contradiction.<br />
Critical Graphs (6)<br />
Corollary 16<br />
For a branching B and a cycle C, the following holds: either some edge e ∈ C \ B<br />
is feasible w.r.t. B, or B contains all edges of C but one.<br />
Theorem 17<br />
Let H be a critical graph. Then there is a branching B of maximum weight<br />
such that |C \ B| = 1 holds for every cycle C ⊆ H.<br />
Proof. Let B be a maximum branching that contains as many edges as possible from H.<br />
Consider a critical edge e ∈ H \ B:<br />
If there is no e′ ∈ B with t(e) = t(e′) and B ∪ {e} were a branching, then (as c(e) > 0)<br />
B would not be maximum; hence B ∪ {e} contains a cycle, so e is not feasible.<br />
If there is an e′ ∈ B with t(e) = t(e′): if e were feasible, replacing e′ with e would yield a<br />
maximum branching B′ (as c(e) ≥ c(e′)) with more edges from H than B has. This contradicts<br />
the choice of B. It follows that there is no feasible edge e ∈ H \ B.<br />
In particular, for any cycle C ⊆ H, no edge e ∈ C \ B is feasible. With Lemma 15, it<br />
follows that |C \ B| = 1 for any cycle C in H.<br />
Critical Graphs (7)<br />
In the following, we denote by C_1, C_2, ..., C_k the (node-disjoint) cycles of the<br />
critical graph H. From each of these cycles, we choose an edge of minimal<br />
weight: a_1, a_2, ..., a_k. By V_i = V(C_i), we denote the nodes of cycle C_i.<br />
Corollary 18<br />
Let D = (V , A) be a digraph with edge weights c_e, and H a critical graph with<br />
cycles C_1, ..., C_k. Then there is a maximum branching B such that:<br />
(a) |C_i \ B| = 1, for all i = 1, ..., k.<br />
(b) If t(e) ∉ V_i for all edges e ∈ B \ C_i, then C_i \ B = {a_i}.<br />
Proof. (a) holds by Theorem 17. Among the maximum branchings satisfying (a), let B be<br />
one containing the minimum number of the edges a_i. We show that (b) follows:<br />
If (b) does not hold, there is a cycle C_i with t(e) ∉ V_i for all edges e ∈ B \ C_i,<br />
but a_i ∈ B. Then a_i could be exchanged for the missing edge of C_i: since a_i has<br />
minimum weight in C_i, this preserves maximality and reduces the number of edges a_i<br />
in B, a contradiction.<br />
Critical Cycles 4 → 6 → 5 → 4, <strong>and</strong> 0 → 1 → 0<br />
[Figure: the example digraph from above; the critical subgraph contains the<br />
cycles 4 → 6 → 5 → 4 and 0 → 1 → 0.]<br />
Shrinking critical cycles<br />
Let H be the critical graph of D with cycles C 1 , C 2 , ... , C k .<br />
Let Ṽ be the set of nodes outside any cycle.<br />
For any v ∈ V i , let ẽ(v) be the edge within C i with end node v.<br />
We construct a new instance ¯D = (¯V , Ā) with functions ¯s, ¯t, and ¯c as<br />
follows:<br />
¯V := Ṽ ∪ {w_1, w_2, ..., w_k}, where the w_i are pseudo-nodes replacing the cycles C_i.<br />
Ā := A \ ⋃_{i=1}^{k} A(V(C_i)) (all edges except those inside a cycle).<br />
Edges are redirected and reweighted according to the following rules:<br />
¯s(e) := s(e), if s(e) ∈ Ṽ ; w_i, if s(e) ∈ V_i.<br />
¯t(e) := t(e), if t(e) ∈ Ṽ ; w_i, if t(e) ∈ V_i.<br />
¯c(e) := c(e), if t(e) ∈ Ṽ ; c(e) − c(ẽ(t(e))) + c(a_i), if t(e) ∈ V_i.<br />
Shrinking Example<br />
[Figure: example digraph before and after shrinking; the cycles C_1 (blue) and<br />
C_2 (red) are replaced by pseudo-nodes W1 and W2.]<br />
There are two critical cycles: C 1 (blue), <strong>and</strong> C 2 (red).<br />
Here, a_1 is the edge (4, 3), and a_2 the edge (2, 0). Let e be the edge (2, 1).<br />
Then ¯c(e) := c(e) − c(ẽ(t(e))) + c(a_1) = 3 − c(ẽ(1)) + 4 = 3 − 5 + 4 = 2.<br />
Branching correspondence<br />
Let B be the set of branchings in the original graph D that satisfy (a) and (b)<br />
from Corollary 18. That is, these branchings contain all edges of each critical<br />
cycle but one; if possible, the minimum-weight edge a_i is the missing one. These<br />
branchings are not necessarily optimal. For example, every branching in a DAG<br />
satisfies (a) and (b).<br />
Theorem 19<br />
There is a bijective transformation between B and the set of branchings in ¯D.<br />
In particular, a branching B ∈ B corresponds to a branching ¯B := B ∩ Ā in ¯D,<br />
and it holds that<br />
c(B) − ¯c(¯B) = ∑_{i=1}^{k} ( c(C_i) − c(a_i) ).<br />
Therefore, an optimal branching in ¯D is also optimal in D.<br />
Edmonds’ Algorithm (1967)<br />
Edmonds’ fundamental idea<br />
Shrink critical cycles <strong>and</strong> adjust the edge weights in the above manner<br />
until the graph is acyclic.<br />
Choose branching edges and expand the pseudo-nodes back at the next higher<br />
level.<br />
Never revise the choice of edges made at a lower level; instead, choose<br />
edges within critical cycles consistently with that choice.<br />
Edmonds(D, c)<br />
1. Compute a critical graph H ⊆ D.<br />
2. If H is acyclic, return H (H is an optimal branching).<br />
3. Shrink the cycles in H to obtain ¯D <strong>and</strong> ¯c.<br />
4. Recursively compute an opt. branching ¯B for ¯D by calling Edmonds( ¯D, ¯c)<br />
5. Exp<strong>and</strong> ¯B to an optimal branching B for D.<br />
An efficient implementation is not trivial. There are O(mn), O(m log n), <strong>and</strong><br />
O(m + n log n) implementations.<br />
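The five steps above can be sketched recursively in Python. This sketch shrinks one critical cycle per call rather than all cycles at once, and makes no attempt at the bookkeeping behind the efficient implementations; function names and the `(s, t, c)` edge format are illustrative assumptions.

```python
def max_branching(nodes, edges):
    """Recursive sketch of Edmonds' algorithm for a maximum-weight branching.

    nodes: iterable of node labels; edges: list of (s, t, c) arcs.
    Returns a list of chosen arcs from the original edge set.
    """
    nodes = list(nodes)
    # 1. critical graph: best positive incoming arc per node
    best = {}
    for e in edges:
        s, t, c = e
        if c > 0 and s != t and (t not in best or c > best[t][2]):
            best[t] = e
    H = list(best.values())

    # 2. look for a cycle in H (every node has in-degree <= 1 there)
    def find_cycle():
        color = {v: 0 for v in nodes}       # 0 = new, 1 = on path, 2 = done
        for start in nodes:
            v, path = start, []
            while color.get(v, 2) == 0:
                color[v] = 1
                path.append(v)
                v = best[v][0] if v in best else None
                if v is None:
                    break
                if color.get(v) == 1:       # closed a cycle
                    return path[path.index(v):]
            for u in path:
                color[u] = 2
        return None

    cycle = find_cycle()
    if cycle is None:
        return H                            # acyclic critical graph is optimal

    # 3. shrink the cycle into a pseudo-node w, adjusting the weights
    in_cycle = set(cycle)
    cyc_edges = [best[v] for v in cycle]    # the cycle arc entering each node
    a = min(cyc_edges, key=lambda e: e[2])  # minimum-weight cycle arc a_i
    w = object()                            # fresh pseudo-node
    new_nodes = [v for v in nodes if v not in in_cycle] + [w]
    new_edges, origin = [], {}
    for e in edges:
        s, t, c = e
        if s in in_cycle and t in in_cycle:
            continue                        # arc inside the cycle: dropped
        ns = w if s in in_cycle else s
        nt = w if t in in_cycle else t
        nc = c if t not in in_cycle else c - best[t][2] + a[2]
        ne = (ns, nt, nc)
        if ne not in origin:
            new_edges.append(ne)
            origin[ne] = e

    # 4./5. recurse, then expand the pseudo-node
    B_bar = max_branching(new_nodes, new_edges)
    B = [origin[e] for e in B_bar]
    entering = [origin[e] for e in B_bar if e[1] is w]
    if entering:
        # keep all cycle arcs except the one ending where the outside arc enters
        t_in = entering[0][1]
        B += [e for e in cyc_edges if e[1] != t_in]
    else:
        B += [e for e in cyc_edges if e != a]   # drop the cheapest arc a_i
    return B
```

The expansion step mirrors Corollary 18: if no branching arc enters the pseudo-node, the minimum-weight cycle arc a_i is the one left out.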
Example Phase 1: Shrink<br />
[Figure: three stages of shrinking the example graph; critical cycles are<br />
contracted to pseudo-nodes W1, W2, W3, and finally W4, with edge weights<br />
adjusted as described above.]<br />
Example Phase 2: Exp<strong>and</strong><br />
[Figure: the pseudo-nodes W4, W3, W2, W1 are expanded in reverse order,<br />
yielding an optimal branching in the original graph.]<br />
Concluding Annotations<br />
Regarding parallel edges in the shrunk graph, only the one of highest<br />
weight is relevant.<br />
Edges of zero or negative weight may be eliminated immediately.<br />
During expansion, edges chosen at a lower level are always<br />
kept (cf. Theorems 17 and 19).<br />
Application: Edmonds’ algorithm might be used to compute<br />
1-arborescences: Analogously to the 1-tree lower bound for the TSP, these<br />
might be used as lower bounds for the asymmetric TSP (ATSP).<br />
Selected Literature<br />
Edmonds, J.: Optimum <strong>Branchings</strong>, J. Res. Nat. Bur. St<strong>and</strong>ards, vol.<br />
71B, 1967, 233–240.<br />
Tarjan, R. E.: Finding Optimum <strong>Branchings</strong>, Networks, v.7, 1977, 25–35.<br />
Karger, D. R., Klein, P. N., <strong>and</strong> Tarjan, R. E.: A r<strong>and</strong>omized linear-time<br />
algorithm to find minimum spanning trees, JACM 42, 2 (Mar. 1995),<br />
321–328.<br />
Chazelle, B.: A minimum spanning tree algorithm with inverse-Ackermann<br />
type complexity, JACM 47, 6 (Nov. 2000), 1028–1047.<br />
Pettie, S. <strong>and</strong> Ramach<strong>and</strong>ran, V.: An optimal minimum spanning tree<br />
algorithm, JACM 49, 1 (Jan. 2002), 16–34.<br />
Bader, D. A. <strong>and</strong> Cong, G.: Fast shared-memory algorithms for<br />
computing the minimum spanning forest of sparse graphs, J. Parallel<br />
Distrib. Comput. 66, 11 (Nov. 2006), 1366–1378.<br />