Minimum Spanning Trees


Minimum Spanning Trees
• Problem Description
• Why Compute MST?
• MST in Unweighted Graphs
• MST in Weighted Graphs
  – Kruskal’s Algorithm
  – Prim’s Algorithm

Spanning Tree: Definition
• A spanning tree = a subset of edges from a connected graph that:
  – touches all vertices in the graph (spans the graph)
  – forms a tree (is connected and contains no cycles)
• Minimum Spanning Tree: a spanning tree with the least total edge cost

[Figure: a weighted graph on vertices A, B, C, D and three of its spanning trees]


MST Problem
• Given a weighted, undirected graph G = (V, E), compute the minimum-cost spanning tree
  – The MST may not be unique (unless all edge weights are distinct)

[Figure: a weighted graph on vertices a–i with two different minimum spanning trees, each of total cost 37]

Why Compute MST?
• Minimize the length of gas pipelines between cities
• Find the cheapest way to wire a house (with minimum cable)
• Find a way to connect routers on a network so that total delay is minimized
• Eliminate loops in a switched LAN so that broadcast packets will not circle around indefinitely

Some Basic Facts about Free Trees
• Notice that an MST is a free tree, so the following facts apply to it:
• A free tree with n vertices has exactly n-1 edges
• There exists a unique path between any two vertices of a free tree
• Adding any edge to a free tree creates a unique cycle; breaking any edge on this cycle restores a free tree

Computing MST – Unweighted Graphs
• What if the graph is unweighted, or all edge weights are equal?
  – Simply run BFS or DFS; the resulting tree is an MST (see the sketch below)

[Figure: BFS(A) and DFS(A) spanning trees of a small graph on vertices A–E]
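To make the unweighted case concrete, here is a minimal sketch of extracting a spanning tree with BFS. It is not taken from the slides; the adjacency-list representation, 0-based vertex numbering, and the example graph in main are assumptions made for illustration.

#include <cstdio>
#include <queue>
#include <vector>

// BFS from a root: every edge (parent[v], v) used to discover a new vertex is a
// tree edge. In an unweighted connected graph this tree is an MST, since every
// spanning tree has exactly n-1 edges of equal weight.
std::vector<int> bfsSpanningTree(const std::vector<std::vector<int>>& adj, int root) {
    std::vector<int> parent(adj.size(), -1);
    std::vector<bool> visited(adj.size(), false);
    std::queue<int> q;
    visited[root] = true;
    q.push(root);
    while (!q.empty()) {
        int u = q.front(); q.pop();
        for (int v : adj[u]) {
            if (!visited[v]) {
                visited[v] = true;
                parent[v] = u;                       // tree edge (u, v)
                q.push(v);
            }
        }
    }
    return parent;                                   // encodes the spanning tree rooted at 'root'
}

int main() {
    // Small undirected example graph: edges 0-1, 0-2, 1-3, 2-3, 3-4
    std::vector<std::vector<int>> adj = {{1, 2}, {0, 3}, {0, 3}, {1, 2, 4}, {3}};
    std::vector<int> parent = bfsSpanningTree(adj, 0);
    for (int v = 1; v < (int)parent.size(); ++v)
        std::printf("tree edge (%d, %d)\n", parent[v], v);
    return 0;
}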

Computing MST – Weighted Graphs
• We will present two greedy algorithms for computing MSTs in weighted graphs
  – Kruskal’s Algorithm
  – Prim’s Algorithm
• A greedy algorithm
  – always makes the choice that currently seems best
  – is short-sighted: no consideration of long-term or global issues
  – locally optimal does not always mean globally optimal, but the greedy approach works in some cases, e.g., MST, shortest paths, Huffman coding, …

Greedy MST Algorithms
• Let G = (V, E) be an undirected, connected graph whose edges have numeric weights, which may be positive, negative, or zero
• The intuition behind the greedy algorithms is simple: maintain a subset of edges A, initially empty, and add edges to A one by one until A equals the MST

1. A ← ∅
2. One by one, add a “safe” edge to A, until A equals the MST

When is an Edge Safe?
• Let S be a subset of the vertices, S ⊆ V
• A cut (S, V-S) is just a bipartition of the vertices into two disjoint subsets
• An edge (u, v) crosses the cut if one endpoint is in S and the other is in V-S (a tiny code check of this condition is sketched below)

[Figure: the example graph with a cut (S, V-S); edges with one endpoint on each side cross the cut]
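To make the definition concrete, here is a tiny sketch of the crossing test. It is not from the slides; representing S as a boolean vector over 0-based vertex numbers and the example cut in main are assumptions made for illustration.

#include <cstdio>
#include <vector>

struct Edge { int u, v; };

// An edge (u, v) crosses the cut (S, V-S) iff exactly one endpoint lies in S.
// Here inS[x] == true means vertex x belongs to S.
bool crossesCut(const Edge& e, const std::vector<bool>& inS) {
    return inS[e.u] != inS[e.v];
}

int main() {
    std::vector<bool> inS = {true, true, false, false};   // S = {0, 1}, V-S = {2, 3}
    std::vector<Edge> edges = {{0, 1}, {1, 2}, {2, 3}, {0, 3}};
    for (const Edge& e : edges)
        std::printf("(%d, %d) %s the cut\n", e.u, e.v,
                    crossesCut(e, inS) ? "crosses" : "does not cross");
    return 0;
}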

MST Lemma
• Let G = (V, E) be a connected, undirected graph with real-valued weights on the edges
• Let (S, V-S) be any cut
• Let (u, v) be an edge that crosses this cut and has the minimum weight, i.e., (u, v) is the light edge
• Then edge (u, v) is a safe edge

[Figure: the example graph with a cut (S, V-S) and the light edge crossing it]

MST Lemma: Proof
• Let us assume that all edge weights are distinct, and let T denote the MST
  – If T contains (u, v), we are done
  – If not, there must be another edge (x, y) that crosses the cut and is part of the MST
  – Let us now add (u, v) to T, thus creating a cycle
  – Now remove (x, y); we get another spanning tree, call it T'

[Figure: T + (u, v) creates a cycle; T' = T - (x, y) + (u, v) is again a spanning tree]

MST Lemma: Proof (continued)
• We have w(T') = w(T) - w(x, y) + w(u, v)
• Since (u, v) is the lightest edge crossing the cut, we have w(u, v) < w(x, y); thus w(T') < w(T), contradicting the assumption that T was an MST (the inequality is restated in display form below)
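The exchange step can be summarized as one inequality chain; this merely restates the slide's equation, and the strict inequality uses the distinct-weights assumption made at the start of the proof:

\[
w(T') \;=\; w(T) - w(x, y) + w(u, v) \;<\; w(T),
\qquad\text{since } w(u, v) < w(x, y).
\]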

Kruskal’s Algorithm
• Kruskal’s Algorithm works by attempting to add edges to A in increasing order of weight (lightest edges first)
• If the next edge does not induce a cycle among the current set of edges, it is added to A
• If it does, the edge is passed over, and we consider the next edge in order

Generic MST Algorithm
1. A ← ∅
2. One by one, add a “safe” edge to A, until A equals the MST

Kruskal’s Algorithm

Kruskal(G = (V, E)) {
    A = {};                                        // Initially A is empty
    Sort E in increasing order by weight w;
    for each ((u, v) from the sorted list) {
        if (adding (u, v) does not induce a cycle in A) {
            Add (u, v) to A;
        } //end-if
    } //end-for
    return A;                                      // A is the MST
} //end-Kruskal

Kruskal’s Algorithm: Example

[Figure: the example graph; edges are considered in sorted order: (h, g), (i, c), (g, f), (a, b), (c, f), (i, g), (i, h), (c, d), (a, h), (b, c), (d, e), (e, f), (b, h), (d, f)]

Kruskal’s Algorithm: Example
• As this algorithm runs, the edges of A induce a forest on the vertices
• As the algorithm continues, the trees of this forest are merged together, until we have a single tree containing all the vertices

[Figure: the example graph with the partially built forest induced by A]

Kruskal’s Algorithm: Correctness
• Observe that this strategy leads to a correct algorithm. Why?
  – Consider the edge (u, v) that Kruskal’s algorithm seeks to add next, and suppose that this edge does not induce a cycle in A
  – Let A' denote the tree of the forest A that contains vertex u
  – Consider the cut (A', V-A')
  – No edge of A crosses this cut, and (u, v) is the light edge across the cut (because any lighter edge would have been considered earlier by the algorithm)
  – Thus, by the MST Lemma, (u, v) is safe

Kruskal’s Algorithm: Implementation

Kruskal(G = (V, E)) {
    A = {};                                        // Initially A is empty
    Sort E in increasing order by weight w;
    for each ((u, v) from the sorted list) {
        if (adding (u, v) does not induce a cycle in A) {
            Add (u, v) to A;
        } //end-if
    } //end-for
    return A;                                      // A is the MST
} //end-Kruskal

• The only tricky part of the algorithm is detecting whether the addition of an edge will create a cycle in A. How do we detect the cycle?

Kruskal’s Algorithm: Implementation
• Cycle detection can be done with the disjoint-set (Union-Find) data structure, which supports 3 operations:
  – CreateSet(u): create a set containing the single item u
  – Find(u): find the set that contains a given item u
  – Union(u, v): merge the set containing u and the set containing v into a common set
• It is sufficient to know that each of these operations can be performed in O(logn) time
• In fact, there are faster implementations
• A sketch of such a structure is given below
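Here is a minimal Union-Find sketch over vertices numbered 0..n-1. It is not from the slides; the path-compression and union-by-rank heuristics are implementation choices (they are what give the "faster implementations" mentioned above).

#include <numeric>
#include <utility>
#include <vector>

// Disjoint-set (Union-Find) over items 0..n-1.
struct DisjointSets {
    std::vector<int> parent, rank_;

    explicit DisjointSets(int n) : parent(n), rank_(n, 0) {
        std::iota(parent.begin(), parent.end(), 0);  // CreateSet for every item
    }
    int find(int u) {                                // Find with path compression
        if (parent[u] != u) parent[u] = find(parent[u]);
        return parent[u];
    }
    bool unite(int u, int v) {                       // Union by rank
        u = find(u); v = find(v);
        if (u == v) return false;                    // already in the same set
        if (rank_[u] < rank_[v]) std::swap(u, v);
        parent[v] = u;
        if (rank_[u] == rank_[v]) ++rank_[u];
        return true;
    }
};

Here find corresponds to Find and unite corresponds to Union; unite also reports whether a merge actually happened, i.e., whether the edge joins two different trees of the forest.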

Kruskal’s Algorithm: Implementation
• In Kruskal’s Algorithm, the vertices of the graph are the elements stored in the sets
• Each set is the vertex set of one tree of the forest A
• The set A itself can be stored as a simple list of edges

Kruskal’s Algorithm: Final Version

Kruskal(G = (V, E)) {
    A = {};                                        // Initially A is empty
    for each (u in V)
        CreateSet(u);                              // Create a set for each vertex
    Sort E in increasing order by weight w;        // O(eloge); e <= n^2, so loge <= 2logn
    for each ((u, v) from the sorted list) {       // O(elogn)
        if (Find(u) != Find(v)) {                  // if u and v are in different trees
            Add (u, v) to A;
            Union(u, v);
        } //end-if
    } //end-for
    return A;
} //end-Kruskal

• A runnable sketch of this algorithm is given below
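The following is a minimal, self-contained C++ sketch that follows the final pseudocode above. The Edge struct, the loop-based find with path halving, and the example graph in main are assumptions made for illustration; a real implementation could reuse the Union-Find structure sketched earlier.

#include <algorithm>
#include <cstdio>
#include <numeric>
#include <vector>

struct Edge { int u, v, w; };

// Kruskal's algorithm: sort edges by weight, then add an edge whenever its
// endpoints lie in different trees of the current forest (Union-Find check).
std::vector<Edge> kruskalMST(int n, std::vector<Edge> edges) {
    std::sort(edges.begin(), edges.end(),
              [](const Edge& a, const Edge& b) { return a.w < b.w; });
    std::vector<int> parent(n);
    std::iota(parent.begin(), parent.end(), 0);      // CreateSet for every vertex
    auto find = [&](int x) {                         // Find with path halving
        while (parent[x] != x) {
            parent[x] = parent[parent[x]];
            x = parent[x];
        }
        return x;
    };
    std::vector<Edge> mst;                           // the set A, stored as a list of edges
    for (const Edge& e : edges) {
        int ru = find(e.u), rv = find(e.v);
        if (ru != rv) {                              // no cycle: endpoints in different trees
            parent[ru] = rv;                         // Union the two trees
            mst.push_back(e);
        }
    }
    return mst;                                      // |mst| == n-1 if the graph is connected
}

int main() {
    // Tiny example graph, made up for illustration: 4 vertices, 5 weighted edges.
    std::vector<Edge> edges = {{0, 1, 2}, {1, 2, 5}, {0, 2, 9}, {2, 3, 4}, {1, 3, 7}};
    for (const Edge& e : kruskalMST(4, edges))
        std::printf("(%d, %d) weight %d\n", e.u, e.v, e.w);
    return 0;
}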

Kruskal’s Algorithm: Example

[Figure: step-by-step run of Kruskal’s algorithm on the example graph; the forest grows as edges are accepted from the sorted list (h, g), (i, c), (g, f), (a, b), (c, f), (i, g), (i, h), (c, d), (a, h), (b, c), (d, e), (e, f), (b, h), (d, f), and edges that would create a cycle are skipped]

Prim’s Algorithm
• Prim’s Algorithm is another greedy algorithm for MST
• It differs from Kruskal’s Algorithm only in how it selects the next “safe edge” to add at each step

Why Study Prim’s Algorithm?
• There are 2 reasons to study Prim’s Algorithm
  – To show that there is more than one way to solve a problem
    • An important lesson to learn in algorithm design
  – Prim’s algorithm looks very much like another greedy algorithm, Dijkstra’s Algorithm, which is used to compute shortest paths
    • Thus Prim’s algorithm is not only a different way to solve the same problem, it is also the same way to solve a different problem

Prim’s Algorithm: Pseudocode

Prim(G = (V, E)) {
    Start with a root vertex "r" (any vertex in the graph)
    A = {r};
    for (i = 1; i <= n-1; i++) {
        1. Consider the cut (A, V-A)
        2. Let (u, v) be the light edge that crosses the cut, such that u ∈ A and v ∈ V-A
        3. Add v to A, i.e., A = A ∪ {v};
    } //end-for
    return A;                                      // A is the MST
} //end-Prim

Prim’s Algorithm: Growing the Tree

[Figure: the current tree vertices A and the fringe edges crossing the cut, before and after vertex u is added]

• Observe that we consider the set of vertices A currently part of the tree, and its complement V-A
• This gives a cut of the graph, (A, V-A)
  – Which edge should we add next?
  – The MST Lemma tells us that it is safe to add the light edge

Prim’s Algorithm: Example

[Figure: the example weighted graph on which Prim’s algorithm will be traced]

Prim’s Algorithm: Implementation

Prim(G = (V, E)) {
    A = {r};
    for (i = 1; i <= n-1; i++) {
        1. Consider the cut (A, V-A)
        2. Let (u, v) be the light edge that crosses the cut, such that u ∈ A and v ∈ V-A
        3. Add v to A, i.e., A = A ∪ {v};
    } //end-for
    return A;                                      // A is the MST
} //end-Prim

• The key questions for an efficient implementation of Prim’s algorithm are
  – how to update the cut efficiently
  – how to determine the light edge quickly

Prim’s Algorithm: Implementation - I

Prim(G = (V, E)) {
    for all u in V do
        color[u] = white;
    color[r] = black;
    for (i = 1; i <= n-1; i++) {                   // n-1 iterations
        min = INFINITY;                            // cost of the light edge
        (x, y) = (?, ?);                           // light edge
        for all (u, v) in E do {                   // O(e) per iteration
            if (color[u] == black && color[v] == white && w(u, v) < min) {
                (x, y) = (u, v);                   // (u, v) is the current light edge
                min = w(u, v);
            } //end-if
        } //end-for
        Add (x, y) to A;                           // record the tree edge
        color[y] = black;                          // Add y to A
    } //end-for
    return A;                                      // A is the MST
} //end-Prim

• Running time: the inner scan over all edges costs O(e) and is repeated n-1 times, so the total is O(n·e)
• A runnable sketch of this O(n·e) version is given below
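A minimal C++ sketch of this O(n·e) version follows. It is not from the slides; the edge-list representation, the inTree flags standing in for the black/white coloring, and the example graph in main are assumptions made for illustration.

#include <cstdio>
#include <limits>
#include <vector>

struct Edge { int u, v, w; };

// Naive Prim: in each of the n-1 rounds, scan every edge to find the lightest
// one crossing the cut (inTree == black, !inTree == white). Total cost O(n*e).
std::vector<Edge> primNaive(int n, const std::vector<Edge>& edges, int r) {
    std::vector<bool> inTree(n, false);
    inTree[r] = true;
    std::vector<Edge> mst;
    for (int i = 0; i < n - 1; ++i) {
        int best = -1, min = std::numeric_limits<int>::max();
        for (int j = 0; j < (int)edges.size(); ++j) {
            bool crosses = inTree[edges[j].u] != inTree[edges[j].v];  // exactly one endpoint in tree
            if (crosses && edges[j].w < min) { min = edges[j].w; best = j; }
        }
        if (best < 0) break;                         // graph is not connected
        inTree[edges[best].u] = inTree[edges[best].v] = true;  // the white endpoint joins A
        mst.push_back(edges[best]);
    }
    return mst;
}

int main() {
    // Same tiny example graph as in the Kruskal sketch.
    std::vector<Edge> edges = {{0, 1, 2}, {1, 2, 5}, {0, 2, 9}, {2, 3, 4}, {1, 3, 7}};
    for (const Edge& e : primNaive(4, edges, 0))
        std::printf("(%d, %d) weight %d\n", e.u, e.v, e.w);
    return 0;
}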

Prim’s Algorithm: Implementation - II
• For a faster implementation, we will make use of your favorite data structure, the priority queue (heap)
  – Recall that a heap stores a set of items, where each item is associated with a key value, and supports 3 operations (each can be implemented in O(logn)):
  – Insert(Q, u, key): insert u with key value key into Q
  – u = Extract_Min(Q): extract the item with the minimum key value in Q
  – Decrease_Key(Q, u, new_key): decrease u’s key value to new_key

Prim’s Algorithm: Implementation
• What do we store in the priority queue? The idea is the following:
  – For each vertex u in V-A (i.e., not part of the current spanning tree), we associate u with a key value key[u], which is the weight of the lightest edge going from u to any vertex in A
  – We also store in pred[u] the other end of this edge in A
    • If there is no edge from u to a vertex in A, we set its key value to +infinity
  – We also need to know which vertices are in A
    • We do this by coloring the vertices in A black
  – Here is the algorithm…

Prim’s Algorithm: Implementation

Prim(G, w, r) {
    for each (u in V) {                            // Initialization
        key[u] = +infinity;
        color[u] = white;
    } //end-for
    key[r] = 0;                                    // Start at root
    pred[r] = nil;
    Q = build initial queue with all vertices;
    while (Non_Empty(Q)) {                         // n iterations: until all vertices are in the MST
        u = Extract_Min(Q);                        // vertex with the lightest edge; O(logn)
        for each (v in Adj[u]) {                   // O(e) iterations in total over the whole run
            if (color[v] == white && w(u, v) < key[v]) {
                key[v] = w(u, v);                  // new lighter edge out of v
                Decrease_Key(Q, v, key[v]);        // O(logn)
                pred[v] = u;
            } //end-if
        } //end-for
        color[u] = black;
    } //end-while
    // The pred pointers define the MST as an inverted tree rooted at r
} //end-Prim

• Running time: O(nlogn + elogn) = O((n + e)logn)
• A runnable sketch using a binary heap is given below
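Below is a minimal, self-contained C++17 sketch of this algorithm. std::priority_queue has no Decrease_Key, so the sketch uses the common lazy-insertion workaround: push a fresh (key, vertex) pair on every decrease and skip stale entries at extraction. This deviates from the pseudocode but keeps the O((n + e)logn) bound. The adjacency-list representation and the example graph in main are assumptions made for illustration.

#include <cstdio>
#include <functional>
#include <queue>
#include <utility>
#include <vector>

const int INF = 1000000000;

// Prim's algorithm with a binary heap. key[v] is the weight of the lightest
// known edge from v into the tree; pred[v] is the other endpoint of that edge.
std::vector<int> primMST(const std::vector<std::vector<std::pair<int,int>>>& adj, int r) {
    int n = (int)adj.size();
    std::vector<int> key(n, INF), pred(n, -1);
    std::vector<bool> inTree(n, false);
    using Item = std::pair<int,int>;                       // (key, vertex)
    std::priority_queue<Item, std::vector<Item>, std::greater<Item>> q;
    key[r] = 0;
    q.push({0, r});
    while (!q.empty()) {
        auto [k, u] = q.top(); q.pop();
        if (inTree[u] || k != key[u]) continue;            // stale entry: skip (lazy deletion)
        inTree[u] = true;                                  // color u black
        for (auto [v, w] : adj[u]) {
            if (!inTree[v] && w < key[v]) {
                key[v] = w;                                // new lighter edge out of v
                pred[v] = u;
                q.push({key[v], v});                       // stands in for Decrease_Key
            }
        }
    }
    return pred;                                           // pred defines the MST rooted at r
}

int main() {
    // Tiny undirected example graph, made up for illustration:
    // edges (0,1,4), (0,2,8), (1,2,2), (2,3,7)
    std::vector<std::vector<std::pair<int,int>>> adj(4);
    auto addEdge = [&](int u, int v, int w) {
        adj[u].push_back({v, w});
        adj[v].push_back({u, w});
    };
    addEdge(0, 1, 4); addEdge(0, 2, 8); addEdge(1, 2, 2); addEdge(2, 3, 7);
    std::vector<int> pred = primMST(adj, 0);
    for (int v = 1; v < 4; ++v)
        std::printf("tree edge (%d, %d)\n", pred[v], v);
    return 0;
}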

Prim’s Algorithm: Example

[Figure: step-by-step run of Prim’s algorithm on the example graph; in each step the key values of the fringe vertices are updated and the vertex with the minimum key joins the tree]

• The pred pointers define the MST as an inverted tree rooted at r