
Graph Sparsifiers by Edge-Connectivity and Random Spanning Trees

Nick Harvey
U. Waterloo, Department of Combinatorics and Optimization

Joint work with Isaac Fung

• What is the max flow from s to t?

[Figure: an example graph with edge capacities (values 3 to 7) and designated vertices s and t]

• What is the max flow from s to t?
• The answer in this graph is the same: it's a Gomory-Hu tree.
• What is the capacity of all edges incident on u?

[Figure: the Gomory-Hu tree of the previous graph, with edge capacities 10 and 15 and vertices s, t, u]

Can any dense graph be "approximated" by a sparse graph?

• Approximating pairwise distances
  – Spanners: number of edges = O(n^{1+2/α}), every distance approximated to within α. [ADDJS'93,...,P'09]
  – Low-stretch trees: number of edges = n−1, "most" distances approximated to within log n. [FRT'04]

• Approximating all cuts
  – Sparsifiers: number of edges = O(n log n / ε²), every cut approximated to within 1+ε. [BK'96, BSS'09]

• Spectral approximation
  – Spectral sparsifiers: number of edges = O(n log n / ε²), entire spectrum approximated to within 1+ε. [SS'08, BSS'09]

n = # vertices

What is the point of all this?
• Approximating pairwise distances
  – Spanners:
    • Network routing
    • Motion planning
    • Etc.
  – Low stretch / congestion trees:
    • Approximating metrics by simpler metrics
    • Approximation algorithms
    • Online algorithms

What is the point of all this?
• Approximating all cuts
  – Sparsifiers: fast algorithms for cut/flow problems

Problem | Approximation | Runtime | Reference
Min st Cut | 1+ε | Õ(n²) | BK'96
Sparsest Cut | O(log n) | Õ(n²) | BK'96
Max st Flow | 1 | Õ(m+nv) | KL'02
Sparsest Cut | – | Õ(n²) | AHK'05
Sparsest Cut | O(log² n) | Õ(m+n^{3/2}) | KRV'06
Sparsest Cut | – | Õ(m+n^{3/2+ε}) | S'09
Perfect Matching in Regular Bip. Graphs | n/a | Õ(n^{1.5}) | GKK'09
Sparsest Cut | – | Õ(m+n^{1+ε}) | M'10

v = flow value, n = # vertices, m = # edges

What is the point of all this?
• Spectral approximation
  – Spectral sparsifiers: solving diagonally-dominant linear systems in nearly linear time!
  – Connections to metric embeddings, Banach spaces...

Problem | Runtime | Reference
Computing Fiedler Vector | Õ(m) | ST'04
Computing Effective Resistances | Õ(m) | SS'08
Sampling Random Spanning Trees | – | KM'09
Max st Flow | Õ(m^{4/3}) | CKMST'10
Min st Cut | Õ(m+n^{4/3}) | CKMST'10

Graph Sparsifiers: Formal problem statement
• Design an algorithm such that
  • Input: an undirected graph G=(V,E)
  • Output: a weighted subgraph H=(V,F,w), where F ⊆ E and w : F → ℝ
• Goals:
  • | |δ_G(U)| − w(δ_H(U)) | ≤ ε |δ_G(U)|   ∀ U ⊆ V
    (|δ_G(U)| = # edges between U and V\U in G; w(δ_H(U)) = weight of edges between U and V\U in H)
  • |F| = O(n log n / ε²)
  • Running time = Õ(m / ε²)

n = # vertices, m = # edges

Equivalently, dropping subscripts: | |δ(U)| − w(δ(U)) | ≤ ε |δ(U)|   ∀ U ⊆ V
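To make the guarantee concrete, here is a tiny brute-force checker of the cut condition (a sketch only; the function name and representation are illustrative, and it enumerates all subsets U, so it is only feasible for very small n):

```python
from itertools import combinations

def is_cut_sparsifier(G_edges, H_weights, vertices, eps):
    """Check | |delta_G(U)| - w(delta_H(U)) | <= eps * |delta_G(U)| for every proper subset U.
    G_edges: list of (u, v) pairs; H_weights: dict (u, v) -> weight; vertices: iterable."""
    vertices = list(vertices)
    for r in range(1, len(vertices)):
        for U in combinations(vertices, r):
            U = set(U)
            cut_G = sum(1 for u, v in G_edges if (u in U) != (v in U))
            cut_H = sum(w for (u, v), w in H_weights.items() if (u in U) != (v in U))
            if abs(cut_G - cut_H) > eps * cut_G:
                return False
    return True

# Example: a triangle "sparsified" to two edges of weight 1.5 violates the bound for eps = 0.4.
print(is_cut_sparsifier([(0, 1), (1, 2), (2, 0)], {(0, 1): 1.5, (1, 2): 1.5}, [0, 1, 2], 0.4))
```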

Why should sparsifiers exist?
• Example: G = complete graph K_n
• Sampling: construct H by sampling every edge of G with probability p = 100 log n / n
• Properties of H:
  • # sampled edges = O(n log n)
  • Standard fact: H is connected
  • Stronger fact: p|δ_G(U)| ≈ |δ_H(U)|  ∀ U ⊆ V
• Output:
  – H with each edge given weight 1/p
  – By the stronger fact, H is a sparsifier of G
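A minimal Python sketch of this uniform-sampling construction for K_n (the helper names and the spot check are illustrative, not from the talk):

```python
import math
import random

def uniform_sparsify_complete_graph(n, c=100):
    """Keep each edge of K_n independently with p = c*log(n)/n; kept edges get weight 1/p."""
    p = min(1.0, c * math.log(n) / n)
    H = {}
    for u in range(n):
        for v in range(u + 1, n):
            if random.random() < p:
                H[(u, v)] = 1.0 / p   # weight 1/p makes every cut correct in expectation
    return H, p

def cut_weight(H, U):
    """Total weight of edges of H crossing the cut (U, V \\ U)."""
    U = set(U)
    return sum(w for (u, v), w in H.items() if (u in U) != (v in U))

# Spot check: for |U| = k, the true cut in K_n has size k*(n-k).
n = 1000
H, p = uniform_sparsify_complete_graph(n)
U = range(300)
print(300 * (n - 300), round(cut_weight(H, U), 1))
```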

• Chernoff Bound: Let X_1, X_2, ... be {0,1} random variables. Let X = Σ_i X_i and let μ = E[X]. For any δ ∈ [0,1], Pr[ |X−μ| ≥ δμ ] ≤ 2 exp(−δ²μ / 3).

• Consider any cut δ_G(U) with |U| = k. Then |δ_G(U)| = k(n−k) ≥ kn/2 (taking k ≤ n/2 w.l.o.g.).
• Let X_e = 1 if edge e is sampled. Let X = Σ_{e ∈ δ_G(U)} X_e = |δ_H(U)|.
• Then μ = E[X] = p|δ_G(U)| ≥ 50 k log n.
• Say the cut fails if |X−μ| ≥ μ/2.
• So Pr[ cut fails ] ≤ 2 exp(−μ/12) ≤ n^{−4k}.
• # of cuts with |U| = k is at most (n choose k) ≤ n^k.
• So Pr[ any cut fails ] ≤ Σ_k n^k · n^{−4k} ≤ Σ_k n^{−3k} < n^{−2}.
• So, whp, every U has | |δ_H(U)| − p|δ_G(U)| | < p|δ_G(U)|/2.
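Spelling out the failure-probability step (a sketch of the arithmetic, using δ = 1/2 and μ ≥ 50 k log n as above; the final inequality needs n to be at least a moderate constant):

```latex
\Pr[\text{cut fails}]
  \;\le\; 2\exp\!\Big(-\tfrac{(1/2)^2\mu}{3}\Big)
  \;=\; 2\exp\!\Big(-\tfrac{\mu}{12}\Big)
  \;\le\; 2\exp\!\Big(-\tfrac{50k\log n}{12}\Big)
  \;=\; 2\,n^{-50k/12}
  \;\le\; n^{-4k}.
```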

Key Ingredients
• Chernoff bound
• Bound on # small cuts
• Union bound

Generalize to arbitrary G?
• Can't sample edges with the same probability!
• Idea [BK'96]: sample low-connectivity edges with high probability, and high-connectivity edges with low probability

[Figure annotations: "Keep this" (the low-connectivity edges), "Eliminate most of these" (the high-connectivity edges)]

Non-uniform sampling algorithm [BK'96]
• Input: graph G=(V,E), parameters p_e ∈ [0,1]
• Output: a weighted subgraph H=(V,F,w), where F ⊆ E and w : F → ℝ

  For i = 1 to ρ
    For each edge e ∈ E
      With probability p_e:
        Add e to F
        Increase w_e by 1/(ρ p_e)

• Main Question: Can we choose ρ and the p_e's to achieve the sparsification goals?
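A direct Python transcription of this sampling loop (a sketch; the probabilities p_e are taken as given, and the function name is mine):

```python
import random
from collections import defaultdict

def nonuniform_sample(edges, p, rho):
    """BK-style sampling: rho rounds, each edge e kept with probability p[e] per round;
    every kept copy adds 1/(rho * p[e]) to w_e.
    edges: iterable of hashable edge ids; p: dict edge -> probability in (0, 1]."""
    w = defaultdict(float)
    for _ in range(rho):
        for e in edges:
            if random.random() < p[e]:
                w[e] += 1.0 / (rho * p[e])
    return dict(w)   # the sparsifier H as a map edge -> weight
```

Each round contributes p_e · 1/(ρ p_e) = 1/ρ to w_e in expectation, so E[w_e] = 1, matching the claim on the next slide.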

Non-uniform sampling algorithm [BK'96]
• Claim: H perfectly approximates G in expectation!
  • For any e ∈ E, E[w_e] = 1
  ⇒ For every U ⊆ V, E[ w(δ_H(U)) ] = |δ_G(U)|
• Goal: show every w(δ_H(U)) is tightly concentrated
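Spelled out, the claim is a one-line check using linearity of expectation:

```latex
\mathbb{E}[w_e] \;=\; \rho \cdot p_e \cdot \frac{1}{\rho\, p_e} \;=\; 1,
\qquad\text{so}\qquad
\mathbb{E}\big[w(\delta_H(U))\big] \;=\; \sum_{e \in \delta_G(U)} \mathbb{E}[w_e] \;=\; |\delta_G(U)|.
```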


Prior Work (assume ε is constant)
• Benczur-Karger '96
  – Set ρ = O(log n), p_e = 1/"strength" of edge e
    (the max k such that e is contained in a k-edge-connected vertex-induced subgraph of G; similar to edge connectivity)
  – All cuts are preserved
  – Σ_e p_e ≤ n ⇒ |F| = O(n log n)
  – Running time is O(m log³ n)

• Spielman-Srivastava '08
  – Set ρ = O(log n), p_e = "effective resistance" of edge e
    (view G as an electrical network where each edge is a 1-ohm resistor)
  – H is a spectral sparsifier of G ⇒ all cuts are preserved
  – Σ_e p_e = n−1 ⇒ |F| = O(n log n)
  – Running time is O(m log^50 n), later O(m log³ n) [Koutis-Miller-Peng '10]
  – Uses matrix-valued concentration inequalities

Our Work (assume ε is constant)
• Fung-Harvey '10 (and independently Hariharan-Panigrahi '10)
  – Set ρ = O(log² n), p_e = 1/(edge connectivity of edge e)
    (the min size of a cut that contains e; a sampling sketch follows this slide)
  – All cuts are preserved
  – Σ_e p_e ≤ n ⇒ |F| = O(n log² n)
  – Running time is O(m log² n)
  – Advantages: edge connectivities are natural and easy to compute; faster than previous algorithms; also implies sampling by strength / effective resistances works
  – Disadvantages: extra log factor, no spectral sparsification

• Panigrahi '10
  – A sparsifier with O(n log n / ε²) edges, with running time O(m) in unweighted graphs and O(m) + Õ(n/ε²) in weighted graphs
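A small illustrative sketch of sampling by edge connectivities, not the paper's fast implementation: it uses the networkx library (an assumption of this sketch) and computes each k_e with one exact max-flow call, which is far slower than O(m log² n).

```python
import math
import random
import networkx as nx

def sample_by_edge_connectivity(G, rho=None):
    """Sample each edge with p_e = 1/k_e over rho rounds; kept edges get weight 1/(rho * p_e)."""
    n = G.number_of_nodes()
    if rho is None:
        rho = max(1, int(math.ceil(math.log(n) ** 2)))   # rho = O(log^2 n), constant omitted
    H = nx.Graph()
    H.add_nodes_from(G)
    for u, v in G.edges():
        k_e = nx.edge_connectivity(G, u, v)   # min size of a cut separating u and v
        p_e = 1.0 / k_e
        w = 0.0
        for _ in range(rho):
            if random.random() < p_e:
                w += 1.0 / (rho * p_e)
        if w > 0:
            H.add_edge(u, v, weight=w)
    return H

# Example: sparsify a random dense graph and compare one cut.
G = nx.gnp_random_graph(40, 0.5, seed=1)
H = sample_by_edge_connectivity(G)
U = set(range(20))
exact = sum(1 for u, v in G.edges() if (u in U) != (v in U))
approx = sum(d["weight"] for u, v, d in H.edges(data=True) if (u in U) != (v in U))
print(G.number_of_edges(), H.number_of_edges(), exact, round(approx, 1))
```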

Our Work (assume ε is constant)
• Alternative Algorithm
  – Let H be the union of ρ uniformly random spanning trees of G, where w_e is 1/(ρ · (effective resistance of e))
    (sampling a single random spanning tree is sketched below)
  – All cuts are preserved
  – |F| = O(n log² n)
  – Running time is Õ(m√n)

• Motivation
  – Perhaps using random spanning trees leads to a sparsifier with O(n) edges?

• Negative result
  – Ω(log n) spanning trees are needed
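For intuition, here is a compact sketch of Wilson's algorithm for sampling a uniformly random spanning tree by loop-erased random walks. This is a standard technique, not the talk's Õ(m√n) procedure, and it does not compute the effective-resistance weights.

```python
import random

def wilson_random_spanning_tree(adj, root=None):
    """Sample a uniformly random spanning tree of a connected graph via Wilson's algorithm.
    adj: dict vertex -> list of neighbours. Returns a set of (child, parent) tree edges."""
    vertices = list(adj)
    if root is None:
        root = vertices[0]
    in_tree = {root}
    parent = {}
    tree_edges = set()
    for v in vertices:
        # Random walk from v until the current tree is hit; overwriting parent[]
        # along the way performs the loop erasure implicitly.
        u = v
        while u not in in_tree:
            parent[u] = random.choice(adj[u])
            u = parent[u]
        # Retrace the loop-erased path from v and attach it to the tree.
        u = v
        while u not in in_tree:
            in_tree.add(u)
            tree_edges.add((u, parent[u]))
            u = parent[u]
    return tree_edges

# Example: the union of a few random spanning trees of a 4-cycle.
adj = {0: [1, 2], 1: [0, 3], 2: [0, 3], 3: [1, 2]}
print([wilson_random_spanning_tree(adj) for _ in range(3)])
```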

• Notation: k_uv = min size of a cut separating u and v
• Main ideas:
  – Partition edges into connectivity classes: E = E_1 ∪ E_2 ∪ ... ∪ E_{log n}, where E_i = { e : 2^{i−1} ≤ k_e < 2^i } (see the sketch after this list)
  – Prove that the weight of sampled edges that each cut takes from each connectivity class is about right
  – This yields a sparsifier
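The partition into connectivity classes is just dyadic bucketing of the k_e values; a minimal sketch (the edge connectivities are assumed to be given):

```python
def connectivity_classes(edge_conn):
    """Partition edges into classes E_i = { e : 2^(i-1) <= k_e < 2^i }.
    edge_conn: dict edge -> edge connectivity k_e (a positive integer)."""
    classes = {}
    for e, k in edge_conn.items():
        i = k.bit_length()   # smallest i with k < 2^i, i.e. 2^(i-1) <= k < 2^i
        classes.setdefault(i, set()).add(e)
    return classes

# Example with hypothetical connectivities: k_e = 1 lands in E_1; k_e = 2, 3 land in E_2.
print(connectivity_classes({("a", "b"): 1, ("b", "c"): 2, ("c", "a"): 3}))
```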

Prove that the weight of sampled edges that each cut takes from each connectivity class is about right.
• Notation:
  • C = δ(U) is a cut
  • C_i = δ(U) ∩ E_i is a cut-induced set
• Need to prove: for every cut-induced set C_i, its sampled weight is about right.

[Figure: a cut δ(U) partitioned into cut-induced sets C_1, C_2, C_3, C_4, one per connectivity class]

• Key Ingredients
  • Chernoff bound: prove that the failure probability for each cut-induced set is small
  • Bound on # small cuts: prove that #{ cut-induced sets C_i induced by a small cut C } is small
  • Union bound: the sum of failure probabilities is small, so probably there are no failures

Counting Small Cut-Induced Sets
• Theorem: Let G=(V,E) be a graph. Fix any B ⊆ E.
  Suppose k_e ≥ K for all e ∈ B. (k_uv = min size of a cut separating u and v)
  Then, for every α ≥ 1,  |{ δ(U) ∩ B : |δ(U)| ≤ αK }| < n^{2α}.

• Corollary (Counting Small Cuts [K'93]): Let G=(V,E) be a graph.
  Let K be the edge connectivity of G (i.e., the global min cut value).
  Then, for every α ≥ 1,  |{ δ(U) : |δ(U)| ≤ αK }| < n^{2α}.
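As a quick sanity check on the Corollary, a brute-force count on a small example (a sketch only; it enumerates all subsets, so only tiny n is feasible). On the 8-cycle, K = 2, and the cuts of size ≤ 2 are exactly the C(8,2) = 28 pairs of edges, comfortably below n^{2α} = 64 for α = 1.

```python
from itertools import combinations

def count_small_cuts(n_vertices, edges, bound):
    """Count distinct cuts delta(U) (as edge sets) of size <= bound."""
    cuts = set()
    vs = list(range(n_vertices))
    for r in range(1, n_vertices):
        for U in combinations(vs, r):
            U = set(U)
            cut = frozenset(e for e in edges if (e[0] in U) != (e[1] in U))
            if len(cut) <= bound:
                cuts.add(cut)
    return len(cuts)

n = 8
cycle = [(i, (i + 1) % n) for i in range(n)]
print(count_small_cuts(n, cycle, 2))   # 28 < 8^2 = 64
```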

Comparison
• Theorem: Let G=(V,E) be a graph. Fix any B ⊆ E.
  Suppose k_e ≥ K for all e ∈ B. (k_uv = min size of a cut separating u and v)
  Then |{ δ(U) ∩ B : |δ(U)| ≤ c }| < n^{2c/K}  ∀ c ≥ 1.

• Corollary [K'93]: Let G=(V,E) be a graph.
  Let K be the edge connectivity of G (i.e., the global min cut value).
  Then |{ δ(U) : |δ(U)| ≤ c }| < n^{2c/K}  ∀ c ≥ 1.

• How many cuts of size 1?
  The Theorem says < n², taking K = c = 1.
  The Corollary says < 1, because K = 0. (Slightly unfair)

Algorithm For Finding a Needle in a Haystack
• Input: a haystack
• Output: a needle (maybe)

• While the haystack is not too small:
  – Pick a random handful
  – Throw it away
• End While
• Output whatever is left

Algorithm for Finding a Min Cut [K'93]
• Input: a graph
• Output: a minimum cut (maybe)

• While the graph has > 2 vertices:        ("not too small")
  – Pick an edge at random                 ("random handful")
  – Contract it                            ("throw it away")
• End While
• Output the remaining edges

• Claim: for any min cut, this algorithm outputs it with probability ≥ 1/n².
• Corollary: there are ≤ n² min cuts.
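A minimal Python sketch of one run of this contraction procedure on an edge list (illustrative only; repeating the run and keeping the best result recovers a true min cut with high probability):

```python
import random

def karger_min_cut_candidate(edges, vertices):
    """One run of random contraction. edges: list of (u, v) pairs (parallel edges allowed).
    Returns the edges crossing the cut defined by the final two super-vertices."""
    parent = {v: v for v in vertices}   # union-find: original vertex -> super-vertex

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    remaining = len(parent)
    while remaining > 2:
        u, v = random.choice(edges)
        ru, rv = find(u), find(v)
        if ru != rv:                    # contract the edge; self-loops are skipped
            parent[ru] = rv
            remaining -= 1
    return [(u, v) for (u, v) in edges if find(u) != find(v)]

# Repeat many times and keep the smallest cut found.
edges = [(0, 1), (1, 2), (2, 0), (2, 3), (3, 4), (4, 2)]
best = min((karger_min_cut_candidate(edges, range(5)) for _ in range(200)), key=len)
print(best)
```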

Finding a Small Cut-Induced Set
• Input: a graph G=(V,E), and B ⊆ E
• Output: a cut-induced subset of B

• While the graph has > 2 vertices:
  – If some vertex v has no incident edges in B:
    • Split off all edges at v and delete v
  – Pick an edge at random
  – Contract it
• End While
• Output the remaining edges in B

• Claim: for any min cut-induced subset of B, this algorithm outputs it with probability > 1/n².
• Corollary: there are < n² min cut-induced subsets of B.

Conclusions
• Graph sparsifiers are important for fast algorithms and some combinatorial theorems
• Sampling by edge connectivities gives a sparsifier with O(n log² n) edges in O(m log² n) time
  – Improvements: O(n log n) edges in O(m) + Õ(n) time [joint with Hariharan and Panigrahi]
• Sampling by effective resistances also works ⇒ sampling O(log² n) random spanning trees gives a sparsifier

Questions
• Improve log² n to log n?
• Other ways to get sparsifiers with O(n) edges?