Chapter 7
Approximation Algorithms
CS 573: Algorithms, Fall 2014
September 16, 2014
7.0.0.1
Today’s Lecture
Don’t give up on NP-Hard problems:
(A) Faster exponential time algorithms: n^{O(n)}, 3^n, 2^n, etc.
(B) Fixed parameter tractable.
(C) Find an approximate solution.
7.1
7.1.0.2
Greedy algorithms and approximation algorithms
Greedy algorithms
(A) greedy algorithms: do locally the right thing...
(B) ...and they suck.
VertexCoverMin
Instance: A graph G.
Question: Return the smallest subset S ⊆ V (G), s.t. S touches all the edges of G.
(C) GreedyVertexCover: pick the vertex with the highest degree, remove it (and its incident edges), repeat. (A sketch in code follows.)
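A minimal sketch of this greedy rule in Python (graph given as an adjacency-set dictionary; the function name and representation are illustrative, not from the notes):

def greedy_vertex_cover(adj):
    # Repeatedly put a maximum-degree vertex into the cover.
    # adj: dict mapping each vertex to the set of its neighbors;
    # it is consumed (modified) as covered edges are removed.
    cover = []
    while any(adj[v] for v in adj):                  # some edge is still uncovered
        u = max(adj, key=lambda v: len(adj[v]))      # a highest-degree vertex
        cover.append(u)
        for w in adj.pop(u):                         # delete u and its incident edges
            adj[w].discard(u)
    return cover

# Example: the path a-b-c-d; greedy returns a cover such as ['b', 'c'].
print(greedy_vertex_cover({'a': {'b'}, 'b': {'a', 'c'}, 'c': {'b', 'd'}, 'd': {'c'}}))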
7.1.1
Greedy algorithms
7.1.1.1
GreedyVertexCover in action...
Observation 7.1.1. GreedyVertexCover returns 4 vertices, but opt is 3 vertices.
7.1.1.2
Good enough...
Definition 7.1.2. In a minimization optimization problem, one looks for a valid solution that minimizes
a certain target function.
(A) VertexCoverMin: Opt(G) = min_{S ⊆ V(G), S cover of G} |S|.
(B) VertexCover(G): set realizing sol.
(C) Opt(G): value of the target function for the optimal solution.
Definition 7.1.3. Alg is an α-approximation algorithm for problem Min, achieving an approximation α ≥ 1, if for all inputs G we have Alg(G)/Opt(G) ≤ α.
7.1.1.3
Back to GreedyVertexCover
(A) GreedyVertexCover: pick the vertex with the highest degree, remove it, repeat.
(B) Returns 4, but opt is 3!
(C) Cannot be better than a 4/3-approximation algorithm.
(D) Actually it is much worse!
7.1.1.4
How bad is GreedyVertexCover?
Build a bipartite graph.
Let the top partite set be of size n.
In the bottom set add ⌊n/2⌋ vertices of degree 2, such that each edge goes to a different vertex above.
Repeatedly add ⌊n/i⌋ bottom vertices of degree i, for i = 2, . . . , n.
The bottom row has Σ_{i=2}^{n} ⌊n/i⌋ = Θ(n log n) vertices.
7.1.1.5
How bad is GreedyVertexCover?
(A) Bottom row taken by Greedy.
(B) Top row was a smaller solution.
Lemma 7.1.4. The algorithm GreedyVertexCover is an Ω(log n)-approximation to the optimal solution of VertexCoverMin.
See notes for details!
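To make the construction concrete, here is a small Python sketch (helper names are mine) that builds the instance and runs max-degree greedy on it. The top row 0..n−1 is always a valid cover of size n, while the bottom row has Θ(n log n) vertices; on this instance greedy prefers the bottom vertices at every step, so it ends up taking the whole bottom row.

def lower_bound_instance(n):
    # Top vertices 0..n-1; for each i = 2..n, add floor(n/i) bottom
    # vertices of degree i, each attached to its own block of i
    # consecutive top vertices.  Returns an adjacency-set dict.
    adj = {v: set() for v in range(n)}
    b = n                                    # ids for bottom vertices
    for i in range(2, n + 1):
        for j in range(n // i):
            block = range(j * i, j * i + i)  # i distinct top vertices
            adj[b] = set(block)
            for t in block:
                adj[t].add(b)
            b += 1
    return adj

def greedy_cover_size(adj):
    # Size of the cover found by the max-degree greedy rule.
    size = 0
    while any(adj.values()):
        u = max(adj, key=lambda v: len(adj[v]))
        for w in adj.pop(u):
            adj[w].discard(u)
        size += 1
    return size

n = 30
print(greedy_cover_size(lower_bound_instance(n)), "vs. a cover of size", n)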
7.1.1.6
Greedy Vertex Cover
Theorem 7.1.5. The greedy algorithm for VertexCover achieves a Θ(log n) approximation, where n (resp., m) is the number of vertices (resp., edges) in the graph. Its running time is O(mn^2).
Proof: The lower bound follows from the lemma.
Upper bound follows from analysis of greedy algorithm for Set Cover, which will be done shortly.
As for the running time, each iteration of the algorithm takes O(mn) time, and there are at most n
iterations.
7.1.1.7
Two for the price of one
ApproxVertexCover(G):
    S ← ∅
    while E(G) ≠ ∅ do
        uv ← any edge of G
        S ← S ∪ {u, v}
        Remove u, v from V(G)
        Remove all edges involving u or v from E(G)
    return S
Theorem 7.1.6. ApproxVertexCover is a 2-approximation algorithm for VertexCoverMin that runs in O(n^2) time.
Proof...
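A direct Python transcription of ApproxVertexCover, as a sketch (adjacency-set dictionary as before; names are illustrative):

def approx_vertex_cover(adj):
    # 2-approximation: take both endpoints of an arbitrary uncovered edge.
    adj = {v: set(nbrs) for v, nbrs in adj.items()}       # work on a copy
    cover = set()
    while True:
        edge = next(((u, v) for u in adj for v in adj[u]), None)
        if edge is None:                                   # no edges left
            return cover
        u, v = edge
        cover.update((u, v))                               # S ← S ∪ {u, v}
        for x in (u, v):                                   # remove u, v and incident edges
            for w in adj.pop(x, set()):
                adj[w].discard(x)

The chosen edges form a matching, and any vertex cover must contain at least one endpoint of each of them, which is where the factor 2 comes from.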
7.1.1.8
Two for the price of one - example
7.2
Fixed parameter tractability, approximation, and fast exponential time algorithms (to say nothing of the dog)
7.2.1
A silly brute force algorithm for vertex cover
7.2.1.1
What if the vertex cover is small?
(A) G = (V, E) with n vertices.
(B) K ← approximate VertexCoverMin up to a factor of two.
(C) Any vertex cover of G is of size ≥ K/2.
(D) Naively compute the optimal solution in O(n^{K+2}) time. (A sketch follows.)
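A sketch of this brute-force idea in Python (the function name is mine; the 2-approximation is recomputed inline to keep the sketch self-contained):

from itertools import combinations

def small_vertex_cover(vertices, edges):
    # First get an upper bound K from the matching-based 2-approximation.
    remaining, K = set(map(frozenset, edges)), 0
    while remaining:
        u, v = next(iter(remaining))
        K += 2
        remaining = {e for e in remaining if u not in e and v not in e}
    # The optimum lies between K/2 and K; check all subsets by increasing size.
    for k in range(K // 2, K + 1):
        for S in combinations(vertices, k):
            S = set(S)
            if all(u in S or v in S for (u, v) in edges):
                return S

There are O(n^K) candidate subsets and each is checked against O(n^2) edges, giving the O(n^{K+2}) bound claimed above.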
7.2.1.2
Induced subgraph
Definition 7.2.1. N_G(v): the neighborhood of v – the set of vertices of G adjacent to v.
Definition 7.2.2. Let G = (V, E) be a graph. For a subset S ⊆ V, let G_S be the induced subgraph over S.
[Figure: a vertex v and its neighborhood N_G(v).]
7.2.2
Exact fixed parameter tractable algorithm
7.2.2.1
Fixed parameter tractable algorithm for VertexCoverMin.
Computes a minimum vertex cover for the induced graph G_X (a Python sketch follows the pseudocode):
fpVCI(X, β)
    // β: size of the vertex cover computed so far.
    if X = ∅ or G_X has no edges then return β
    e ← any edge uv of G_X
    // either both u and v go into the cover...
    β1 ← fpVCI(X \ {u, v}, β + 2)
    // ...or v is not in the cover, so all of N_{G_X}(v) is...
    β2 ← fpVCI(X \ ({u} ∪ N_{G_X}(v)), β + |N_{G_X}(v)|)
    // ...or u is not in the cover, so all of N_{G_X}(u) is.
    β3 ← fpVCI(X \ ({v} ∪ N_{G_X}(u)), β + |N_{G_X}(u)|)
    return min(β1, β2, β3)

algFPVertexCover(G = (V, E))
    return fpVCI(V, 0)
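A minimal Python sketch of the same branching, assuming the graph is given as an adjacency-set dictionary (names are mine):

def fp_vertex_cover(adj):
    # Returns the size of a minimum vertex cover via the 3-way branching of fpVCI.
    def fpVCI(X, beta):
        edge = next(((u, v) for u in X for v in adj[u] & X), None)
        if edge is None:                       # G_X has no edges
            return beta
        u, v = edge
        b1 = fpVCI(X - {u, v}, beta + 2)       # take both u and v
        Nv = adj[v] & X                        # N_{G_X}(v)
        b2 = fpVCI(X - ({u} | Nv), beta + len(Nv))
        Nu = adj[u] & X                        # N_{G_X}(u)
        b3 = fpVCI(X - ({v} | Nu), beta + len(Nu))
        return min(b1, b2, b3)
    return fpVCI(frozenset(adj), 0)

# Example: the 5-cycle 0-1-2-3-4-0, whose minimum vertex cover has size 3.
cycle5 = {0: {1, 4}, 1: {0, 2}, 2: {1, 3}, 3: {2, 4}, 4: {0, 3}}
print(fp_vertex_cover(cycle5))                 # -> 3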
7.2.2.2
Depth of recursion
Lemma 7.2.3. The algorithm algFPVertexCover returns the optimal solution to the given instance
of VertexCoverMin.
Proof...
7.2.2.3
Depth of recursion II
Lemma 7.2.4. The depth of the recursion of algFPVertexCover(G) is at most α, where α is the size of a minimum vertex cover of G.
Proof: (A) When the algorithm takes both u and v, at least one of them is in opt. This can happen at most α times.
(B) The algorithm picks N_{G_X}(v) (i.e., β2). Conceptually, add v to the vertex cover being computed.
(C) Do the same thing for the case of β3.
(D) Every such call adds one element of opt to the conceptual vertex cover. The depth of the recursion is therefore ≤ α.
7.2.3
Vertex Cover
7.2.3.1
Exact fixed parameter tractable algorithm
Theorem 7.2.5. G: graph with n vertices, whose minimum vertex cover has size α. Then algFPVertexCover returns an optimal vertex cover.
Its running time is O(3^α n^2).
Proof:
(A) By the lemma, the recursion tree has depth at most α.
(B) The recursion tree therefore contains at most Σ_{i=0}^{α} 3^i ≤ 2 · 3^α nodes.
(C) Each node requires O(n^2) work.
Algorithms with running time O(n^c f(α)), where α is some parameter that depends on the problem, are fixed parameter tractable.
7.3
Traveling Salesperson Problem
7.3.0.2
TSP
TSP-Min
Instance: G = (V, E) a complete graph, and ω(e) a cost function on edges of G.
Question: The cheapest tour that visits all the vertices of G exactly once.
Solved exactly naively in ≈ n! time.
Using DP, solvable in O(n^2 2^n) time.
7.3.0.3
TSP Hardness
Theorem 7.3.1. TSP-Min cannot be approximated within any factor unless P = NP.
Proof.
(A) Reduction from Hamiltonian Cycle into TSP.
(B) G = (V, E): instance of Hamiltonian cycle.
(C) H: the complete graph over V, with weights
    ∀u, v ∈ V:  w_H(uv) = 1 if uv ∈ E, and w_H(uv) = 2 otherwise.
(D) ∃ tour of price n in H ⇐⇒ ∃ Hamiltonian cycle in G.
(E) No Hamiltonian cycle =⇒ TSP price at least n + 1.
(F) But... replace 2 by cn, for c an arbitrary number
7.3.0.4
TSP Hardness - proof continued
Proof: (A) The price of any tour is either:
(i) n (only if ∃ Hamiltonian cycle in G),
(ii) at least cn + 1 (in fact, ≥ cn + (n − 1)).
(B) Suppose you had a poly time c-approximation to TSP-Min.
(C) Run it on H:
(i) If the returned value is ≥ cn + 1 =⇒ no Hamiltonian cycle, since (cn + 1)/c > n.
(ii) If the returned value is ≤ cn =⇒ Hamiltonian cycle, since OPT ≤ cn < cn + 1.
(D) c-approximation algorithm to TSP =⇒ poly-time algorithm for NP-Complete problem.
Possible only if P = NP.
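The reduction itself is tiny; a Python sketch of the weight function of H (names and representation are mine):

def hamiltonian_to_tsp_weights(n, edges, c):
    # n: number of vertices of G, labelled 0..n-1
    # edges: set of frozenset({u, v}) edges of G
    # c: the gap parameter (weight 1 on edges of G, c*n on non-edges)
    w = {}
    for u in range(n):
        for v in range(u + 1, n):
            w[(u, v)] = 1 if frozenset((u, v)) in edges else c * n
    return w

G has a Hamiltonian cycle iff the optimal tour of the resulting instance costs exactly n; otherwise every tour costs at least cn + (n − 1), so a polynomial-time c-approximation would decide Hamiltonicity.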
7.3.1
TSP with the triangle inequality
7.3.1.1
Because it is not that bad after all.
TSP△≠-Min
Instance: G = (V, E) is a complete graph. There is also a cost function ω(·) defined over the
edges of G, that complies with the triangle inequality.
Question: The cheapest tour that visits all the vertices of G exactly once.
Triangle inequality: ω(·) complies with it if ∀u, v, w ∈ V(G): ω(u, v) ≤ ω(u, w) + ω(w, v).
Shortcutting: for σ a path from s to t in G, ω(st) ≤ ω(σ).
7.3.2
TSP with the triangle inequality
7.3.2.1
Continued...
Definition 7.3.2. Cycle in G is Eulerian if it visits every edge of G exactly once.
Assume you have already seen the following:
Lemma 7.3.3. A graph G has a cycle that visits every edge of G exactly once (i.e., an Eulerian cycle)
if and only if G is connected, and all the vertices have even degree. Such a cycle can be computed in
O(n + m) time, where n and m are the number of vertices and edges of G, respectively.
7.3.3
TSP with the triangle inequality
7.3.3.1
Continued...
(A) C_opt: optimal TSP tour in G.
(B) Observation: ω(C_opt) ≥ weight of the cheapest spanning graph of G.
(C) MST: the cheapest spanning graph of G, hence ω(C_opt) ≥ ω(MST(G)).
(D) O(n log n + m) = O(n^2): time to compute the MST, where n = |V(G)| and m = n(n − 1)/2.
7.3.4
TSP with the triangle inequality
7.3.4.1
2-approximation
(A) T ← MST(G).
(B) H ← duplicate every edge of T.
(C) H has an Eulerian tour.
(D) C: Eulerian cycle in H.
(E) ω(C) = ω(H) = 2ω(T) = 2ω(MST(G)) ≤ 2ω(C_opt).
(F) π: shortcut C so that it visits every vertex exactly once.
(G) ω(π) ≤ ω(C). (A code sketch follows.)
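A sketch of this 2-approximation using NetworkX (assumed available; names are mine). A depth-first preorder of the MST is exactly the shortcut Euler tour of the doubled tree, so H need not be built explicitly:

import networkx as nx

def tsp_2_approx(weight):
    # weight: dict mapping vertex pairs (u, v), u < v, to a metric edge weight.
    # Returns a tour as a list of vertices, with the start repeated at the end.
    G = nx.Graph()
    for (u, v), w in weight.items():
        G.add_edge(u, v, weight=w)
    T = nx.minimum_spanning_tree(G, weight="weight")     # MST(G)
    root = next(iter(T.nodes))
    order = list(nx.dfs_preorder_nodes(T, source=root))  # shortcut Euler tour
    return order + [root]

Under the triangle inequality the shortcutting only shrinks the tour, so its length is at most 2ω(MST(G)) ≤ 2ω(C_opt).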
7.3.5
TSP with the triangle inequality
7.3.5.1
2-approximation algorithm in figures
[Figures (a)–(d): the 2-approximation on a small example with vertices s, u, v, w; the Euler tour visits vuvwvsv, and shortcutting to first occurrences yields the tour vuwsv.]
7.3.6
TSP with the triangle inequality
7.3.6.1
2-approximation - result
Theorem 7.3.4. G: instance of TSP△≠-Min with n vertices and a cost function ω(·) on the edges that complies with the triangle inequality.
C_opt: min cost TSP tour of G.
=⇒ One can compute a tour of G of length ≤ 2ω(C_opt).
The running time of the algorithm is O(n^2).
7.3.7
TSP with the triangle inequality
7.3.7.1
3/2-approximation
Definition 7.3.5. G = (V, E), a subset M ⊆ E is a matching if no pair of edges of M share endpoints.
A perfect matching is a matching that covers all the vertices of G.
ω: weight function on the edges. A min-weight perfect matching is the minimum weight matching among all perfect matchings, where ω(M) = Σ_{e∈M} ω(e).
7.3.8
TSP with the triangle inequality
7.3.8.1
3/2-approximation
The following is known:
Theorem 7.3.6. Given a graph G and weights on the edges, one can compute the min-weight perfect
matching of G in polynomial time.
7.3.8.2
Min weight perfect matching vs. TSP
Lemma 7.3.7. G = (V, E): complete graph.
S ⊆ V: of even size.
ω(·): a weight function over E.
=⇒ the min-weight perfect matching in G_S has weight ≤ ω(TSP(G))/2.
[Figures: the set S, the optimal tour π, and the cycle σ obtained by shortcutting π to the vertices of S.]
7.3.8.3
A more perfect tree?
[Figure: a tree on vertices 1–7.]
(A) How to make the tree Eulerian?
(B) Pesky odd degree vertices must die!
(C) The number of odd degree vertices in a graph is even!
(D) Compute a min-weight matching on the odd-degree vertices, and add it to the MST.
(E) H = MST + min-weight matching is Eulerian.
(F) Weight of the resulting cycle in H is ≤ (3/2)ω(TSP).
7.3.8.4
Even number of odd degree vertices
Lemma 7.3.8. The number of odd degree vertices in any graph G0 is even.
Proof: µ = Σ_{v∈V(G′)} d(v) = 2|E(G′)|, and is thus even.
U = Σ_{v∈V(G′), d(v) even} d(v) is even too.
Thus, α = Σ_{v∈V(G′), d(v) odd} d(v) = µ − U is an even number, since µ and U are both even.
A sum of odd numbers whose total is even must have an even number of summands.
7.3.9
3/2-approximation algorithm for TSP
7.3.9.1
Animated!
7.3.10
3/2-approximation algorithm for TSP
7.3.10.1
The result
Theorem 7.3.9. Given an instance of TSP with the triangle inequality, one can compute in polynomial
time, a (3/2)-approximation to the optimal TSP.
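Putting the pieces together, a Christofides-style sketch with NetworkX (assumed available; min_weight_matching is assumed to return a minimum-weight perfect matching on a complete graph with an even number of vertices, which holds in recent NetworkX versions):

import networkx as nx

def tsp_3_2_approx(weight):
    # weight: dict mapping vertex pairs (u, v), u < v, to a metric edge weight.
    G = nx.Graph()
    for (u, v), w in weight.items():
        G.add_edge(u, v, weight=w)

    T = nx.minimum_spanning_tree(G, weight="weight")      # MST
    odd = [v for v in T.nodes if T.degree(v) % 2 == 1]    # even count, by Lemma 7.3.8
    M = nx.min_weight_matching(G.subgraph(odd), weight="weight")

    H = nx.MultiGraph(T)                                   # H = MST + matching, Eulerian
    H.add_edges_from((u, v, {"weight": G[u][v]["weight"]}) for u, v in M)

    tour, seen = [], set()                                  # shortcut the Euler tour
    for u, _ in nx.eulerian_circuit(H):
        if u not in seen:
            seen.add(u)
            tour.append(u)
    return tour + [tour[0]]

By Lemma 7.3.7 the matching weighs at most ω(C_opt)/2, so the Euler tour of H weighs at most (3/2)ω(C_opt), and shortcutting only helps.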
7.3.10.2
Biographical Notes
The 3/2-approximation for TSP with the triangle inequality is due to Christofides (1976).