# bipartite and tripartite graphs

• May 23rd, 2009, 06:26 AM
gandalf23
bipartite and tripartite graphs
Hello, this is my first post!

Please have a look at this problem:

--->
Bip-Dominating-Set (BDS): Let G(L,R,E) be a bipartite graph and k a positive integer. Is there a subset S of L s.t. |S|<=k and S dominates R?

Trip-Dominating-Set (TDS): Let G(L,C,R,E) be a tripartite graph and k a positive integer. Is there a subset S of L s.t. |S|<=k and S dominates both C and R?

Prove that if TDS is in P, then BDS is in P.

EDIT: I forgot something important!!!
A set X of vertices dominates another set Y of vertices iff
for each y in Y there exists a vertex x in X such that there is an edge x->y.

In other words, If there are vertices in Y that aren't reached by edges starting from vertices in X, then X doesn't dominate Y.

Let f:A->B be a function. A dominates B iff f is surjective. Just joking :-)
<---

There is a painfully obvious reduction, so maybe I'm missing something.
If G(L,R,E) is a bipartite graph, isn't G(L,0,R,E) a tripartite graph (where 0 is the empty set)?
If empty parts are not acceptable, I can just add a vertex v and all the edges from the vertices in L to v. The new graph would be G(L,{v},R,E U E'), where E' is the set of the new edges.
Then the answer (Yes/No) to BDS(G(L,R,E)) is Yes if and only if the answer to TDS(G(L,{v},R,E U E')) is Yes.
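For concreteness, here is the reduction as a Python sketch. Everything here is hypothetical scaffolding: graphs are just vertex sets plus a set of directed edges (x, y), and solve_tds_bruteforce is a stand-in for whatever polynomial TDS solver we are assumed to have.

```python
from itertools import combinations

def dominates(S, Y, E):
    """True iff every y in Y has some x in S with an edge (x, y) in E."""
    return all(any((x, y) in E for x in S) for y in Y)

def solve_tds_bruteforce(L, C, R, E, k):
    """Stand-in TDS solver: is there S subset of L, |S| <= k, dominating C and R?"""
    return any(dominates(S, C, E) and dominates(S, R, E)
               for i in range(k + 1)
               for S in map(set, combinations(sorted(L), i)))

def reduce_bds_to_tds(L, R, E, k, solve_tds):
    """Answer BDS(G(L,R,E), k) with a single call to a TDS solver."""
    v = ("middle",)                      # fresh middle-layer vertex, not in L or R
    E2 = set(E) | {(u, v) for u in L}    # every vertex of L gets an edge to v
    # Any nonempty S dominates {v}, so S dominates {v} and R iff S dominates R.
    return solve_tds(L, {v}, R, E2, k)
```

The dummy vertex v is dominated by every nonempty choice of S, so it never changes which subsets are feasible; the TDS answer on the padded graph equals the BDS answer on the original.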

Gandalf
• May 23rd, 2009, 07:12 PM
R0jkumar
Your reduction seems correct to me. The straightforward reduction shouldn't be surprising, since finding a dominating set in a tripartite graph is at least as hard as finding one in a bipartite graph, or maybe harder. Given that, if TDS is in P, it should be obvious that BDS is also in P, since BDS is no harder than TDS.

This would be similar to reductions in the case of NP-complete problems, where you have problems A and B and A is at least as hard as B. If A is the optimization problem for vertex cover, and B is the decision problem for vertex cover, then A is at least as hard as B. So, if B is NP-hard, so is A. Also, if A turns out to be in P, surely B is in P.

RK
• May 24th, 2009, 04:27 AM
gandalf23
Quote:

Originally Posted by R0jkumar
This would be similar to reductions in the case of NP-complete problems, where you have problems A and B and A is at least as hard as B. If A is the optimization problem for vertex cover, and B is the decision problem for vertex cover, then A is at least as hard as B. So, if B is NP-hard, so is A. Also, if A turns out to be in P, surely B is in P.
RK

Indeed, I'm studying NP problems.

NP theory also says that B is at least as hard as A, in the sense that the optimization problem reduces to the decision problem. That's why we usually study decision problems instead of optimization problems.
Let's prove it! (This proof should be similar to a proof I saw for the TSP.)

Let Opt be the optimization problem and Dec the decision problem.
Let's assume we can solve Dec in polynomial time.
Let Opt(G) and Dec(G,k) be the solutions to the two problems on G and k.
Dec(G,k) = Yes <=> |Opt(G)|<=k

Let V be the number of vertices in G.

1) We can find |Opt(G)| by solving some instances of Dec.
It's clear that 0<=|Opt(G)|<=V
If Dec(G,V/2) = Yes
|Opt(G)| is in [0,V/2]
else
|Opt(G)| is in [V/2+1,V]
With the bisection method we can find |Opt(G)| in O(log V) steps, so we call Dec only O(log V) times.

Now we know the cardinality of the optimum solution.
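Step 1 can be sketched as an ordinary binary search. Here dec(k) is a hypothetical stand-in for Dec(G, k) with G fixed; the only property we need is that it is monotone (No up to some threshold, Yes from there on).

```python
def opt_size(dec, n):
    """Smallest k in [0, n] with dec(k) == True, using O(log n) oracle calls.
    Invariant: the answer always lies in the interval [lo, hi]."""
    lo, hi = 0, n
    while lo < hi:
        mid = (lo + hi) // 2
        if dec(mid):
            hi = mid          # |Opt(G)| is in [lo, mid]
        else:
            lo = mid + 1      # |Opt(G)| is in [mid+1, hi]
    return lo
```

When lo == hi, the invariant leaves exactly one candidate, which is |Opt(G)|.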

2) Let v be a vertex in G. We create a new graph G' by adding a new vertex w and an edge e(v,w). The edge e can only be covered by v or w.
If v is in some optimum solution of G, then |Opt(G')| = |Opt(G)|, because that solution also covers e.
|Opt(G')| = |Opt(G)| <=> Dec(G', |Opt(G)|) = Yes
If v is in NO optimum solution of G, then any cover of G' must contain v or w, because there's no other way to cover the new edge e. A cover of G' of size |Opt(G)| containing w would become a cover of G of size |Opt(G)|-1 once we remove w, which is impossible; a cover of size |Opt(G)| containing v would be an optimum solution of G containing v, which we excluded. So |Opt(G')| > |Opt(G)|, and Dec(G', |Opt(G)|) = No.
Perfect! We can tell whether v is in at least one optimum solution (there might be more than one).
If v was in an optimum solution, we choose another vertex v_2, add a new vertex w_2 to G', and so on. We keep working on G' because we want to see if v_2 is in at least one of the solutions which also contain v. We stop when we have found |Opt(G)| vertices.
If v is NOT in Opt(G) we ignore v and choose another vertex from G.

We conclude that Opt reduces to Dec in polynomial time: Opt <= Dec.
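Step 2 can be sketched in Python as well. Both functions are hypothetical: dec_bruteforce stands in for the polynomial Dec oracle (here a brute-force vertex-cover decision procedure, just so the sketch runs), and opt is the value |Opt(G)| found in step 1.

```python
from itertools import combinations

def dec_bruteforce(edges, k):
    """Stand-in Dec oracle: does the graph have a vertex cover of size <= k?"""
    verts = sorted({u for e in edges for u in e}, key=str)
    return any(all(a in S or b in S for (a, b) in edges)
               for i in range(k + 1)
               for S in map(set, combinations(verts, i)))

def build_optimum(vertices, edges, dec, opt):
    """Recover one minimum vertex cover using only the decision oracle.
    For each vertex v we attach a fresh pendant edge (v, w): the padded
    graph still has a cover of size opt iff v lies in some minimum cover
    consistent with the choices made so far."""
    chosen, gadgets = [], []
    for v in vertices:
        w = ("pendant", v)               # fresh vertex, not in the graph
        trial = gadgets + [(v, w)]
        if dec(edges + trial, opt):      # still coverable with opt vertices?
            chosen.append(v)             # v is forced in; keep its gadget
            gadgets = trial
        if len(chosen) == opt:
            break
    return chosen
```

On the path 1-2-3 (minimum cover size 1), the sketch rejects vertex 1 and selects vertex 2, as the gadget argument predicts.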
• May 24th, 2009, 10:05 PM
R0jkumar
Log(|V|) steps
I want to add this regarding Part I of your solution. I am not sure about the log(|V|) for the number of steps. Consider an example of a graph where the number of vertices in the graph is 16 and the minimum vertex cover is 7. So, we do this

- Step 1: try the decision algo with 8 vertices first; find there is a vertex cover of size 8
- Step 2: try the decision algo with 4 vertices and find there is no such vertex cover
- Step 3: one option now is to go up the recursion tree. You will have to try for a vertex cover of 6 and then find that there is no such vertex cover.
- Step 4: try for a vertex cover of size 7 now and find there is one! You are done, because there is a vertex cover of size 7 and there is no vertex cover of size 6. Hence that is your optimum vertex cover size.

In other words, I am not sure how your recursion bottoms out and how you know when to stop. I believe that you can only stop when you know that there is a vertex cover of size p and there is no vertex cover of size p-1.
• May 25th, 2009, 05:18 AM
gandalf23
Re: Log(|V|) steps
Quote:

Originally Posted by R0jkumar
In other words, I am not sure how your recursion bottoms out and how you know when to stop. I believe that you can only stop when you know that there is a vertex cover of size p and there is no vertex cover of size p-1.

That's exactly what happens in a binary search in a sorted vector.

Let's suppose we must find the smallest value i in {1,...,n} such that Dec(G,i) = true.
If n is a power of two, then we use N=n.
If n is not a power of two then there is an integer k such that
n = 2^k + r
where 0<r<2^k.
In this case we let N=2^(k+1).
Let's find the value i in {1,...,N} such that Dec(G,i) = true. Since n<=N, the set {1,...,N} contains the right value.
Because at each step we halve the interval and N = 2^(k+1), after k+1 steps we are left with a single element. The invariant (the value we are looking for always lies in the current interval) guarantees that this element is the one we were looking for.
Since N < 2n, log(N) < log(n) + 1 = O(log n).
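As a sanity check, we can trace the search on R0jkumar's 16-vertex example, recording which sizes the oracle is actually asked about. Again dec(k) is a hypothetical monotone stand-in for Dec(G, k).

```python
def smallest_true(dec, lo, hi, queried):
    """Smallest i in [lo, hi] with dec(i) True; records each queried value."""
    while lo < hi:
        mid = (lo + hi) // 2
        queried.append(mid)
        if dec(mid):
            hi = mid          # keep the half that still contains the answer
        else:
            lo = mid + 1
    return lo

queried = []
answer = smallest_true(lambda k: k >= 7, 1, 16, queried)
# queries 8, 4, 6, 7 in turn: exactly the steps in R0jkumar's example,
# and it stops once the interval has shrunk to the single value 7
```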
• May 25th, 2009, 10:25 AM
R0jkumar
Recursion bottoming out
Yeah, you are right. For a minute, I thought the algo stops as soon as the decision algo returns true, but that isn't the case. And yes, this is similar to binary search in a sorted vector where you are looking for the lowest-index element that is an exact match.

RK