Problem Detail: The max k-cut problem is: Given an undirected graph $G=(V,E)$ with nonnegative edge costs, and an integer $k$, find a partition of $V$ into sets $S_1,\dots,S_k$ so that the total cost of edges running between these sets is maximized. The aim is to find a greedy algorithm for this problem that achieves a factor of $(1-\frac{1}{k})$.
My algorithm is: at the beginning, each vertex of $V$ is placed in its own part, giving $S_1,S_2,\dots,S_n$. Once we have $S_1,\dots,S_{t+1}$, we find $S_p,S_q$ such that $c(S_p,S_q)=\min_{\{i,j\}}c(S_i,S_j)$ and merge $S_p,S_q$ together to get $t$ parts; we repeat until $t=k$.
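The merging procedure above can be sketched in Python. This is a minimal illustration, not code from the question; the `cost` dictionary keyed by vertex pairs is an assumed input representation (missing pairs are treated as cost 0):

```python
import itertools

def greedy_merge_kcut(vertices, cost, k):
    """Merging heuristic: start with each vertex in its own part, then
    repeatedly merge the pair of parts with minimum inter-part cost
    until only k parts remain."""
    parts = [{v} for v in vertices]

    def c(A, B):
        # total cost of edges running between parts A and B
        return sum(cost.get(frozenset({u, v}), 0) for u in A for v in B)

    while len(parts) > k:
        # find the pair of parts with the cheapest cut between them
        p, q = min(itertools.combinations(range(len(parts)), 2),
                   key=lambda ij: c(parts[ij[0]], parts[ij[1]]))
        parts[p] |= parts[q]
        del parts[q]
    return parts
```

For example, on a 4-cycle with heavy edges $\{1,3\},\{2,4\}$ and light edges $\{1,2\},\{3,4\}$, the cheapest merges first join vertices connected by cost-0 non-edges, so the heavy edges stay cut.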
Let $ALG_t$ be the value of the solution given by this algorithm for $t$ parts, and let $OPT_t$ be the optimum value for $t$ parts.
For the base case of the induction, $ALG_n=OPT_n\ge(1-\frac{1}{n})OPT_n$.
Since there are $\binom{t+1}{2}$ sets of edges running between $S_1,\dots,S_{t+1}$, at least one of these sets has cost $\le\frac{1}{\binom{t+1}{2}}ALG_{t+1}$.
By the construction of the algorithm, we merge these two parts together, so
$$ALG_t\ge ALG_{t+1}-\frac{1}{\binom{t+1}{2}}ALG_{t+1}=\left(1-\frac{1}{\binom{t+1}{2}}\right)ALG_{t+1}\ge\left(1-\frac{1}{\binom{t+1}{2}}\right)\left(1-\frac{1}{t+1}\right)OPT_{t+1}\ge\left(1-\frac{1}{\binom{t+1}{2}}\right)\left(1-\frac{1}{t+1}\right)OPT_t.$$
But it’s not true that $\left(1-\frac{1}{\binom{t+1}{2}}\right)\left(1-\frac{1}{t+1}\right)\ge 1-\frac{1}{t}$.
Therefore I wonder whether there is some other argument or algorithm?
Asked By : Connor
Answered By : Yuval Filmus
The natural greedy approach goes roughly like this:
Go over the vertices one by one. For each one, allocate it to a part in a way maximizing the total cost of edges cut.
There are at least two natural ways of defining total cost of edges cut:
1. Total cost of edges between the vertex and vertices already allocated to different parts.
2. Same as 1, together with the expected cost of cut edges to vertices not already allocated.
In this case both ways are the same, but the second shows the connection between the greedy algorithm and the trivial randomized algorithm, which allocates the vertices completely randomly.
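A minimal sketch of this one-pass greedy, again with an assumed `cost` dictionary keyed by vertex pairs (not from the original answer); maximizing the cost of newly cut edges is equivalent to minimizing the cost of edges left uncut:

```python
def greedy_kcut(vertices, cost, k):
    """One-pass greedy: scan the vertices in order and place each one
    into the part that maximizes the cost of newly cut edges, i.e.
    minimizes the cost of edges to vertices already in that part."""
    parts = [set() for _ in range(k)]
    for v in vertices:
        def uncut(P):
            # cost of edges NOT cut if v joins part P
            return sum(cost.get(frozenset({v, u}), 0) for u in P)
        # join the part where the least edge cost is lost
        best = min(parts, key=uncut)
        best.add(v)
    return parts
```

Since each vertex joins the cheapest of the $k$ parts, at most a $\frac{1}{k}$ fraction of its edge cost to already-placed vertices is left uncut; summing over all vertices gives the $(1-\frac{1}{k})$ guarantee, matching the expected value of the trivial randomized algorithm.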
Question Source : http://cs.stackexchange.com/questions/54028