Finding the minimum cut in a graph using Kruskal's algorithm?

We have already seen that spanning trees and cuts are closely related. Here is another connection. Remove the last edge that Kruskal's algorithm adds to the spanning tree; this splits the tree into two components, thereby defining a cut (S, S̄) in the graph. What can we say about this cut? Suppose the graph we were working with was unweighted, and that its edges were ordered uniformly at random for Kruskal's algorithm to process. Here is a remarkable fact: with probability at least 1/n^2, (S, S̄) is the minimum cut in the graph, where the size of a cut (S, S̄) is the number of edges crossing between S and S̄. This means that repeating the process O(n^2 log n) times and outputting the smallest cut found yields the minimum cut in G with high probability: an O(mn^2 log n) algorithm for unweighted minimum cuts. Further refinements yield the O(n^2 log n) minimum-cut algorithm invented by David Karger, which is the fastest known algorithm for this important problem.
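The procedure above can be sketched in a few lines of Python. This is a minimal illustration, not anyone's reference implementation: it assumes the unweighted graph is given as a vertex count n and a list of edge pairs, and the function names (random_kruskal_cut, min_cut) are my own invention.

```python
import random

def _find(parent, x):
    # Union-find "find" with path compression.
    while parent[x] != x:
        parent[x] = parent[parent[x]]
        x = parent[x]
    return x

def random_kruskal_cut(n, edges):
    """One trial: run Kruskal's algorithm on a uniformly shuffled edge
    list, but stop just before the last edge would merge the final two
    components. Those two components define a candidate cut (S, S-bar);
    its size is the number of edges crossing between them."""
    order = list(edges)
    random.shuffle(order)              # uniformly random processing order
    parent = list(range(n))
    components = n
    for u, v in order:
        if components == 2:            # stop before the final merge
            break
        ru, rv = _find(parent, u), _find(parent, v)
        if ru != rv:                   # Kruskal: add edge only if it joins two trees
            parent[ru] = rv
            components -= 1
    root0 = _find(parent, 0)
    S = {x for x in range(n) if _find(parent, x) == root0}
    cut_size = sum(1 for u, v in edges if (u in S) != (v in S))
    return cut_size, S

def min_cut(n, edges, trials):
    """Repeat the trial and keep the smallest candidate cut seen."""
    best = None
    for _ in range(trials):
        size, S = random_kruskal_cut(n, edges)
        if best is None or size < best[0]:
            best = (size, S)
    return best
```

For example, on two triangles joined by a single bridge edge, the minimum cut has size 1 (the bridge), and enough repeated trials will find it with overwhelming probability.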

  • Doesn't this depend on there being n^2 unique ways of processing the graph through Kruskal's algorithm? I mean, suppose Kruskal's algorithm has only 3 unique ways to process a graph with 10 nodes; repeating the process n^2 times would not produce n^2 unique "last edges". How does this work when there are fewer than n^2 unique final cuts (i.e., fewer than n^2 unique "last edges")?

  • What if there are fewer than n^2 edges in total? For example, you may have a connected graph of 10 nodes with only 9 edges, which means that no matter how many times you repeat the algorithm, you will not get n^2 unique "last edges". How does this work in that situation?

  • Wouldn't it be easier to just loop over each edge and check whether that edge by itself forms a minimum cut? In a graph of n nodes, the maximum number of unique edges is (n-1) + (n-2) + ... + 1 = n(n-1)/2, which is less than n^2. And given that n^2 is less than n^2 log n, why not just check all edges, since that would be faster?

1 answer

I think you may be misunderstanding how the algorithm works. Each run executes Kruskal's algorithm until only the final edge remains to be added, then stops right before adding it. The algorithm does not attempt to build a collection of these "last edges"; instead, it performs O(n^2 log n) independent random runs of Kruskal's algorithm to generate O(n^2 log n) candidate cuts. Taking the smallest of all these candidate cuts gives the minimum cut with high probability. In other words, it does not matter whether there are O(n^2) distinct last edges. What matters is the cut that remains at the end of each run, not the identity of the last edge that was considered.
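For intuition on why the repetition count is enough: each trial independently succeeds with probability at least 1/n^2, so after c·n^2·ln n trials, using the standard inequality 1 − x ≤ e^{−x}, the probability that every trial misses the minimum cut is at most

```latex
\left(1 - \frac{1}{n^2}\right)^{c\,n^2 \ln n} \le e^{-c \ln n} = n^{-c}
```

so the smallest candidate cut seen is the true minimum cut with probability at least 1 − n^{-c}.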

Hope this helps!
