Recently, I encountered this problem while testing: given a set of points m (all on the x axis) and a set of segments n, each with endpoints [l, r] (also on the x axis), find the minimal subset of n such that every point in m is covered by at least one segment. Prove that your solution always finds the smallest subset.
The algorithm I wrote for it was something like this (segments are stored as two-element arrays, with the left endpoint at index 0 and the right endpoint at index 1):
    algorithm coverPoints(set[] m, set[][] n):
        chosenLines = []
        while m is not empty:
            minX = min(m)
            bestLine = n[0]
            for i = 1 to length of n:
                if n[i][0] <= minX and n[i][1] > bestLine[1] then
                    bestLine = n[i]
            add bestLine to chosenLines
            for i = 0 to length of m:
                if m[i] <= bestLine[1] then
                    delete m[i] from m
        return chosenLines
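In case it helps, here is a runnable Python sketch of the same greedy idea. The function name cover_points is my own, and restricting the candidates to segments that actually cover minX is my reading of the pseudocode's intent, not part of the original statement:

    def cover_points(points, segments):
        # Greedy sketch: repeatedly take the leftmost uncovered point and,
        # among the segments that cover it, pick the one whose right end
        # reaches farthest.
        remaining = set(points)
        chosen = []
        while remaining:
            min_x = min(remaining)
            # Assumed intent of the pseudocode: only segments whose left end
            # is <= min_x and that actually reach min_x are candidates.
            candidates = [s for s in segments if s[0] <= min_x <= s[1]]
            if not candidates:
                raise ValueError("point %r is not covered by any segment" % min_x)
            best = max(candidates, key=lambda s: s[1])
            chosen.append(best)
            # Drop every point the chosen segment covers.
            remaining = {p for p in remaining if p > best[1]}
        return chosen

    # Example: the single far-reaching segment covers all three points.
    print(cover_points([1, 4, 6], [(0, 2), (0, 7), (5, 10)]))  # [(0, 7)]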
I'm just not sure if this always finds a minimal solution. It is a simple greedy algorithm, so my gut tells me that it does not, but one of my friends, who is much better at this than me, says that for this problem a greedy algorithm like this always finds a minimal solution. To convince myself that mine always finds a minimal solution, I once made a very hand-wavy proof by contradiction, starting from the assumption that it does not, but I have since forgotten what I did.
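For small inputs, a brute-force comparison is one way to sanity-check minimality while I try to reconstruct the proof; subset_cover_brute below is a hypothetical helper of mine, not part of the original problem:

    from itertools import combinations

    def subset_cover_brute(points, segments):
        # Exhaustive check: try subsets in increasing size and return the
        # first one that covers every point, i.e. a provably minimal cover.
        for k in range(len(segments) + 1):
            for subset in combinations(segments, k):
                if all(any(s[0] <= p <= s[1] for s in subset) for p in points):
                    return list(subset)
        return None  # not coverable even with all segments

    # Comparing len(cover_points(...)) against len(subset_cover_brute(...))
    # on random small cases would at least show whether the greedy ever loses.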
If this does not always find a minimal solution, is there a way to do this in less than something like O(n!) time?
thanks
Sean