Fastest way to find the minimum distance from a point to points on a curve

I am looking for a quick solution to the following problem:

[illustration of the minimum-distance problem]

I have a fixed point (say, the top right end of the white measuring line), and I need to find the closest point on a curve made up of equally spaced points (the lower curve). In addition, I do this for every point on the upper curve in order to draw the distances between the curves in different colours (three levels: below a minimum [red], between minimum and maximum [orange], and above maximum [green]).

My current solution is a compromise: I take the fixed point, iterate over an arbitrary interval (for example, 50 units to the left and right of the fixed point), and compute the distance for each pair. This saves some CPU time, but it is neither elegant nor accurate, since the true minimum could lie outside the chosen interval and be missed.
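For reference, a minimal sketch of that windowed brute force (the Point struct and the names bot and minDistWindowed are illustrative, not taken from the actual code):

    #include <algorithm>
    #include <cmath>
    #include <limits>
    #include <vector>

    struct Point { double x, y; };

    // Windowed search: for a fixed point p (index i on the upper curve), only
    // lower-curve points within `window` indices of i are examined, so the true
    // minimum can be missed if it lies outside that window.
    double minDistWindowed(const std::vector<Point>& bot, const Point& p,
                           std::size_t i, std::size_t window)
    {
        std::size_t lo = (i > window) ? i - window : 0;
        std::size_t hi = std::min(bot.size() - 1, i + window);
        double best = std::numeric_limits<double>::max();
        for (std::size_t j = lo; j <= hi; ++j)
            best = std::min(best, std::hypot(bot[j].x - p.x, bot[j].y - p.y));
        return best;
    }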

Any suggestions for a faster algorithm?

Edit: Equal spacing means that all points are the same distance apart along the x axis; this holds for both curves. Also, I do not need to interpolate between points; that would take too much time.

+7
3 answers

Denote the upper curve y = t(x) and the lower curve y = b(x), and the closest-point map x_b = c(x_t). We know that the closest-point map is weakly monotonically non-decreasing, since two shortest paths never cross each other.

If you know that the distance function d(x_t, x_b) has only one local minimum for each fixed x_t (which holds if the curves are "reasonably smooth"), you can save time by "walking" the curve (a C++ sketch follows the outline below):

- start with x_t = 0, x_b = 0
- while x_t <= x_max
 -- find the closest x_b by local search (increment x_b while the distance is decreasing)
 -- add {x_t, x_b} to the result set
 -- increment x_t
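A minimal C++ sketch of that walk, assuming both curves are stored as vectors of points ordered (and equally spaced) in x; the struct and function names are placeholders, not the answerer's code:

    #include <cmath>
    #include <vector>

    struct Point { double x, y; };

    static double dist(const Point& a, const Point& b)
    {
        return std::hypot(a.x - b.x, a.y - b.y);
    }

    // For every point on the top curve, find the index of the closest point on
    // the bottom curve by walking both curves once.  Relies on the closest-point
    // map being non-decreasing and on d(x_t, x_b) having a single local minimum
    // in x_b for each fixed x_t.
    std::vector<std::size_t> walkClosest(const std::vector<Point>& top,
                                         const std::vector<Point>& bot)
    {
        std::vector<std::size_t> result(top.size());
        std::size_t xb = 0;
        for (std::size_t xt = 0; xt < top.size(); ++xt) {
            // advance xb only while doing so reduces the distance
            while (xb + 1 < bot.size() &&
                   dist(top[xt], bot[xb + 1]) < dist(top[xt], bot[xb]))
                ++xb;
            result[xt] = xb;
        }
        return result;
    }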

If you expect x_b to be reasonably smooth but cannot assume it, and you want an exact result:

Walk the curve in both directions. Where the results agree, they are correct. Where they disagree, do a full search between the two results (the leftmost and the rightmost of the local minima found). Sample such an "ambiguous block" in binary-subdivision order to allow maximal pruning from the monotonicity.
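One way to read that procedure, sketched below; it reuses Point, dist, and walkClosest from the sketch above, omits the binary-subdivision refinement, and simply searches the disagreement range exhaustively:

    // Walk forward and backward; where the two passes agree the answer is exact,
    // where they disagree, search all candidates between the two results.
    std::vector<std::size_t> walkBothWays(const std::vector<Point>& top,
                                          const std::vector<Point>& bot)
    {
        std::vector<std::size_t> fwd = walkClosest(top, bot);

        // backward pass: the same walk, starting from the right end
        std::vector<std::size_t> bwd(top.size());
        std::size_t xb = bot.size() - 1;
        for (std::size_t i = top.size(); i-- > 0; ) {
            while (xb > 0 && dist(top[i], bot[xb - 1]) < dist(top[i], bot[xb]))
                --xb;
            bwd[i] = xb;
        }

        std::vector<std::size_t> result(top.size());
        for (std::size_t i = 0; i < top.size(); ++i) {
            if (fwd[i] == bwd[i]) { result[i] = fwd[i]; continue; }
            std::size_t lo = std::min(fwd[i], bwd[i]);
            std::size_t hi = std::max(fwd[i], bwd[i]);
            std::size_t best = lo;
            for (std::size_t j = lo + 1; j <= hi; ++j)
                if (dist(top[i], bot[j]) < dist(top[i], bot[best]))
                    best = j;
            result[i] = best;
        }
        return result;
    }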

As an intermediate option:

Walk the curve in both directions. Where the results disagree, just pick either one. If you can guarantee at most two local minima for each fixed x_t, this still gives an optimal solution. There remain some more pathological cases where the optimum is not found, namely a local minimum surrounded by two other local minima that are both worse than it. I dare say it would be unusual to hit a case where the solution is far from optimal (assuming a smooth y = b(x)).

+2

Instead of an arbitrary interval, you could iterate until you are provably "out of range".

In your example, suppose you start with the point on the upper curve at the top right end of your line. Dropping vertically straight down, you get a distance of (by my eye) about 200 µm.

Now you can keep testing points from there until the horizontal distance alone reaches 200 µm. Beyond that, it is impossible to get a distance of less than 200 µm.

Moving to the left, the distance drops to a minimum of about 150 µm and then starts to grow again. Once you are more than 150 µm to the left of your starting point, it is again impossible to beat the minimum you have already found.

Had you gone the other way first, you would not have had to walk as far, so the obvious optimisation is either to follow the direction in which the distance initially falls, or to work outwards from the middle in both directions at once.

I do not know how many µm your 50 units correspond to, so this may be slower or faster than what you have. It does, however, avoid the risk of missing a lower value.
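A sketch of this "expand until out of range" idea, assuming the lower-curve points are sorted by x and that start is the index of the point roughly below the query point; names are illustrative:

    #include <cmath>
    #include <vector>

    struct Point { double x, y; };

    // Expand left and right from the starting index.  As soon as the horizontal
    // distance alone exceeds the best full distance found so far, no point
    // further out in that direction can possibly be closer, so stop.
    double minDistExpanding(const std::vector<Point>& bot, const Point& p,
                            std::size_t start)
    {
        double best = std::hypot(bot[start].x - p.x, bot[start].y - p.y);
        const int dirs[2] = { -1, +1 };
        for (int k = 0; k < 2; ++k) {
            for (long j = static_cast<long>(start) + dirs[k];
                 j >= 0 && j < static_cast<long>(bot.size()); j += dirs[k]) {
                double dx = bot[j].x - p.x;
                if (std::fabs(dx) >= best)
                    break;                      // out of range: stop this direction
                double d = std::hypot(dx, bot[j].y - p.y);
                if (d < best)
                    best = d;                   // tighter bound for later pruning
            }
        }
        return best;
    }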

Since you run many queries against the same set of points on the bottom curve, you can improve on this by ignoring the fact that the points form a curve at all. Put them all into a k-d tree (or similar structure) and query that instead. This is called nearest-neighbour searching.

+3

It may help to recognise this as the nearest-neighbour search problem. That link contains a good discussion of the various algorithms used for it. If you are fine with using C++ rather than straight C, ANN looks like a good library for this.
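If ANN is acceptable, the usual usage pattern (following ANN's sample program; the exact calls should be checked against the library's documentation) would look roughly like this:

    #include <ANN/ANN.h>   // the ANN nearest-neighbour library
    #include <cmath>
    #include <vector>

    struct Point { double x, y; };

    // Build a k-d tree over the lower-curve points once, then run one exact
    // 1-nearest-neighbour query per upper-curve point.
    std::vector<double> minDistsANN(const std::vector<Point>& bot,
                                    const std::vector<Point>& top)
    {
        const int dim = 2;
        const int n = static_cast<int>(bot.size());
        ANNpointArray dataPts = annAllocPts(n, dim);
        for (int i = 0; i < n; ++i) {
            dataPts[i][0] = bot[i].x;
            dataPts[i][1] = bot[i].y;
        }
        ANNkd_tree tree(dataPts, n, dim);

        ANNpoint q = annAllocPt(dim);
        ANNidx idx;
        ANNdist sqDist;                      // ANN reports squared distances
        std::vector<double> result;
        result.reserve(top.size());
        for (std::size_t i = 0; i < top.size(); ++i) {
            q[0] = top[i].x;
            q[1] = top[i].y;
            tree.annkSearch(q, 1, &idx, &sqDist, 0.0);
            result.push_back(std::sqrt(sqDist));
        }
        annDeallocPt(q);
        annDeallocPts(dataPts);
        annClose();                          // release ANN's internal state
        return result;
    }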

It also seems this question has been asked before.

+3
