Denote the upper curve y = t(x) and the lower curve y = b(x), and let x_b = c(x_t) be the closest-point function, mapping each point on the top curve to the nearest point on the bottom curve. We know that this function is weakly monotonic non-decreasing (if x_t1 < x_t2 then c(x_t1) <= c(x_t2)), because two shortest connecting segments never cross each other.
If you know that the distance function d(x_t, x_b) has only one local minimum for each fixed x_t (this happens if the curve is "smooth enough"), you can save time by walking the curve:
- start with x_t = 0, x_b = 0
- while x_t <= x_max:
  - find the closest x_b by local search (increment x_b while the distance keeps decreasing)
  - add {x_t, x_b} to the result set
  - increment x_t
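A minimal Python sketch of this walk, assuming both curves are given as lists of (x, y) sample points ordered by x; the names `top`, `bottom`, `walk_closest` and the Euclidean `dist` helper are illustrative choices, not part of the original answer:

```python
import math

def walk_closest(top, bottom):
    """One forward pass over two sampled curves (lists of (x, y) points).

    For each sample on the top curve, advance an index on the bottom curve
    while the distance keeps decreasing. This relies on the assumptions
    above: d(x_t, x_b) has a single local minimum in x_b for each fixed
    x_t, and the closest-point pairing never moves backwards.
    """
    def dist(i, j):
        (x1, y1), (x2, y2) = top[i], bottom[j]
        return math.hypot(x1 - x2, y1 - y2)

    result = []
    j = 0
    for i in range(len(top)):
        # Local search: step right while the next bottom sample is closer.
        while j + 1 < len(bottom) and dist(i, j + 1) < dist(i, j):
            j += 1
        result.append((i, j))  # the pair {x_t, x_b}, as sample indices
    return result
```

Because `j` never moves backwards, the whole pass is linear in the total number of samples on both curves.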
If you expect x_b to be smooth enough, but you cannot assume it and you want an exact result:
Walk the curve in both directions. Where the two passes agree, the result is correct. Where they disagree, do a full search between the two candidates (the leftmost and the rightmost local minima). Sample each such "ambiguous block" in binary-subdivision order, which allows large parts of it to be pruned thanks to monotonicity.
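A hedged sketch of that scheme, reusing `walk_closest` and the point-list representation from the sketch above; for brevity the ambiguous ranges are searched exhaustively here rather than in the binary-subdivision order suggested above, which would allow extra pruning:

```python
import math

def _dist(top, bottom, i, j):
    (x1, y1), (x2, y2) = top[i], bottom[j]
    return math.hypot(x1 - x2, y1 - y2)

def walk_backward(top, bottom):
    """Mirror image of the forward walk: scan the top curve right to left
    and only ever move the bottom index to the left."""
    result = [0] * len(top)
    j = len(bottom) - 1
    for i in range(len(top) - 1, -1, -1):
        while j > 0 and _dist(top, bottom, i, j - 1) < _dist(top, bottom, i, j):
            j -= 1
        result[i] = j
    return result

def walk_exact(top, bottom):
    """Exact pairing: where the two passes agree the answer is settled;
    where they disagree, the true closest index is bracketed by the two
    candidates, so search that range directly."""
    fwd = [j for _, j in walk_closest(top, bottom)]  # forward sketch above
    bwd = walk_backward(top, bottom)
    result = []
    for i in range(len(top)):
        lo, hi = fwd[i], bwd[i]
        if lo == hi:
            result.append((i, lo))
        else:
            best = min(range(lo, hi + 1),
                       key=lambda j: _dist(top, bottom, i, j))
            result.append((i, best))
    return result
```

The forward walk can never overshoot a global minimum and the backward walk can never undershoot one, so the true answer always lies in `[fwd[i], bwd[i]]`.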
As a middle ground:
Walk the curve in both directions. If the results disagree, pick one of them (the closer one). If you can guarantee at most two local minima for each fixed x_t, this gives an optimal solution. There are some more pathological cases where the optimal solution is missed: it uses a local minimum that is surrounded by two other local minima that are both worse than it, so neither one-directional walk ever reaches it. I dare say it is unusual to find a case where this solution is far from optimal (assuming a smooth y = b(x)).
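A sketch of that compromise, again reusing the helpers from the sketches above (`walk_closest`, `walk_backward`, `_dist`); "pick one of them" is interpreted here as keeping whichever candidate is closer:

```python
def walk_heuristic(top, bottom):
    """Middle ground: run both one-directional walks and, where they
    disagree, keep whichever candidate is closer. This is optimal whenever
    each fixed x_t has at most two local minima of the distance; with
    three or more, the middle one can be missed by both walks."""
    fwd = [j for _, j in walk_closest(top, bottom)]
    bwd = walk_backward(top, bottom)
    return [(i, min(fwd[i], bwd[i], key=lambda j: _dist(top, bottom, i, j)))
            for i in range(len(top))]
```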
John Dvorak