Multidimensional spline interpolation in python / scipy?

Is there a library module or another easy way to implement multidimensional spline interpolation in python?

In particular, I have a set of scalar data on a regularly-spaced three-dimensional grid, which I need to interpolate at a small number of points scattered throughout the domain. For two dimensions, I have been using scipy.interpolate.RectBivariateSpline, and I'm essentially looking for an extension of that to three-dimensional data.
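For reference, the 2D usage I have in mind looks roughly like this (a minimal sketch with made-up data; the grid and query arrays are purely illustrative):

    import numpy as np
    from scipy.interpolate import RectBivariateSpline

    x = np.linspace(0, 1, 50)            # grid coordinates along each axis
    y = np.linspace(0, 1, 40)
    z = np.random.rand(50, 40)           # scalar data on the regular 2D grid

    spline = RectBivariateSpline(x, y, z)    # cubic in both directions by default
    xi = np.array([0.1, 0.7])                # scattered query points
    yi = np.array([0.3, 0.9])
    zi = spline.ev(xi, yi)                   # interpolated values at (xi, yi)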

The n-dimensional interpolation routines I have found are not quite good enough: I would prefer splines over LinearNDInterpolator for smoothness, and I have far too many data points (often over one million) for, e.g., a radial basis function to work.

If anyone knows of a python library that can do this, or perhaps one in another language that I could call or port, I'd really appreciate it.

python numpy scipy interpolation
asked Jun 04 '11 at 17:21
2 answers

If I understand your question correctly, your "observation" data is regularly gridded?

If so, scipy.ndimage.map_coordinates does exactly what you want.

It's a little difficult to wrap your head around at first, but essentially you just pass it a sequence of coordinates at which you want the gridded values interpolated, in pixel/voxel/n-dimensional-index coordinates.

As a 2D example:

    import numpy as np
    from scipy import ndimage
    import matplotlib.pyplot as plt

    # Note that the output interpolated coords will be the same dtype as your
    # input data. If we have an array of ints, and we want floating point
    # precision in the output interpolated points, we need to cast the array
    # as floats
    data = np.arange(40).reshape((8, 5)).astype(float)

    # I'm writing these as row, column pairs for clarity...
    coords = np.array([[1.2, 3.5], [6.7, 2.5], [7.9, 3.5], [3.5, 3.5]])
    # However, map_coordinates expects the transpose of this
    coords = coords.T

    # The "mode" kwarg here just controls how the boundaries are treated.
    # mode='nearest' is _not_ nearest neighbor interpolation; it just uses the
    # value of the nearest cell if the point lies outside the grid. The default
    # is to treat values outside the grid as zero, which can cause some edge
    # effects if you're interpolating points near the edge.
    # The "order" kwarg controls the order of the splines used. The default is
    # cubic splines, order=3
    zi = ndimage.map_coordinates(data, coords, order=3, mode='nearest')

    row, column = coords
    nrows, ncols = data.shape
    im = plt.imshow(data, interpolation='nearest', extent=[0, ncols, nrows, 0])
    plt.colorbar(im)
    plt.scatter(column, row, c=zi, vmin=data.min(), vmax=data.max())
    for r, c, z in zip(row, column, zi):
        plt.annotate('%0.3f' % z, (c, r), xytext=(-10, 10),
                     textcoords='offset points',
                     arrowprops=dict(arrowstyle='->'), ha='right')
    plt.show()

(Figure: the 2D data plotted with imshow, with the four query points scattered on top, colored and annotated with their interpolated values.)

To do this in n dimensions, we just need to pass arrays of the appropriate size:

    import numpy as np
    from scipy import ndimage

    data = np.arange(3 * 5 * 9).reshape((3, 5, 9)).astype(float)
    coords = np.array([[1.2, 3.5, 7.8], [0.5, 0.5, 6.8]])
    zi = ndimage.map_coordinates(data, coords.T)

As for scaling and memory usage, map_coordinates will create a filtered copy of the array if you're using an order > 1 (i.e. not linear interpolation). If you just want to interpolate at a very small number of points, this is a rather large overhead. It doesn't grow with the number of points you want to interpolate at, however. As long as you have enough RAM for a single temporary copy of your input data array, you'll be fine.

If you can't keep a copy of your data in memory, you can either a) specify prefilter=False and order=1, which uses linear interpolation, or b) replace your original data with a filtered version using ndimage.spline_filter, and then call map_coordinates with prefilter=False.

Even if you do have enough RAM, keeping a filtered copy of the dataset around can be a big speedup if you need to call map_coordinates multiple times (e.g. interactive use, etc.); see the sketch below.
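A minimal sketch of option (b), reusing the 3D arrays from the example above: filter the data once with ndimage.spline_filter, then pass prefilter=False on every subsequent call.

    import numpy as np
    from scipy import ndimage

    data = np.arange(3 * 5 * 9).reshape((3, 5, 9)).astype(float)
    coords = np.array([[1.2, 3.5, 7.8], [0.5, 0.5, 6.8]])

    # Filter once up front (this is the memory-hungry step)...
    filtered = ndimage.spline_filter(data, order=3)

    # ...then every later call can skip the filtering entirely.
    zi = ndimage.map_coordinates(filtered, coords.T, order=3, prefilter=False)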

answered Jun 04 '11

Smooth spline interpolation in dim > 2 is difficult to implement, and so there are not many freely available libraries able to do that (in fact, I don't know of any).

You can try inverse distance weighted interpolation; see Inverse Distance Weighted (IDW) Interpolation with Python. This should produce reasonably smooth results, and it scales better than RBF to large datasets; a rough sketch follows.
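A rough sketch of the idea (not the exact code from the linked answer; the helper name idw and the choices k=8, p=2 are illustrative tuning knobs): query a k-d tree for the k nearest observations and weight them by inverse distance.

    import numpy as np
    from scipy.spatial import cKDTree

    def idw(points, values, queries, k=8, p=2, eps=1e-12):
        """Inverse-distance-weighted interpolation of scattered data."""
        tree = cKDTree(points)                   # index the observation sites
        dist, idx = tree.query(queries, k=k)     # k nearest neighbors per query
        weights = 1.0 / (dist + eps) ** p        # eps guards against dist == 0
        weights /= weights.sum(axis=1, keepdims=True)
        return (weights * values[idx]).sum(axis=1)

    # e.g. zi = idw(obs_xyz, obs_values, query_xyz)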

answered Jun 04 '11 at 18:13


