I have a grid of ocean depth data by location, and I'm trying to interpolate depth values for a set of GPS points.
We have been using RSAGA::pick.from.points, which works well for small datasets:
    require(RSAGA)

    depthdata <- cbind.data.frame(
      x = c(-74.136, -74.135, -74.134, -74.133, -74.132,
            -74.131, -74.130, -74.129, -74.128, -74.127),
      y = rep(40, times = 10),
      depth = c(-0.6, -0.6, -0.9, -0.9, -0.9, -0.9, -0.9, -0.9, -0.6, -0.6)
    )

    mylocs <- rbind(c(-74.1325, 40), c(-74.1305, 40))
    colnames(mylocs) <- c("x", "y")

    results <- pick.from.points(data = mylocs, src = depthdata,
                                pick = c("depth"), method = "nearest.neighbour")
    mydepths <- results$depth
But our depth dataset contains 69 million points, and we have 5 million GPS points for which we need to estimate depths; pick.from.points simply drags on (> 2 weeks) at that scale. We believe this task would run faster in MATLAB or ArcMap, but we are trying to keep it inside a longer R workflow that we are writing for other people to run repeatedly, so switching to proprietary software for one part of the workflow is less desirable.
We would be willing to sacrifice some degree of accuracy for speed.
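One direction we have looked at (a sketch only, and an assumption on our part rather than a tested part of the workflow) is a kd-tree nearest-neighbour lookup via the RANN package, whose nn2() function scales roughly as O(M log N) per query batch instead of a brute-force scan. Shown here on the same toy data as above; the grid coordinates and depths are the small example, not our real 69-million-point dataset:

```r
# Sketch: kd-tree nearest neighbour with RANN::nn2 (an assumption --
# RANN is not part of our current workflow).
library(RANN)

# Toy stand-ins for the real depth grid and GPS points
depthdata <- data.frame(
  x = c(-74.136, -74.135, -74.134, -74.133, -74.132,
        -74.131, -74.130, -74.129, -74.128, -74.127),
  y = rep(40, times = 10),
  depth = c(-0.6, -0.6, -0.9, -0.9, -0.9, -0.9, -0.9, -0.9, -0.6, -0.6)
)
mylocs <- rbind(c(-74.1325, 40), c(-74.1305, 40))

# nn2() builds a kd-tree over the source coordinates and returns, for each
# query point, the index (nn.idx) and distance (nn.dists) of its k nearest
# neighbours; k = 1 gives plain nearest-neighbour assignment.
nn <- nn2(depthdata[, c("x", "y")], mylocs, k = 1)
mydepths <- depthdata$depth[nn$nn.idx[, 1]]
# both toy query points land on the -0.9 part of the grid
```

Note that this treats degrees of latitude and longitude as Euclidean coordinates, which is tolerable over a small extent but is one of the accuracy-for-speed trade-offs; and if the depth grid is truly regular, computing row/column indices directly from the coordinates would be faster still.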
I have searched for a solution as best I can, but I'm fairly new to gridded data and interpolation, so I may be using the wrong vocabulary and overlooking a simple answer.