You can use gdalUtils::gdalwarp for this. For me it was slower than @JosephWood's fasterAgg.Fun on a raster with 1,000,000 cells, but it was much faster on Joseph's larger example. It requires that the raster exist on disk, so factor in the write time if your raster is currently in memory.
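The original gdalwarp call is not shown in this excerpt, so the following is a sketch of what it might look like; the temp-file names are placeholders, and it assumes GDAL is installed and on your path.

```r
library(raster)
library(gdalUtils)  # wraps the GDAL command-line utilities

# Small example raster; gdalwarp operates on files, so write it to disk first.
a <- raster(matrix(rpois(10^6, 2), 1000))
writeRaster(a, f <- tempfile(fileext = ".tif"), datatype = "INT1U")

# Aggregate by a factor of 10 using the modal value of each block:
# tr sets the target resolution, r = "mode" the resampling method.
a3 <- gdalwarp(f, f2 <- tempfile(fileext = ".tif"),
               tr = res(a) * 10, r = "mode", output_Raster = TRUE)
```

With output_Raster = TRUE the result is returned as a Raster* object rather than just written to f2.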
Below, I used a modification of fasterAgg.Fun that returns the most frequent value itself, rather than its index within the block.
library(raster)
x <- matrix(rpois(10^8, 2), 10000)
a <- raster(x)

fasterAgg.Fun <- function(x, ...) {
  # Return the most frequent value of a sorted vector via run lengths.
  myRle.Alt <- function(x1) {
    n1 <- length(x1)
    y1 <- x1[-1L] != x1[-n1]
    i <- c(which(y1), n1)
    x1[i][which.max(diff(c(0L, i)))]
  }
  if (sum(x) == 0) {
    return(NA)
  } else {
    myRle.Alt(sort(x, method = "quick"))
  }
}

system.time(a2 <- aggregate(a, fact = 10, fun = fasterAgg.Fun))
Note that there is a slight difference in the definition of the mode when there are ties: gdalwarp selects the highest value, while the functions passed to aggregate above (through the behaviour of which.max) select the lowest (for example, see which.max(table(c(1, 1, 2, 2, 3, 4)))).
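The tie-breaking difference can be illustrated in base R (a minimal demonstration, not part of the original answer):

```r
v <- c(1, 1, 2, 2, 3, 4)  # values 1 and 2 are tied for most frequent

# which.max returns the *first* maximum, and table() orders by value,
# so ties resolve to the lowest value -- as in aggregate() above:
lowest_mode <- as.numeric(names(which.max(table(v))))

# gdalwarp's mode resampling resolves ties to the highest value instead;
# reversing the table before which.max mimics that behaviour:
highest_mode <- as.numeric(names(which.max(rev(table(v)))))

lowest_mode   # 1
highest_mode  # 2
```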
It is also important to store raster data as integers where applicable. If the data are stored as floats (writeRaster's default), the gdalwarp operation above takes about 14 seconds on my system, for example. See ?dataType for the available types.
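As a sketch of what that looks like (using raster's native .grd format here so no GDAL driver is needed; small Poisson counts fit in an unsigned 1-byte integer):

```r
library(raster)

a <- raster(matrix(rpois(10^4, 2), 100))

# Without datatype=, writeRaster stores 4-byte floats (FLT4S) by default;
# INT1U stores each cell as an unsigned 1-byte integer (0-255) instead.
f <- tempfile(fileext = ".grd")
writeRaster(a, f, datatype = "INT1U")

dataType(raster(f))  # "INT1U"
```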