You can use as.vector(). It seems to be the fastest method according to my small benchmark, as follows:
library(microbenchmark)
x <- matrix(runif(1e4), 100, 100)
The first solution uses as.vector(), the second uses the fact that a matrix is stored as a contiguous array in memory and that length(m) gives the number of elements in a matrix m. The third builds an array from x, and the fourth uses the concatenation function c(). I also tried unmatrix from gdata, but it was too slow to be worth mentioning here.
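For reference, here are the four conversions spelled out as a small sketch (the identical() checks are only there to confirm that the approaches produce the same numbers; they are not part of the benchmark):

x <- matrix(runif(1e4), 100, 100)

y1 <- as.vector(x)      # drops the dim attribute
y2 <- x[1:length(x)]    # indexes the underlying contiguous storage
y3 <- array(x)          # rebuilds x as a one-dimensional array
y4 <- c(x)              # concatenation also strips dimensions

identical(y1, y2)              # TRUE
identical(y1, as.vector(y3))   # TRUE; y3 itself still carries a dim attribute
identical(y1, y4)              # TRUE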
Here are the numerical results:
> microbenchmark(y <- as.vector(x), y <- x[1:length(x)], y <- array(x), y <- c(x), times = 1e4)
Unit: microseconds
                expr    min      lq     mean  median      uq       max neval
   y <- as.vector(x)  8.251 13.1640 29.02656 14.4865 15.7900 69933.707 10000
 y <- x[1:length(x)] 59.709 70.8865 97.45981 73.5775 77.0910 75042.933 10000
       y <- array(x)  9.940 15.8895 26.24500 17.2330 18.4705  2106.090 10000
           y <- c(x) 22.406 33.8815 47.74805 40.7300 45.5955  1622.115 10000
Flattening a matrix is a common operation in machine learning, where the matrix can represent the parameters to be learned, while the optimization algorithm comes from a general-purpose library that expects a parameter vector. Therefore, it is customary to convert a matrix (or matrices) into such a vector. This is the case with the standard optim() R function.
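As an illustrative sketch (not from the original answer), here is how a parameter matrix might be flattened for optim() and reshaped back inside the objective function; the objective itself, a sum of squared deviations from a made-up target matrix, is purely hypothetical:

# Hypothetical example: fit a 3x3 parameter matrix W to a target matrix.
target <- matrix(1:9, 3, 3)

# optim() works on a plain numeric vector, so the objective
# reshapes that vector back into a matrix before using it.
objective <- function(par) {
  W <- matrix(par, nrow = 3, ncol = 3)   # vector -> matrix
  sum((W - target)^2)                    # sum of squared errors
}

init <- matrix(0, 3, 3)
fit <- optim(as.vector(init), objective, method = "BFGS")  # matrix -> vector
W_hat <- matrix(fit$par, 3, 3)           # recover the fitted matrix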
David Bellot Sep 08 '17 at 16:18