Why does R multicore use only one core?

Possible duplicate:
Parallel processing in R is limited

I wrote code using R's multicore package and I am running it on a 24-core machine. Actually, there are only 12 physical cores, but they are hyperthreaded, so the machine appears to have 24.
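
(For reference, this is what R itself reports for the logical CPU count; just a sanity check, assuming the parallel package that ships with recent R:)

    library(parallel)
    detectCores()  # reports 24 here: 12 physical cores x 2 hyperthreads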

Here's the strange thing: all the worker processes are running on the same core! So each of them gets only a small slice of that one core, instead of each running on its own core and keeping all the available cores busy.

For simplicity, I run just 4 workers:

    library(multicore)
    mclapply(1:30, function(size) {
      # time-consuming stuff that is CPU-bound (think "forecast.ets" et al.)
    }, mc.cores = 4, mc.preschedule = FALSE)

Before I start this, another R process is already running on one core, using 100% of that core:

[screenshot: CPU monitor showing a single core at 100%]

Then I start the multicore job, and the 4 additional worker processes fight over that very same core:

[screenshot: CPU monitor showing all worker processes crowded onto one core]

... so each of them gets about 12% of one core, roughly 1% of the available computing power, when each could have 100% of a core to itself. On top of that, the other R process now gets only about 50% of its core.

OS: Ubuntu 12.04 64-bit. Hardware: Intel. R: version 2.15.2 "Trick or Treat".

Thoughts? (I know I could just use snowfall, but I have a lot of variables and I really don't want to sfExport everything!)
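
For context, the snowfall route I would rather avoid looks roughly like this (a minimal sketch; the object names here are placeholders, not my real variables):

    library(snowfall)
    sfInit(parallel = TRUE, cpus = 4)
    sfExport("big_input_table", "fitted_models")  # every object the workers touch must be shipped over explicitly
    results <- sfLapply(1:30, function(size) {
      # same CPU-bound work as above
    })
    sfStop()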

Edit: oh, is there some kind of global lock somewhere? But even so, why would two completely separate R processes conflict? I can normally run two R processes side by side, each taking 100% of its own core.

Edit 2: thanks to Dirk's pointer, I removed OpenBLAS, and now things look much healthier:

[screenshot: CPU monitor showing the workers spread across separate cores]

1 answer

The likely culprit is a side effect of the OpenBLAS library, which sets CPU affinity so that the processes are all pinned to a single core. See Parallel processing in R is limited for a discussion and for links to the fix discussed on the r-sig-hpc mailing list.
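
If removing or rebuilding OpenBLAS is not an option, a workaround sometimes suggested for this kind of affinity problem is to reset the CPU affinity inside each forked worker before it does any real work. This is only a sketch, assuming a Linux box with the taskset utility available; it is not part of the original answer:

    library(multicore)
    mclapply(1:30, function(size) {
      # undo the single-core affinity this forked worker inherited from the parent
      system(paste0("taskset -p 0xffffffff ", Sys.getpid()),
             ignore.stdout = TRUE, ignore.stderr = TRUE)
      # ... CPU-bound work ...
    }, mc.cores = 4, mc.preschedule = FALSE)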
