Possible duplicate:
Parallel processing in R is limited
I wrote some code using R's multicore package and am running it on a 24-core machine. (Strictly speaking there are only 12 physical cores, but they are hyperthreaded, so the machine presents 24 logical cores.)
Here's the strange thing: all of the parallel processes end up running on the same core! So each of them gets only a small slice of one core's CPU time, instead of each running on its own core and making use of all the available cores.
For simplicity, I run just 4 worker processes:

library(multicore)   # library(parallel) also provides mclapply()
mclapply(1:30, function(size) { sum(rnorm(1e6)) }, mc.cores = 4)   # worker body is a placeholder; the original was truncated here
Before I start this job, there is already another R process running on one core and using 100% of that core:

Then I start the multicore job, and the 4 additional worker processes all fight over that same core:

... so each of them gets about 12% of one core, i.e. roughly 1% of the available computing power, when each of them could be getting 100% of a core. On top of that, the pre-existing R process now only gets about 50% of its core.
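For reference, here is a small diagnostic sketch (not from the original post; it assumes Linux and that your parallel package provides mcaffinity(), which is Linux-only). It prints the CPU affinity mask of the current R process; when the problem occurs it lists only a single CPU instead of all 24:

library(parallel)

# Which logical CPUs is this R process allowed to run on?
# Healthy: all 24 logical CPUs. Pinned (the problem): just one.
print(mcaffinity())

# The same information from the OS side, via taskset (util-linux):
system(sprintf("taskset -pc %d", Sys.getpid()))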
OS: Ubuntu 12.04, 64-bit. Hardware: Intel. R: version 2.15.2 ("Trick or Treat").
Thoughts? (I know I could just use snowfall instead, but I have a huge number of variables and I really don't want to have to sfExport them all!)
Edit: ah, I assume there is some kind of global lock somewhere? But still, why would there be contention between two completely separate R processes? I can happily run two R processes side by side, each getting 100% of a core.
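One workaround sketch (an assumption on my part, not something verified in this setup): forked children inherit the parent's CPU affinity mask, so if the parent has been pinned to one CPU (e.g. by a threaded BLAS), each worker can widen its own mask again before doing any real work. This uses parallel::mcaffinity() and assumes 24 logical CPUs:

library(parallel)

# Re-widen the inherited affinity mask inside each forked worker.
res <- mclapply(1:30, function(size) {
  mcaffinity(1:24)      # allow this worker on any of the 24 logical CPUs
  sum(rnorm(1e6))       # placeholder for the real work
}, mc.cores = 4)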
Edit 2: Thanks to Dirk's pointer, I reinstalled OpenBLAS, and things now look much healthier:

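To double-check, here is a quick sketch (again assuming the parallel package on Linux, with mcaffinity() available) that has each worker report its PID and affinity mask; after the fix, every mask should cover all logical CPUs rather than a single one:

library(parallel)

# Each worker reports its own PID and CPU affinity mask.
info <- mclapply(1:4, function(i) {
  list(pid = Sys.getpid(), cpus = mcaffinity())
}, mc.cores = 4)
str(info)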