Distributed scheduling system for R-scripts

I would like to schedule and distribute the execution of R scripts (for example, via Rserve) across several machines running Windows or Ubuntu, where each task runs on exactly one machine.

I do not want to reinvent the wheel and would prefer an existing system that distributes these tasks efficiently, ideally with a graphical interface for monitoring script execution.

1/ Is there an R package or library that can be used for this?

2/ One approach that seems widely used is MapReduce with Apache Hadoop. I have no experience with this framework. What installation / plugins / settings would you recommend for my purpose?

Edit: here is more detailed information about my setup:
I have an office full of machines (small servers or workstations) that are sometimes used for other purposes as well. I want to use the processing power of all these machines by distributing my R scripts to them.
I also need a scheduler, i.e. a tool to run scripts either on demand or on a regular basis. I use both Windows and Ubuntu, but a good solution on either system would be enough. Finally, I do not need the scripts to return results to a server: they do things such as accessing a database and saving files, but return nothing. I would only like to get back errors / warnings, if any.
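Since the scripts only need to report errors and warnings, a small wrapper around each job can collect them no matter which scheduler ends up dispatching the work. A minimal sketch in base R (the function name `run_job` and the example job are hypothetical):

```r
# Run an expression (e.g. a sourced script) and collect warnings and errors
# instead of letting them kill the worker process.
run_job <- function(expr) {
  warnings <- character(0)
  result <- withCallingHandlers(
    tryCatch(expr, error = function(e) {
      list(status = "error", message = conditionMessage(e))
    }),
    warning = function(w) {
      warnings <<- c(warnings, conditionMessage(w))
      invokeRestart("muffleWarning")  # record the warning, then keep going
    }
  )
  if (is.list(result) && identical(result$status, "error")) {
    c(result, list(warnings = warnings))
  } else {
    list(status = "ok", warnings = warnings)
  }
}

# Example: a job that warns and then fails
out <- run_job({
  warning("disk almost full")
  stop("database unreachable")
})
# out$status   -> "error"
# out$message  -> "database unreachable"
# out$warnings -> "disk almost full"
```

The returned list can then be logged or shipped back to wherever you collect job status.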

Have a look at the doRedis package together with foreach. Its vignette (PDF) explains the setup in detail. In short:

Why doRedis? foreach already has several parallel backends, such as doMC, doSNOW and doMPI. doRedis uses a Redis server as a work queue: workers on any machine can connect to the queue, pull tasks, and join or leave at any time, which makes the setup fault-tolerant and elastic. Each worker only needs R, the doRedis package and network access to the Redis server, and workers on Windows and Linux can serve the same queue.
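A minimal doRedis sketch, assuming a Redis server is already running somewhere reachable (the host name "redis-host" and the queue name "jobs" below are placeholders):

```r
library(doRedis)
library(foreach)

# On the master: register a Redis-backed work queue as the foreach backend
registerDoRedis("jobs", host = "redis-host")

# Tasks are sent to whichever workers are listening on the "jobs" queue;
# .errorhandling = "pass" returns errors as results instead of aborting
results <- foreach(i = 1:10, .errorhandling = "pass") %dopar% {
  sqrt(i)
}

removeQueue("jobs")

# On each worker machine (Windows or Linux), start one or more workers:
# startLocalWorkers(n = 2, queue = "jobs", host = "redis-host")
```

Because workers can be started and stopped at any time, the occasionally-busy office machines you describe can simply join the queue when they are idle.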

Hadoop, on the other hand, is a much heavier system: unless your problem and algorithm naturally fit the map/reduce model, installing and maintaining Hadoop just to distribute R scripts is overkill.

A few clarifying questions: do you already have a cluster of machines available, or are you planning to build one? How long do the R scripts run? Would running them on EC2 "in the cloud" be an option?

If all you need is to launch R scripts on each node, parallel-launch tools such as TakTuk or dsh (distributed shell) can start the processes across many machines at once.
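If you go that route, even a plain loop over ssh from R can fan a script out to the machines. A sketch, where the host names and script path are hypothetical and key-based ssh login is assumed:

```r
# Launch the same script on every machine in the office, non-blocking.
hosts  <- c("ws01", "ws02", "ws03")     # hypothetical host names
script <- "/srv/jobs/nightly_report.R"  # hypothetical script path

for (h in hosts) {
  cmd <- sprintf("ssh %s Rscript %s", h, shQuote(script))
  system(cmd, wait = FALSE)  # fire and forget; output stays on each host
}
```

This gives you distribution but no queueing or load balancing, which is where the doRedis approach above is stronger.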
