How can I execute more than one Perl script in parallel on a Linux machine?

I want to run all the scripts from a directory. For instance:

The directory contains 40 scripts. I would like to start the first 5 in parallel; after those 5 complete, the next 5 should be started, and so on for the rest.

Please suggest any solution using Linux or Perl commands.

+4
6 answers

When you are in the script directory, run this bash script/command:

    for var in *.pl
    do
        perl "$var" &                     # start the script in the background
        # keep at most 5 simultaneous jobs running
        while test "$(jobs | wc -l)" -gt 4
        do
            sleep 1
        done
    done
    wait                                  # wait for the last batch to finish

This relies on you having no other background jobs; check by typing "jobs" on the command line.

+1

Everyone loves to invent parallel execution tools.

+10

Can you use nohup for the first 5 scripts? Have the 5th script write to some file when it completes, then continue on with the next batch.
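A minimal sketch of that idea, with demo scripts created on the spot (the names s1.pl .. s5.pl are hypothetical, not from the question):

    # (demo setup) create five tiny Perl scripts, s1.pl .. s5.pl
    for i in 1 2 3 4 5; do
        echo 'print "ok from '"$i"'\n";' > "s$i.pl"
    done

    # run the first 5 scripts in the background, immune to hangups
    for s in s1.pl s2.pl s3.pl s4.pl s5.pl; do
        nohup perl "$s" > "$s.log" 2>&1 &
    done
    wait                                # block until all five have finished
    echo "batch 1 done" >> progress.txt # record completion, then start the next 5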

+2

Do not forget GNU Make! Use a makefile like this and run it with the -j option. See http://www.gnu.org/software/make/manual/make.html#Parallel

    scripts=$(wildcard *.sh)

    all: $(patsubst %.sh,%.out,$(scripts))

    # note: the recipe line below must begin with a TAB character
    %.out: %.sh
    	sh $< > $@ 2>&1

If you are working in Perl, I would suggest Parallel::ForkManager.

Oh, and it seems that xargs on Linux has a -P option to run jobs in parallel. I have not used it, since my GNU Make trick predates it.
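For example, a sketch of the xargs approach, with throwaway demo scripts (the job*.pl names are made up for illustration):

    # (demo setup) create a few tiny Perl scripts in the current directory
    for i in 1 2 3; do
        echo 'print "job '"$i"' done\n";' > "job$i.pl"
    done

    # run them at most 5 at a time: -n 1 passes one file name per perl
    # invocation, -P 5 keeps up to 5 invocations running concurrently
    printf '%s\n' job*.pl | xargs -n 1 -P 5 perl > results.txt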

+1
You can write something like the script below (the original sketch was pseudocode):

    #!/bin/ksh
    # run the scripts from the given directory in batches of 5
    dir=$1
    count=0
    for script in "$dir"/*
    do
        "$script" &                 # start the script in the background
        count=$((count+1))
        if [ "$count" -eq 5 ]; then
            wait                    # block until the whole batch has finished
            count=0
        fi
    done
    wait                            # wait for the final partial batch
0

I would prefer GNU parallel. It's easy and fun, and the documentation contains a lot of neat examples.

A big advantage: you do not need to write any code or Makefile to run your scripts. Example:

    # create the example scripts:
    mkdir scripts
    for i in {1..50}; do
        { echo '#!/bin/sh'; echo "echo $i"; echo "sleep 1"; } > scripts/s$i.sh
    done
    chmod +x scripts/*.sh

    # use parallel to run all these scripts, 5 at a time:
    parallel -j 5 ::: scripts/*.sh
0
