How to automatically run a bash script when my qsub jobs complete on the server?

I would like to run the script when all the tasks that I sent to the server are complete.

For example, I submit:

ssh server "for i in config*; do qsub ./run 1 $i; done"

This prints the list of job IDs that were submitted. I would like to automatically run another script on the server to process the output from these jobs as soon as they are all complete.

I would appreciate any advice that would help me avoid the following inelegant solution:

If I save the 1000 job identifiers printed by the call above to a file, I could periodically compare them against the current list of running jobs, i.e. against the output of:

ssh server qstat

I would have to repeat that check every half hour or so, but I suspect there is a better way.
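For concreteness, the scheme I am trying to avoid would look roughly like this (a sketch only; it assumes qstat <jobid> exits non-zero once a job has left the queue, and process_output.sh stands for the post-processing script):

#!/bin/bash
# Submit the jobs and capture the job IDs that qsub prints.
jobids=$(for i in config*; do qsub ./run 1 "$i"; done)

# Poll until none of the submitted jobs is still known to qstat.
while :; do
    still_running=0
    for id in $jobids; do
        qstat "$id" >/dev/null 2>&1 && still_running=1
    done
    [ "$still_running" -eq 0 ] && break
    sleep 1800    # check every half hour
done

./process_output.sh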


Two suggestions, assuming a TORQUE/PBS-style scheduler.

First (and although you did not ask about this), consider submitting the 1000 jobs as a single job array rather than as separate jobs (see http://docs.adaptivecomputing.com/torque/4-1-4/Content/topics/commands/qsub.htm#-t). The array appears as one entry in qstat, it takes only one qsub call to submit, and you can qdel or qhold the whole set at once (among other conveniences).

, , , : (cf. http://docs.adaptivecomputing.com/torque/4-1-4/Content/topics/commands/qsub.htm#dependencyExamples). :

qsub -W depend=afterokarray:427[] analyze.sh

Here analyze.sh is the script you want to run afterwards and 427 is the ID of the job array it waits for; the [] means the dependency is on the whole array, so the job starts only once every sub-job has completed successfully. Other schedulers (e.g. SGE/OGE) have different syntax for the same idea.

With this set up, the scheduler itself launches the analysis step as soon as the whole array has finished, so there is nothing to poll.
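A rough end-to-end sketch of this approach (TORQUE syntax; run_array.sh is a hypothetical wrapper around the run script from the question that picks its config file from the array index, and variable names such as PBS_ARRAYID can differ between scheduler versions):

# Submit every config as one job array instead of 1000 separate jobs.
njobs=$(ls config* | wc -l)
arrayid=$(qsub -t 1-"$njobs" ./run_array.sh)   # prints something like 427[].server

# Inside run_array.sh, the array index selects the config, e.g.:
#   config=$(ls config* | sed -n "${PBS_ARRAYID}p")
#   ./run 1 "$config"

# The analysis job is held until every sub-job has finished successfully.
qsub -W depend=afterokarray:"$arrayid" analyze.sh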


Alternatively, have each job's script create a marker file as its last step, e.g. touch $i.jobdone; a watcher script can then use ls *.jobdone | wc -l to check whether everything has finished.
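A minimal sketch of that idea, assuming the jobs write to a shared directory and using process_output.sh as a placeholder for the post-processing script:

# Last line of each job's run script: leave a marker when the job is done.
touch "$i.jobdone"

# Watcher script on the server: poll until all 1000 markers exist.
expected=1000
while [ "$(ls *.jobdone 2>/dev/null | wc -l)" -lt "$expected" ]; do
    sleep 60
done
./process_output.sh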


There is the bash builtin wait, which blocks until background child processes have finished. Whether it applies here depends on how the tasks are started (they must be child processes of your shell, not jobs running elsewhere), but it is worth mentioning.

Combined with $!, which holds the PID of the most recently started background process, wait lets a bash script pause until a specific set of tasks has finished.

For example:

#!/bin/bash

...
waitfor=''

for i in tasks; do          # "tasks" is a placeholder for your task list
    task &                  # start each task in the background
    waitfor="$waitfor $!"   # remember its PID
done

wait $waitfor               # block until all recorded PIDs have exited
...

If you run such a script in the background it stays out of your way, and whatever follows the wait line executes only once all of the tasks have finished.
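If your batch system offers a blocking submit, the same pattern can be made to work for qsub jobs. The sketch below assumes TORQUE's qsub -W block=true (SGE/OGE has qsub -sync y instead) and uses process_output.sh as a placeholder for the post-processing script:

#!/bin/bash
waitfor=''
for i in config*; do
    # -W block=true makes qsub itself wait until the job has finished,
    # so each backgrounded qsub becomes a local stand-in for its job.
    qsub -W block=true ./run 1 "$i" &
    waitfor="$waitfor $!"
done

wait $waitfor         # returns once every submitted job has completed
./process_output.sh

Note this keeps 1000 qsub processes alive on the submit host for the duration of the run, which may or may not be acceptable.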
