I am running a Perl script through the subprocess module in Python on Linux. The function that runs the script is called several times with variable input.
import subprocess

def script_runner(variable_input):
    out_file = open('out_' + variable_input, 'wt')
    error_file = open('error_' + variable_input, 'wt')
    process = subprocess.Popen(['perl', 'script', 'options'], shell=False,
                               stdout=out_file, stderr=error_file)
However, if I run this function, say, twice, the execution of the first process will stop when the second process begins. I can get the desired behavior by adding
process.wait()
after calling the script, so I'm not stuck. However, I want to understand why I cannot start the script process as many times as I want and let it run these calculations in parallel, without waiting for each run to complete.
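For reference, a minimal sketch of the pattern being described, with the question's function adjusted to return its Popen handle so the caller decides when to wait (the Perl command line is a placeholder taken from the question):

import subprocess

def script_runner(variable_input):
    # Each run writes to its own files, and Popen returns as soon as
    # the child is spawned, so nothing here blocks.
    out_file = open('out_' + variable_input, 'wt')
    error_file = open('error_' + variable_input, 'wt')
    return subprocess.Popen(['perl', 'script', 'options'], shell=False,
                            stdout=out_file, stderr=error_file)

# Start every run first so the children execute concurrently...
processes = [script_runner(v) for v in ('a', 'b', 'c')]

# ...and only synchronize once all of them are underway.
for p in processes:
    p.wait()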
UPDATE
The culprit was not that exciting: the Perl script used a shared file that was rewritten on each execution.
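A sketch of the kind of fix that resolves such a collision, assuming the Perl script can be told where to write via a command-line argument (the --workfile option is hypothetical; the real script would need an equivalent way to receive a per-run path):

import subprocess

def script_runner(variable_input):
    out_file = open('out_' + variable_input, 'wt')
    error_file = open('error_' + variable_input, 'wt')
    # Give each run its own scratch file instead of one shared path,
    # so concurrent executions no longer overwrite each other's data.
    # (--workfile is a hypothetical option for illustration only.)
    return subprocess.Popen(
        ['perl', 'script', '--workfile', 'work_' + variable_input],
        shell=False, stdout=out_file, stderr=error_file)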
However, the lesson I learned from this was that the garbage collector does not kill the process once it has started; that detail had no effect on my script once I had sorted out the shared file.
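A small sketch illustrating that lesson: dropping the only reference to a Popen object does not terminate the child (sleep stands in here for any long-running command):

import subprocess

# No variable keeps the Popen object alive, yet the child process keeps
# running: CPython may collect the wrapper (possibly with a
# ResourceWarning), but it never kills the spawned process.
subprocess.Popen(['sleep', '30'])

print('Python moved on; "sleep" is still running in the background.')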