I do not want to run a subprocess for each use, since I need it several hundred thousand times (this is a special type of data analyzer).
Bad policy. Ordinary Linux shells do this kind of pipeline processing all the time; forbidding process spawning is an artificial limitation.
However, you can do this trivially.
python prepare_data.py | perl my_magic_module.pl | python whatever_else.py
Wrap your magic module in a simple Perl script that reads from stdin, does the magic thing, and writes to stdout.
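A minimal sketch of such a wrapper, assuming the existing module is called MyMagicModule and exposes a do_magic() function (both names are hypothetical; substitute your real module and call):

```perl
#!/usr/bin/perl
# my_magic_module.pl -- read one record per line from stdin,
# run the magic on it, and write the result to stdout.
use strict;
use warnings;
use MyMagicModule;   # hypothetical: your existing Perl module

while ( my $line = <STDIN> ) {
    chomp $line;
    print MyMagicModule::do_magic($line), "\n";
}
```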
Divide your Python into two parts: the part that runs before the Perl call and the part that runs after it.
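For example, the two halves might look roughly like this (a sketch only; the file names match the pipeline above, and the record format is an assumption):

```python
# prepare_data.py -- the half that used to run before the Perl call:
# produce one record per line on stdout.
import sys

def records():
    # assumption: whatever currently builds the input records goes here
    for i in range(1000000):
        yield "record %d" % i

for rec in records():
    sys.stdout.write(rec + "\n")
```

```python
# whatever_else.py -- the half that used to run after the Perl call:
# consume the Perl output from stdin, one result per line.
import sys

for line in sys.stdin:
    result = line.rstrip("\n")
    # ...continue the analysis that previously followed the Perl call...
    print("analyzed:", result)
```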
Assemble a high-performance pipeline that (a) runs all three steps concurrently and (b) does not spawn a new process for every record.
This, BTW, will also use more than one core.
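If you would rather launch the pipeline from Python instead of the shell, a minimal sketch with subprocess (using the same hypothetical file names) spawns exactly three processes for the whole run:

```python
# run_pipeline.py -- wire up prepare_data.py | my_magic_module.pl | whatever_else.py
import subprocess

prepare = subprocess.Popen(
    ["python", "prepare_data.py"], stdout=subprocess.PIPE
)
magic = subprocess.Popen(
    ["perl", "my_magic_module.pl"], stdin=prepare.stdout, stdout=subprocess.PIPE
)
rest = subprocess.Popen(
    ["python", "whatever_else.py"], stdin=magic.stdout
)

# Close our copies of the pipe ends so upstream stages see SIGPIPE
# if a downstream stage exits early.
prepare.stdout.close()
magic.stdout.close()

rest.wait()
```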