Why is reading a subprocess's output so unreliable in Python?

(Windows)

I wrote Python code that calls the SoX program via the subprocess module. SoX writes its progress to STDERR if you tell it to, and I want to read that percentage status. If I run SoX outside of a Python script, it starts immediately and the percentage progresses smoothly up to 100%.

If I call it from a Python script, it takes a few seconds before any output appears, and then the output alternates between slow and fast. Even though I read character by character, sometimes a big block arrives all at once, which I don't understand, because at other times I can watch the characters trickle in one after another. (It generates about 15 KiB of data in my test, by the way.)

I tested the same thing with mkvmerge and mkvextract. They also print percentage progress, and reading their STDOUT is smooth.

This is so unreliable! How can I make reading the SoX stderr stream smoother and possibly avoid the delay at the start?


This is how I call it and read from it:

    import sys
    import subprocess

    process = subprocess.Popen('sox_call_dummy.bat',
                               stderr=subprocess.PIPE,
                               stdout=subprocess.PIPE)
    while True:
        # read one character at a time from the child's stderr
        char = process.stderr.read(1).encode('string-escape')
        sys.stdout.write(char)
1 answer

According to this closely related thread: Unbuffered reading from a process using a subprocess in Python

    import subprocess

    process = subprocess.Popen('sox_call_dummy.bat',
                               stderr=subprocess.PIPE,
                               bufsize=0)
    while True:
        line = process.stderr.readline()
        if not line:
            break
        print line

Since you are not reading stdout, I don't think you need a pipe for it.

If you want to try reading character by character, as in your original example, try adding a flush after every write:

    sys.stdout.write(char)
    sys.stdout.flush()
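
Putting the two suggestions together, here is a minimal sketch (keeping the sox_call_dummy.bat placeholder from the question, and Python 2 syntax to match the snippets above) of an unbuffered, character-by-character read with a flush after every write:

    import sys
    import subprocess

    # Sketch only: pipe just stderr, request an unbuffered pipe, and flush
    # our own stdout after each character so progress shows up immediately.
    process = subprocess.Popen('sox_call_dummy.bat',
                               stderr=subprocess.PIPE,
                               bufsize=0)
    while True:
        char = process.stderr.read(1)
        if not char:  # empty string means the child closed its stderr
            break
        sys.stdout.write(char)
        sys.stdout.flush()
    process.wait()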

Flushing stdout every time you write is the manual equivalent of disabling buffering for the Python process, i.e. running python.exe -u <script> or setting the environment variable PYTHONUNBUFFERED=1.
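
If you prefer to keep the flushing out of the loop, one alternative sketch (assuming Python 2, like the snippets above) is to replace sys.stdout with an unbuffered file object at the top of the script:

    import os
    import sys

    # Sketch only: reopen stdout with buffering disabled (third argument 0).
    # This mimics running the script with python.exe -u or PYTHONUNBUFFERED=1,
    # so later sys.stdout.write() calls appear immediately without flush().
    sys.stdout = os.fdopen(sys.stdout.fileno(), 'w', 0)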
