Python: how to get the final output of several system commands?

There are many posts on SO like this one: Save subprocess output (retrieving the output of subprocess.call()).

The problem comes with complex commands. For example, suppose I need to get the result of

ps -ef | grep something | wc -l

A plain subprocess call will not do the job, because its argument is a list of the form [program name, arguments], so there is no obvious way to run more complex command lines (multiple programs, pipes, etc.).
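To make the limitation concrete, here is a minimal sketch (reusing the example pipeline above; 'something' is just a placeholder pattern). The pipe characters are passed to ps as literal arguments instead of being interpreted by a shell, so the call fails rather than piping:

    import subprocess

    # Works: one program plus its arguments.
    subprocess.check_output(['ps', '-ef'])

    # Does not work as intended: '|', 'grep', 'something', 'wc', '-l' are all
    # handed to ps as literal arguments, so ps typically rejects them and
    # check_output raises CalledProcessError.
    subprocess.check_output(['ps', '-ef', '|', 'grep', 'something', '|', 'wc', '-l'])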

Is there a way to capture the output of a chain from multiple commands?

3 answers

Just pass the shell=True parameter to subprocess:

    import subprocess

    subprocess.check_output('ps -ef | grep something | wc -l', shell=True)
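A minimal usage sketch of this answer, assuming Python 3 (where check_output returns bytes that need decoding):

    import subprocess

    # shell=True hands the whole string to the shell, which handles the pipes.
    out = subprocess.check_output('ps -ef | grep something | wc -l', shell=True)

    # On Python 3, check_output returns bytes; decode and strip before converting.
    count = int(out.decode().strip())
    print(count)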

For a cleaner version that uses the subprocess module without delegating to a shell, you can follow this example from the documentation:

 output = `dmesg | grep hda` 

becomes

    from subprocess import Popen, PIPE

    p1 = Popen(["dmesg"], stdout=PIPE)
    p2 = Popen(["grep", "hda"], stdin=p1.stdout, stdout=PIPE)
    p1.stdout.close()  # Allow p1 to receive a SIGPIPE if p2 exits.
    output = p2.communicate()[0]

The Python program essentially does what the shell does: it feeds the output of each command into the next one in turn. The advantage of this approach is that the programmer has full control over the individual standard error streams of the commands (they can be suppressed, logged, etc., as needed).
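As a sketch of that control (assuming Python 3.3+ for subprocess.DEVNULL), the question's full pipeline can be built the same way, with each command's stderr handled individually:

    from subprocess import Popen, PIPE, DEVNULL

    p1 = Popen(['ps', '-ef'], stdout=PIPE)
    p2 = Popen(['grep', 'something'], stdin=p1.stdout, stdout=PIPE,
               stderr=DEVNULL)  # e.g. silence grep's error output
    p3 = Popen(['wc', '-l'], stdin=p2.stdout, stdout=PIPE)
    p1.stdout.close()  # allow p1 to receive SIGPIPE if p2 exits
    p2.stdout.close()  # allow p2 to receive SIGPIPE if p3 exits
    output = p3.communicate()[0]
    print(int(output.decode()))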

However, I prefer the subprocess.check_output('ps -ef | grep something | wc -l', shell=True) approach suggested by nneonneo, which delegates the piping to the shell: it is general, legible, and convenient.


Well, another alternative would be to simply implement part of the command in plain Python. For instance,

    import subprocess

    count = 0
    for line in subprocess.check_output(['ps', '-ef']).split('\n'):
        if something in line:  # or re.search(something, line) to use a regex
            count += 1
    print count
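The snippet above is Python 2; a rough Python 3 equivalent (with 'something' as a placeholder substring) would decode the bytes first:

    import subprocess

    something = 'something'  # placeholder pattern
    output = subprocess.check_output(['ps', '-ef']).decode()
    count = sum(1 for line in output.splitlines() if something in line)
    print(count)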
