@nawatts
Created May 27, 2016 03:32

I recently came across a situation where I wanted to capture the output of a subprocess started by a Python script while also letting it print to the terminal normally. An example of where this may be useful is something like curl, which (when the -o option is used) writes progress information to stderr. In an interactive program, you may want to show the user that progress information, but also capture it for parsing in your script.

By default, subprocess.run does not capture any output, but the subprocess does print to the terminal. Passing stdout=subprocess.PIPE, stderr=subprocess.STDOUT to subprocess.run captures the output but does not let the subprocess print, so you don't see anything until the subprocess has completed. Redirecting sys.stdout or sys.stderr doesn't work either, because that only replaces the Python script's own streams; it has no effect on the subprocess's.
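For reference, the capture-only approach that hides all output until the end looks something like this (a minimal sketch, not part of the gist; the curl command, URL, and filename are just placeholders):

import subprocess

# Captures combined stdout/stderr, but nothing appears on the terminal
# until the subprocess has exited, because run() blocks until completion.
result = subprocess.run(
    ["curl", "-o", "downloaded.bin", "https://example.com/file"],
    stdout=subprocess.PIPE,
    stderr=subprocess.STDOUT,
    universal_newlines=True,
)
print(result.stdout)  # only available after the subprocess finishes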

The only way to accomplish this seems to be to start the subprocess with the non-blocking subprocess.Popen, poll for available output, and both print it and accumulate it in a variable. The code shown here requires the selectors module, which is only available in Python 3.4+.

import io
import selectors
import subprocess
import sys

def capture_subprocess_output(subprocess_args):
    # Start subprocess
    # bufsize = 1 means output is line buffered
    # universal_newlines = True is required for line buffering
    process = subprocess.Popen(subprocess_args,
                               bufsize=1,
                               stdout=subprocess.PIPE,
                               stderr=subprocess.STDOUT,
                               universal_newlines=True)

    # Create callback function for process output
    buf = io.StringIO()
    def handle_output(stream, mask):
        # Because the process' output is line buffered, there's only ever one
        # line to read when this function is called
        line = stream.readline()
        buf.write(line)
        sys.stdout.write(line)

    # Register callback for an "available for read" event from subprocess' stdout stream
    selector = selectors.DefaultSelector()
    selector.register(process.stdout, selectors.EVENT_READ, handle_output)

    # Loop until subprocess is terminated
    while process.poll() is None:
        # Wait for events and handle them with their registered callbacks
        events = selector.select()
        for key, mask in events:
            callback = key.data
            callback(key.fileobj, mask)

    # Get process return code
    return_code = process.wait()
    selector.close()

    success = (return_code == 0)

    # Store buffered output
    output = buf.getvalue()
    buf.close()

    return (success, output)
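A hypothetical call, again with a placeholder command and URL, might look like this:

success, output = capture_subprocess_output(
    ["curl", "-o", "downloaded.bin", "https://example.com/file"]
)
if success:
    # output now holds everything curl wrote, which was also echoed live
    print("captured %d characters of output" % len(output))
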
@shirooo39

This worked, but how? What kind of magic is this?
Output is printed line by line, just like a normal console command.

@wojrut97

Seems that this solution does not work on Windows:
OSError: [WinError 10038] An operation was attempted on something that is not a socket

@jonhemphill

An alternative solution I came across that works for me on Windows: https://stackoverflow.com/a/4417735
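(For context: on Windows, selectors.DefaultSelector is backed by select.select, which only accepts sockets, hence the WinError 10038 above. The linked answer's general shape is a blocking readline loop instead of a selector; the following is a rough sketch of that idea under the same line-buffering assumptions as the gist, not the answer's exact code.)

import subprocess
import sys

def capture_subprocess_output_readline(subprocess_args):
    # Selector-free variant: iterating over the pipe blocks until each
    # full line is available, which also works on Windows.
    process = subprocess.Popen(subprocess_args,
                               bufsize=1,
                               stdout=subprocess.PIPE,
                               stderr=subprocess.STDOUT,
                               universal_newlines=True)
    captured = []
    for line in process.stdout:
        captured.append(line)
        sys.stdout.write(line)
    return_code = process.wait()
    return (return_code == 0, "".join(captured))

The tradeoff is that the loop can stall on a subprocess that doesn't end its output with newlines, since readline only returns on a complete line or EOF.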

@gamesbook

This does not show any output when using scp to upload a large file to a server.

@prrnnv

prrnnv commented Nov 4, 2022

Worked like magic. I had wasted 3 hours trying everything else; nothing worked but this one.

@ivanvoid

ivanvoid commented Feb 1, 2023

Doesn't work with shell=True if you run one script from another via the shell.

@npodbielski

Thank you. Works for me :)

@mrled

mrled commented Apr 22, 2023

This was super helpful. With inspiration from this gist, I ended up writing a function that does something similar while keeping stdout and stderr separate, shown here. I also had some problems running it in a script remotely over SSH; the post discusses how I worked around those issues, in case anyone else is trying to do that.
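(Not mrled's actual function, but a minimal sketch of the same idea: give stderr its own pipe, register both pipes with the selector, and keep a buffer per stream. Same Unix-only caveat as the original gist.)

import io
import selectors
import subprocess
import sys

def capture_subprocess_output_separately(subprocess_args):
    # Same selector loop as the gist, but stderr is not merged into stdout,
    # so the two streams can be echoed and captured independently.
    process = subprocess.Popen(subprocess_args,
                               bufsize=1,
                               stdout=subprocess.PIPE,
                               stderr=subprocess.PIPE,
                               universal_newlines=True)

    out_buf, err_buf = io.StringIO(), io.StringIO()

    def handle_stdout(stream, mask):
        line = stream.readline()
        out_buf.write(line)
        sys.stdout.write(line)

    def handle_stderr(stream, mask):
        line = stream.readline()
        err_buf.write(line)
        sys.stderr.write(line)

    selector = selectors.DefaultSelector()
    selector.register(process.stdout, selectors.EVENT_READ, handle_stdout)
    selector.register(process.stderr, selectors.EVENT_READ, handle_stderr)

    while process.poll() is None:
        for key, mask in selector.select():
            key.data(key.fileobj, mask)

    return_code = process.wait()
    selector.close()
    return (return_code == 0, out_buf.getvalue(), err_buf.getvalue())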

@guillemdc

Thanks for this code snippet.

I'm using this to call bash commands from Python. It works great when you don't need to keep the bash context: you just pass it ["bash", "-c", statement], where statement is the bash command.

However, I'd like to keep a bash subprocess running and feed it commands whenever I want. I've tried calling your snippet with only ["bash"] so it stays open, adding stdin as a PIPE, and then writing commands to stdin, but it locks up.

I don't know why that happens.
