bash wrap a piped command with a python script


Is there a way to create a Python script which wraps an entire bash command, including the pipes?

For example, if I have the following simple script

import sys
print(sys.argv)

and call it like so (from bash or ipython), I get the expected outcome:

[pkerp@pendari trell]$ python test.py ls
['test.py', 'ls']

If I add a redirection (or a pipe), however, the output of the script is sent to the redirection target instead:

[pkerp@pendari trell]$ python test.py ls > out.txt

And the > out.txt portion is not in sys.argv. I understand that the shell processes this redirection itself, but I'm curious whether there's a way to force the shell to ignore it and pass it to the process being called.

The point of this is to create something like a wrapper for the shell. I'd like to run the commands regularly, but keep track of the strace output for each command (including the pipes). Ideally I'd like to keep all of the bash features, such as tab-completion and up and down arrows and history search, and then just pass the completed command through a python script which invokes a subprocess to handle it.
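As a rough sketch of that idea (the function name build_trace_cmd is illustrative, and it assumes strace is installed): importing Python's readline module gives input() history and arrow-key editing, and handing the completed line to bash -c means the traced shell, not the parent shell, interprets the pipes and redirections:

```python
import readline  # importing readline enables history/arrow-key editing in input()
import subprocess

def build_trace_cmd(command, logfile="trace.log"):
    """Wrap a full shell command (pipes and redirections included) in strace.

    bash -c receives the whole pipeline as one string, so pipes and
    redirections are interpreted by the traced shell, not by our parent shell.
    """
    return ["strace", "-f", "-o", logfile, "/bin/bash", "-c", command]

if __name__ == "__main__":
    while True:
        try:
            line = input("wrap$ ")
        except EOFError:
            break
        if line.strip():
            subprocess.call(build_trace_cmd(line))
```

This does not give you real bash tab-completion of filenames and options, only line editing and history, which is one reason the question is non-trivial.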

Is this possible, or would I have to write my own shell to do this?

Edit

It appears I'm asking the exact same thing as this question.

There are 3 best solutions below

Answer 1

Well, I don't quite see what you are trying to do. The general approach would be to pass the desired output destination to the script using command-line options: python test.py ls --output=out.txt. Incidentally, strace writes to stderr, so you could capture everything using strace python test.py > out 2> err if you want to save it all.
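That option-based interface might look like this (a sketch; the --output flag and the function name are illustrative, not part of the answer):

```python
import argparse

def parse_wrapper_args(argv):
    """Parse a wrapped command plus an explicit --output destination."""
    parser = argparse.ArgumentParser(prog="test.py")
    parser.add_argument("command", nargs="+",
                        help="the command (and its arguments) to wrap")
    parser.add_argument("--output", default=None,
                        help="file to write the command's stdout to")
    return parser.parse_args(argv)
```

The script itself then decides where output goes, so the shell never sees a redirection to intercept.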

Edit: If your script writes to stderr as well, you could use strace -o strace_out python test.py > script_out 2> script_err.

Edit2: Okay, I understand better what you want. My suggestion is this: Write a bash helper:

function process_and_evaluate()
{
  strace -o /tmp/output/strace_output "$@"
  /path/to/script.py /tmp/output/strace_output
}

Put this in a file such as ~/helper.sh. Then open a bash shell and source it using . ~/helper.sh. Now you can run it like this: process_and_evaluate ls -lA.

Edit3: To capture output and errors, you could extend the function like this:

function process_and_evaluate()
{
  out=$1
  err=$2

  shift 2

  strace -o /tmp/output/strace_output "$@" > "$out" 2> "$err"

  /path/to/script.py /tmp/output/strace_output
}

You would have to use the (less obvious) invocation process_and_evaluate out.txt err.txt ls -lA. This is the best that I can come up with...
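For completeness, a rough sketch of what /path/to/script.py could do with the strace output; it assumes strace's usual one-syscall-per-line format ("name(args) = ret"), and the function name count_syscalls is illustrative:

```python
import re
import sys
from collections import Counter

# Matches lines like: open("/etc/passwd", O_RDONLY) = 3
# (with an optional leading PID column, as produced by strace -f)
SYSCALL_RE = re.compile(r'^(?:\d+\s+)?(\w+)\(')

def count_syscalls(lines):
    """Tally syscall names from strace output lines."""
    counts = Counter()
    for line in lines:
        m = SYSCALL_RE.match(line)
        if m:
            counts[m.group(1)] += 1
    return counts

if __name__ == "__main__" and len(sys.argv) > 1:
    with open(sys.argv[1]) as f:
        for name, n in count_syscalls(f).most_common():
            print(name, n)
```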

Answer 2

The only thing you can do is pass the entire shell command as a string, then let Python pass it back to a shell for execution.

$ python test.py "ls > out.txt"

Inside test.py, something like

import subprocess
import sys

subprocess.call("strace " + sys.argv[1], shell=True, executable="/bin/bash")

to ensure the entire string is passed to the shell (and bash, specifically).
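To see why shell=True matters here, a small self-contained check using a harmless pipeline in place of strace (run_pipeline is an illustrative name):

```python
import subprocess

def run_pipeline(command):
    """Run a full shell pipeline and return its stdout as text."""
    # With shell=True, the whole string is handed to the shell named by
    # `executable`, so pipes and redirections inside it behave exactly
    # as they would at an interactive prompt.
    result = subprocess.run(command, shell=True, executable="/bin/bash",
                            capture_output=True, text=True)
    return result.stdout

print(run_pipeline("printf 'a\\nb\\nc\\n' | wc -l").strip())  # three input lines
```

Without shell=True, subprocess would try to execute a program literally named "ls > out.txt" and fail.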

Answer 3

At least in your simple example, you could run the Python script inside a command substitution and pass its output to echo, e.g.

$ echo $(python test.py ls) > test.txt
$ more test.txt
['test.py', 'ls']

Enclosing a command in $(...) (command substitution) first executes it, then passes its output as arguments to echo, whose output is in turn redirected.