Launch modules as subprocesses in the background, and detach


In bash, I can do the following:

for f in subdir/*.sh; do
    nohup "$f" "$@" &> /dev/null &
done

In other words, it runs all *.sh scripts in subdir in the background and detached, so that the background scripts won't be terminated if the main script ends.

Now, let's say I have the following Python project:

proj/
    __init__.py
    main.py
    subdir/
        __init__.py
        mod_a.py
        mod_b.py
        mod_c.py

How do I do something similar to the bash script, but with parameters passed as Python objects?

E.g.: I have two strings a and b, a list l, and a dictionary d (a rough sketch follows this list):

  • Load mod_a.py, invoke mod_a.main(a, b, l, d), and detach
  • Load mod_b.py, invoke mod_b.main(a, b, l, d), and detach
  • Load mod_c.py, invoke mod_c.main(a, b, l, d), and detach
  • main.py can end, letting mod_a, mod_b, and mod_c run in the background until completion
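
To make the goal concrete, here is a rough sketch of the kind of thing I am after; serializing the objects to JSON is just one guess at how to get them across the process boundary, and the detachment part is exactly what I don't know how to do:

import json
import subprocess
import sys

a, b = 'foo', 'bar'
l = [1, 2, 3]
d = {'key': 'value'}

for mod in ('mod_a', 'mod_b', 'mod_c'):
    # One possible convention: pass everything as a single JSON argument that
    # each module parses before calling its own main(a, b, l, d).
    payload = json.dumps({'a': a, 'b': b, 'l': l, 'd': d})
    subprocess.Popen([sys.executable, '-m', 'subdir.' + mod, payload])
    # ...plus whatever is needed so the children survive after main.py exits?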

There are 3 answers below.

Answer 1:

I don't know of a built-in mechanism for this in Python, but you can fall back on nohup. You can run

nohup python your_script.py arguments

using os.system or subprocess.call.
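
For instance, a minimal sketch of that idea (your_script.py and the arguments are placeholders):

import os

# nohup keeps the child alive after the parent exits; the trailing "&" tells the
# shell to background it, so os.system returns immediately.
os.system("nohup python your_script.py arg1 arg2 > /dev/null 2>&1 &")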

Answer 2:

To emulate nohup in Python, you could make the child processes ignore the SIGHUP signal:

import signal

def ignore_sighup():
    signal.signal(signal.SIGHUP, signal.SIG_IGN)

i.e., to emulate the bash script:

#!/bin/bash
for f in subdir/*.sh; do
    nohup "$f" "$@" &> /dev/null &
done

using the subprocess module in Python:

#!/usr/bin/env python3
import signal
import sys
from glob import glob
from subprocess import DEVNULL, STDOUT, Popen

def ignore_sighup():
    # runs in each child between fork() and exec(), so the child ignores SIGHUP
    signal.signal(signal.SIGHUP, signal.SIG_IGN)

for path in glob('subdir/*.sh'):
    Popen([path] + sys.argv[1:],
          stdout=DEVNULL, stderr=STDOUT, preexec_fn=ignore_sighup)

To create proper daemons, you could use the python-daemon package.
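
As a rough sketch (assuming the python-daemon package is installed and using its documented DaemonContext entry point), each module could daemonize itself on startup instead of relying on the parent to detach it:

# Hypothetical sketch of what one of the modules (say mod_a.py) could look like.
import daemon  # provided by the python-daemon package

def main(a, b, l, d):
    ...  # the module's real work

if __name__ == '__main__':
    # DaemonContext detaches the process from the controlling terminal,
    # so it keeps running after the launching script exits.
    with daemon.DaemonContext():
        main('a', 'b', [1, 2], {'key': 'value'})  # placeholder arguments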

Answer 3:

This is probably a duplicate of "Run a program from python, and have it continue to run after the script is killed". And no, ignoring SIGHUP doesn't help, but preexec_fn=os.setpgrp does.
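
A minimal sketch of that approach, adapted to the layout from the question (the python -m command line and the arguments are assumptions; the essential part is preexec_fn=os.setpgrp, which puts each child in its own process group on POSIX systems so it isn't killed together with the parent):

import os
import sys
from subprocess import DEVNULL, Popen

for mod in ('mod_a', 'mod_b', 'mod_c'):
    # Run each module as its own interpreter process; 'a' and 'b' are placeholder
    # arguments. preexec_fn=os.setpgrp detaches the child into a new process group.
    Popen([sys.executable, '-m', 'subdir.' + mod, 'a', 'b'],
          stdout=DEVNULL, stderr=DEVNULL,
          preexec_fn=os.setpgrp)
# main.py can now exit; the children keep running.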