I'm trying to use DaemonContext as follows:
with DaemonContext(
    working_directory=WORKDIR,
    umask=0o002,
    pidfile=pidfile.TimeoutPIDLockFile(PIDFILE),
):
    main()
I get the error ValueError: I/O operation on closed file. It's unclear if this has to do with the pidfile, stdout/stderr, or something I'm doing inside main(). This is running inside a Docker container.
Here's my stack trace. Grateful for any insights.
Traceback (most recent call last):
  ...
    pidfile=pidfile.TimeoutPIDLockFile("/tmp/cwmetrics.pid"),
  File "/usr/local/lib/python3.7/site-packages/daemon/daemon.py", line 272, in __init__
    detach_process = is_detach_process_context_required()
  File "/usr/local/lib/python3.7/site-packages/daemon/daemon.py", line 819, in is_detach_process_context_required
    if is_process_started_by_init() or is_process_started_by_superserver():
  File "/usr/local/lib/python3.7/site-packages/daemon/daemon.py", line 795, in is_process_started_by_superserver
    stdin_fd = sys.__stdin__.fileno()
ValueError: I/O operation on closed file
I reported it here as well: https://pagure.io/python-daemon/issue/64. I'm just not sure whether this is a bug in the library or a problem with my usage of it.
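For what it's worth, the underlying exception is easy to reproduce without python-daemon at all. The traceback shows the library probing sys.__stdin__.fileno() to guess whether the process was started by a superserver, and calling fileno() on a closed stream raises exactly this error (a minimal illustration, not my actual code):

```python
import os

# python-daemon calls sys.__stdin__.fileno() during its startup checks;
# if that stream is closed (as it can be in some container setups),
# fileno() raises the same ValueError seen in the traceback above.
f = open(os.devnull)
f.close()
try:
    f.fileno()
except ValueError as e:
    msg = str(e)
    print(msg)  # I/O operation on closed file
```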
You are trying to detach (daemonize) the process while still relying on the default stdin/stdout, but the detached process will no longer have those streams to read from or write to. Open real files to serve as the daemon's standard streams and pass them to DaemonContext through its stdin, stdout, and stderr arguments.

Note that DaemonContext closes all open files on entering the context. If you redirect stdin/stdout yourself somewhere in your code, you can keep those files open by listing them in the files_preserve=[...] argument.
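A minimal sketch of the above, adapted to the question's code. The /tmp log paths and the WORKDIR value are illustrative assumptions; PIDFILE matches the path in the traceback, and main() stands in for the questioner's function:

```python
from daemon import DaemonContext, pidfile

WORKDIR = "/tmp"                 # assumed value; use your real working directory
PIDFILE = "/tmp/cwmetrics.pid"   # path taken from the traceback

def main():
    print("daemon running")      # placeholder for the real main()

# Open real files to serve as the detached process's standard streams.
out_log = open("/tmp/cwmetrics.out", "a+")
err_log = open("/tmp/cwmetrics.err", "a+")

with DaemonContext(
    working_directory=WORKDIR,
    umask=0o002,
    pidfile=pidfile.TimeoutPIDLockFile(PIDFILE),
    stdout=out_log,                       # daemon's sys.stdout goes here
    stderr=err_log,                       # daemon's sys.stderr goes here
    files_preserve=[out_log, err_log],    # keep these open across entry
):
    main()
```

files_preserve is shown for explicitness; the library may already exclude its own stream targets from closing. Since the traceback shows the crash happening while DaemonContext auto-detects detach_process in __init__, explicitly passing detach_process=True may also sidestep the failing sys.__stdin__ probe, though I'd treat that as a workaround rather than a fix.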