Set line-buffering in container output


I use the Java S2I image for a container running in OpenShift (on-premise). My problem is that the output of the image is block-buffered, so oc logs ... does not show me the most recent log lines.

I could probably build my own Docker image that runs stdbuf -oL -e0 java ..., but I would prefer to stick to the 'official' image (just adding the JAR to /deployments). Is there any way to reduce buffering (use line-buffering instead of block-buffering), or to flush the output on demand?

EDIT: It seems that I could update the deployment config and pass stdbuf in there, but that means I'd have to compose all the arguments myself. The ideal solution would be passing --tty to Docker, but I can't see how such a custom argument could be passed in OpenShift.
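For illustration, overriding the command in the deployment config would look roughly like the JSON patch below (a sketch only; dc/myapp, the jar path and the java arguments are placeholders that would have to match my actual deployment) — which is exactly the "compose all the args myself" part I'd like to avoid:

# replace the container command with an explicit stdbuf-wrapped java invocation
oc patch dc/myapp --type=json -p '[
  {"op": "add",
   "path": "/spec/template/spec/containers/0/command",
   "value": ["stdbuf", "-oL", "-e0", "java", "-jar", "/deployments/app.jar"]}
]'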

1 Answer
In your repo, try creating the file .s2i/bin/run with the following contents:

#!/bin/bash

# run the original S2I run script with line-buffered stdout and unbuffered stderr
exec stdbuf -oL -e0 /usr/local/s2i/run

I always forget where the S2I assemble and run scripts are in the Java S2I image, so you may need to replace /usr/local/s2i with the correct path.
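If you want to double-check that path, something along these lines should work (a sketch only; dc/myapp and the image reference are placeholders for your own deployment):

# look for the run script inside a running pod
oc rsh dc/myapp find / -name run -path '*s2i*' 2>/dev/null

# or read the label that records where the S2I scripts live in the image
docker inspect --format '{{ index .Config.Labels "io.openshift.s2i.scripts-url" }}' <your-java-s2i-image>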

Adding this file means it will be run as the startup command instead of the original run script; it can then invoke the original script under stdbuf. Make sure you use exec so that the subprocess replaces the current one, otherwise signals will not be propagated through properly.

Even though this might work, I'm surprised that logging isn't already unbuffered. I expect there would be a better way of controlling it through some Java configuration instead.