tail -f piped to awk redirected to a file does not work


I'm having trouble wrapping my head around piping and a potential buffering issue. I am trying to perform a set of piped operations that seem to break at some level of the pipeline. To simplify, I narrowed it down to three piped commands that do not work correctly:

tail -f | awk '{print $1}' > file

results in no data being redirected to the file. However,

tail -f | awk '{print $1}'

sends its output to stdout fine,

and

tail -10 | awk '{print $1}' > file

works fine as well.

Thinking it might be a buffering issue, I tried

tail -f | unbuffer awk '{print $1}' > file

which produced no positive results.

(Note: the original pipeline has more operations in between, using grep --line-buffered, but the problem was narrowed down to the three piped commands tail -f | awk > file.)
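For reference, the usual cure for exactly this symptom is to make awk flush after every record, since awk block-buffers its stdout when it is not a terminal. A minimal sketch using awk's fflush() — the names logfile and firstcol.txt are placeholders, and timeout merely ends the demo where a real pipeline would keep running:

```shell
# Placeholder names: logfile is the file being followed,
# firstcol.txt is the redirected output.
printf 'alpha 1\nbeta 2\n' > logfile

# fflush() forces awk to flush after each record, so lines reach the
# file immediately even though stdout is a file, not a terminal.
timeout 3 tail -n +1 -f logfile | awk '{print $1; fflush()}' > firstcol.txt &

sleep 1
echo 'gamma 3' >> logfile   # new data arrives while tail -f is running
wait

cat firstcol.txt            # alpha, beta, gamma, one per line
```

GNU stdbuf -oL awk … would have a similar effect; plain unbuffer often fails here because it only gives awk a pseudo-terminal on stdin, not on the redirected stdout.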

2 Answers


The following will tail -f a given file and automatically execute the body of the while loop whenever new data is added:

tail -f file_to_watch | while read -r a; do echo "$a" | awk '{print $1}' >> file; done

Or, more simply, if you really only need the first field, you can read it directly into a variable like this:

tail -f file_to_watch | while read -r a b; do echo "$a" >> file; done
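To see how the two-variable read splits off the first field, the same loop can be run against a finite file instead of tail -f (file names here are placeholders):

```shell
# Placeholder names; 'read -r a b' puts the first field in a and the
# rest of the line in b, so echoing "$a" keeps only column one.
printf 'one two three\nfour five six\n' > file_to_watch

rm -f file
while read -r a b; do echo "$a" >> file; done < file_to_watch

cat file   # one and four, one per line
```

This approach avoids the buffering problem entirely because the loop appends to the file line by line, but note it forks an echo (and, in the first variant, an awk) for every input line.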

Here is how to handle log files:

tail --follow=name logfile | awk '{print $1 | "tee /var/log/file"}' 

or in your case this may be enough:

tail -f | awk '{print $1 | "tee /var/log/file"}'

--follow=name — this prevents the command from stopping when the log files are rotated.
| "tee /var/log/file" — this pipes awk's output through tee to get it into the file.