I'm trying to run
fswatch -tr /home/*/*/public_html | grep --line-buffered -E ".php|.xml" | awk '!seen[$0]++' >> log.txt
or, roughly equivalently, using uniq (it only removes adjacent duplicates, but that seems to be enough here):
stdbuf -i0 -o0 -e0 fswatch -tr /home/*/*/public_html | grep --line-buffered -E ".php|.xml" | uniq >> log.txt
So that I don't get duplicate rows. It works just fine in the terminal, writing to standard output; however, when I try to write that output to log.txt, the file is blank (or no new rows are appended when using >>).
fswatch is a command that monitors changes to the filesystem in real time; it generates a lot of duplicate events, and uniq seems to address that just fine.
Any ideas why the output redirection doesn't work?
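For what it's worth, the blank file seems reproducible without fswatch at all (a minimal sketch; the sleep just keeps the writer alive so the file can be inspected mid-run):

(echo test.php; sleep 60) | awk '!seen[$0]++' > out.txt

Running cat out.txt from another terminal during the sleep shows nothing; the line only appears once awk exits.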
awk and uniq are going to buffer their output when writing to a regular file. You can get unbuffered behavior with perl:

fswatch -tr /home/*/*/public_html | grep --line-buffered -E ".php|.xml" | perl -ne '$|=1; print if !$seen{$_}++' >> log.txt

That is the perl equivalent of awk '!seen[$0]++', but setting $| non-zero makes the output unbuffered. To be more correct you should probably write BEGIN{$|=1} so you're not making the assignment on every line of input, but it's not really necessary.
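For example, a sketch of the full pipeline with the BEGIN variant (same paths and filters as in the question):

fswatch -tr /home/*/*/public_html | grep --line-buffered -E ".php|.xml" | perl -ne 'BEGIN{$|=1} print if !$seen{$_}++' >> log.txt

Running tail -f log.txt in another terminal should now show events as they arrive, since perl flushes after every print.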