concat a lot of files to stdout

Asked by Mikhail Golubtsov

I have a large number of files in a directory - about 100k. I want to concatenate them and pipe the result to standard output (I need that to upload them as one file elsewhere), but cat $(ls) complains: -bash: /bin/cat: Argument list too long. I know how to merge all those files into a temporary file, but can I avoid that?
There are 2 solutions below.
find . -type f -print0 | xargs -0 cat
xargs will invoke cat several times, each time with as many arguments as it can fit on the command line (the combined length of the arguments can be no more than getconf ARG_MAX).
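You can check that limit on your system; the exact value varies between systems:

getconf ARG_MAX    # maximum length, in bytes, of the arguments (plus environment) passed to exec()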
-print0 (separate file names with \0) for find, in combination with -0 (process file names separated with \0) for xargs, is just a good habit to follow, as it will prevent the commands from breaking on file names containing whitespace or other special characters.
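To illustrate with a hypothetical file name: without -print0/-0, xargs splits its input on whitespace, so a name containing spaces becomes several bogus arguments.

touch 'a file with spaces'
find . -type f | xargs cat              # breaks: cat receives ./a, file, with, spaces as separate arguments
find . -type f -print0 | xargs -0 cat   # handles the name correctly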
For a start, cat $(ls) is not the right way to go about this; cat * would be more appropriate. If the number of files is too high, you can use find like this:
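A minimal sketch, assuming the standard -exec ... {} + form that the description below implies:

find . -exec cat {} +    # find collects the file names and hands them to cat in large batches

(find will also emit directory names such as . itself; cat just reports "Is a directory" for those on stderr, which is one reason you might add the -type f filter mentioned below.)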
This combines the results from find and passes them as arguments to cat, executing as many separate instances as needed. It behaves much in the same way as xargs but doesn't require a separate process or the use of any non-standard features like -print0, which is only supported in some versions of find. Note that find is recursive by default, so you can specify -maxdepth 1 to prevent that if your version supports it. If there are other things in the directory, you can also filter by -type (but I guess there aren't, based on your original attempt).
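For example, to restrict the run to regular files directly in the current directory (assuming your find supports -maxdepth, a common but non-POSIX extension):

find . -maxdepth 1 -type f -exec cat {} +    # concatenate only the regular files at the top level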