I'm writing a script that appends JSON to a write stream using Node's `archiver` package. `archiver` then tarballs and gzips the write stream to a local file on disk.
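The setup looks roughly like this (a minimal sketch; the output path and JSON payloads are placeholders):

```js
const fs = require('fs');
const archiver = require('archiver');

// Destination on disk; path is a placeholder.
const output = fs.createWriteStream('backup.tar.gz');

// Tar + gzip in a single pass via archiver's gzip option.
const archive = archiver('tar', { gzip: true });
archive.pipe(output);

// Each JSON blob becomes a named entry inside the tarball.
archive.append(JSON.stringify({ id: 1 }), { name: 'record-1.json' });
archive.append(JSON.stringify({ id: 2 }), { name: 'record-2.json' });

archive.finalize();
```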
When I run this script, however, the process's memory consumption can reach 2GB! Without the `gzip: true` flag in `archiver` (that is, just using `archiver` to append to the stream and tar it), the script only uses 300MB of memory.
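For comparison, the tar-only variant just drops the option (same placeholder path, minus the `.gz`):

```js
const fs = require('fs');
const archiver = require('archiver');

// Same pipeline, tar only (no gzip option): memory stays around 300MB.
const archive = archiver('tar');
archive.pipe(fs.createWriteStream('backup.tar'));
```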
I worked around this by spawning a child process to gzip the tarball; the sketch below shows roughly what that looks like.
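(A sketch, assuming a `gzip` binary on the PATH; filenames are placeholders.)

```js
const { spawn } = require('child_process');
const fs = require('fs');
const archiver = require('archiver');

// Tar in-process, but hand compression off to an external gzip.
const archive = archiver('tar');
const gzip = spawn('gzip', ['-c']);

// archive -> gzip's stdin; gzip's stdout -> the file on disk.
archive.pipe(gzip.stdin);
gzip.stdout.pipe(fs.createWriteStream('backup.tar.gz'));

archive.append(JSON.stringify({ id: 1 }), { name: 'record-1.json' });
archive.finalize();
```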
I'd still love to know two things, though:

- Why is `archiver` (and, by extension, `zlib`) moving the whole tarball into memory for compression?
- Is there a way to work around this without exec-ing to the shell? (See the sketch below for the kind of in-process pipeline I'm imagining.)
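For concreteness, the in-process pipeline I have in mind would pipe a plain tar stream through `zlib.createGzip()` directly (a sketch; I haven't verified whether it avoids the memory blow-up):

```js
const fs = require('fs');
const zlib = require('zlib');
const archiver = require('archiver');

// Tar with archiver, but compress via zlib's streaming gzip
// instead of archiver's gzip: true option.
const archive = archiver('tar');
archive.pipe(zlib.createGzip()).pipe(fs.createWriteStream('backup.tar.gz'));

archive.append(JSON.stringify({ id: 1 }), { name: 'record-1.json' });
archive.finalize();
```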
Thanks!