What is the best way to combine a large number of small files to maximize recoverability?

I have sets of directories, each containing a large number of files (~5000 per directory), which is significantly slowing down my file system access. I have plenty of space, and the data is important. I'd like to combine them into a single file per directory. Creating an archive would be the simple solution, but I don't want to reduce the recoverability. Some sort of flat image (e.g., an uncompressed tar file) would work fine, but I would think there's a format that could actually be more recoverable (e.g., by storing parity information) in the same amount of space. I'm working in a mixed Unix/Linux/Mac environment.
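
For illustration, the kind of flat, uncompressed archive I have in mind could be built per directory with something like this Python sketch (the paths are placeholders, not my real layout):

    import os
    import tarfile

    def pack_directory(src_dir, dest_tar):
        """Pack every regular file in src_dir into one uncompressed tar."""
        # Mode "w" (rather than "w:gz") keeps the archive uncompressed,
        # so localized corruption only affects the members stored there.
        with tarfile.open(dest_tar, "w") as tar:
            for name in sorted(os.listdir(src_dir)):
                path = os.path.join(src_dir, name)
                if os.path.isfile(path):
                    tar.add(path, arcname=name)

    if __name__ == "__main__":
        root = "data"  # placeholder: parent holding the ~5000-file directories
        for entry in sorted(os.listdir(root)):
            sub = os.path.join(root, entry)
            if os.path.isdir(sub):
                pack_directory(sub, sub + ".tar")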

Is there an image/compression format that minimizes compression while providing parity-type information, or would a raw image be the maximally recoverable file format?
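
By parity-type information I mean something like separate recovery blocks, for example what the par2 tool generates; this sketch shows the sort of workflow I have in mind (par2 being installed, and the 10% redundancy figure, are just assumptions for illustration):

    import subprocess

    def add_parity(archive_path, redundancy_percent=10):
        """Create PAR2 recovery files alongside an existing archive.

        Assumes the par2 command-line tool is installed; the redundancy
        level is only an illustrative figure.
        """
        subprocess.run(
            ["par2", "create", "-r%d" % redundancy_percent, archive_path],
            check=True,
        )

    # e.g. add_parity("data/somedir.tar") after packing the directory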

1 Answer

You may be able to solve your performance problem simply by creating a deeper tree of subdirectories with far fewer files in each directory.
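
For example, the reorganization is easy to script; here is a minimal sketch in Python (the two-character bucket scheme and the paths are only illustrative):

    import os
    import shutil

    def fan_out(src_dir, dest_root, width=2):
        """Move files from the flat src_dir into subdirectories of dest_root,
        keyed on the first `width` characters of each file name."""
        for name in sorted(os.listdir(src_dir)):
            path = os.path.join(src_dir, name)
            if not os.path.isfile(path):
                continue
            bucket = name[:width].lower()          # e.g. "report123.txt" -> "re"
            dest_dir = os.path.join(dest_root, bucket)
            os.makedirs(dest_dir, exist_ok=True)
            shutil.move(path, os.path.join(dest_dir, name))

    # e.g. fan_out("flat_dir", "nested_dir")  # ~5000 files -> a few hundred buckets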