How to split a directory into parts without compressing or archiving?


How would one do this? If I have a folder that is 16 GB, how would I split it into folders that do not exceed 4 GB each? I also want to do this without using tar or similar utilities, unless they can do it without creating an archive that I would then have to run split on.


EDIT:

To answer the comments...

By a 16 GB folder, I mean any folder that is 16 GB in total size; the file count can be anything.

If a single file is larger than the limit (in this case 4 GB), I will use split (e.g. split -b 4G) to break up that one file.

Basically, at the end of this I want the large folder broken up into smaller folders, just not packed into archives.

This is essentially a bin packing problem (a close relative of the knapsack problem): for example, files of 3 GB, 2 GB, 2 GB and 1 GB fit into two 4 GB folders only if they are grouped as {3 GB, 1 GB} and {2 GB, 2 GB}.

1 answer:

The following Perl script is a proof of concept; you will probably need to modify it to do what you want:

#!/usr/bin/env perl
use strict;
use warnings;

use File::Find;
use Algorithm::Bucketizer;

# Each bucket holds at most 4 GB.
my $bucketizer = Algorithm::Bucketizer->new( bucketsize => 4 * 1024 * 1024 * 1024 );

# Recursively collect every regular file under the current directory,
# weighted by its size in bytes.
find( { wanted => sub { $bucketizer->add_item( $_, -s ) if (-f) }, no_chdir => 1 }, '.' );

# Redistribute items between buckets to improve the packing.
$bucketizer->optimize( algorithm => 'random', maxtime => 10, maxrounds => 100 );

# Print each bucket's serial number, its fill level in bytes, and the files it contains.
for my $b ( $bucketizer->buckets ) {
    print "\nBucket " . $b->serial . " (" . $b->level . "):\n";
    print "$_\n" for ( $b->items );
}

It recursively collects the files under the current directory, packs them into 4 GB buckets, and silently ignores any single file larger than 4 GB (such a file cannot fit into any bucket). It uses the Algorithm::Bucketizer module; see its documentation for the meaning of the arguments to optimize().
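
As a rough sketch of one possible modification (my own illustration, not part of the original answer): the report loop could move each bucket's files into a numbered Part_N directory instead of just printing them. The Part_N names are arbitrary, and basename() flattens any subdirectory structure, so adjust as needed.

use File::Copy qw(move);
use File::Basename qw(basename);
use File::Path qw(make_path);

# Continuing after optimize(): move each bucket's files into its own directory.
for my $b ( $bucketizer->buckets ) {
    my $dir = "Part_" . $b->serial;   # e.g. Part_1, Part_2, ...
    make_path($dir);

    for my $file ( $b->items ) {
        # basename() drops the subdirectory part; recreate relative paths
        # under $dir if the original directory layout must be preserved.
        move( $file, "$dir/" . basename($file) )
            or warn "Could not move $file: $!";
    }
}

Run this on a copy of the data first; since the script walks the current directory, the newly created Part_N directories would also be picked up if it were run a second time.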