When using NSFileWrapper, is there a way to handle huge output files by writing them in multiple smaller NSData chunks?


In a macOS Cocoa application, I need to save a bundle in my app's proprietary format. For this reason, I decided to look at NSFileWrapper, which appears to be the cleanest way to deal with the problem. My code looks like this:

NSFileWrapper *bundleFileWrapper = [[NSFileWrapper alloc] initDirectoryWithFileWrappers:nil];
NSDictionary *fileWrappers = [bundleFileWrapper fileWrappers];

if ([fileWrappers objectForKey:mboxFileName] == nil) {
    NSFileWrapper *textFileWrapper = [[NSFileWrapper alloc] initRegularFileWithContents:mboxData];
    [textFileWrapper setPreferredFilename:mboxFileName];
    [bundleFileWrapper addFileWrapper:textFileWrapper];
}

NSError *error = nil;
BOOL success = [bundleFileWrapper writeToURL:[NSURL fileURLWithPath:path] options:NSFileWrapperWritingAtomic originalContentsURL:nil error:&error];
if (!success) {
    NSLog(@"Error = %@", [error localizedDescription]);
}

My problem is that I end up holding very large NSData objects, and this approach takes a lot of memory. Is there a way, using NSFileWrapper, to write smaller NSData objects in sequence? Any help is greatly appreciated.


Answer from James Bucanek:

Consider not using NSFileWrapper for your large data files. Use NSFileWrapper for the directory structure, but write big data files directly (open, write, ...).
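For example, you could keep NSFileWrapper for the small items and stream the large file to disk yourself. The sketch below is a minimal illustration of that idea, not code from the answer; the WriteLargeFile name and the nextChunk block are hypothetical placeholders for however your exporter produces its data.

#import <Foundation/Foundation.h>

// Minimal sketch: stream a large file to disk in chunks so that only one
// chunk of NSData is in memory at a time. `nextChunk` is a hypothetical
// callback that returns the next piece of data, or nil when the export is done.
static BOOL WriteLargeFile(NSURL *fileURL, NSData *(^nextChunk)(void), NSError **error)
{
    // The file must exist before NSFileHandle can open it for writing.
    [[NSFileManager defaultManager] createFileAtPath:fileURL.path contents:[NSData data] attributes:nil];
    NSFileHandle *handle = [NSFileHandle fileHandleForWritingToURL:fileURL error:error];
    if (handle == nil) {
        return NO;
    }
    NSData *chunk;
    while ((chunk = nextChunk()) != nil) {
        [handle writeData:chunk];   // append this chunk, then let it be released
    }
    [handle closeFile];
    return YES;
}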

Also, don't be shy about simply creating the directory structure yourself. A package can be as simple as, literally, a folder with an extension. And there are still tons of O-O APIs (NSURL, NSFileManager, ...) to help you examine and manipulate its content.
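If you go that route, building the package is just an ordinary directory-creation call plus plain file writes inside it. A rough sketch, assuming a hypothetical package location, extension, and file name:

// Hypothetical package path and contents, for illustration only.
NSURL *packageURL = [NSURL fileURLWithPath:@"/tmp/Mailbox.myformat"];
NSError *error = nil;
BOOL ok = [[NSFileManager defaultManager] createDirectoryAtURL:packageURL
                                   withIntermediateDirectories:YES
                                                    attributes:nil
                                                         error:&error];
if (ok) {
    // Write the big mbox file in chunks (see the sketch above) and any small
    // metadata files directly inside the package directory.
    NSURL *mboxURL = [packageURL URLByAppendingPathComponent:@"messages.mbox"];
    // ... stream data to mboxURL with NSFileHandle or NSOutputStream ...
}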

Good luck!