How To Upload A Large File (>6MB) To SalesForce Through A Lightning Component Using Apex Aura Methods

I am aiming to take a file a user attaches through a Lightning Component and create a document object containing the data.

So far I have overcome the request size limits by chunking the data being uploaded into 1MB chunks. When the Apex Aura method receives these chunks of data it will either create a new document (if it is the first chunk), or will retrieve the existing document and add the new chunk to the end.
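
For reference, here is a minimal sketch of what that Aura-exposed method could look like server-side. The class, method, and parameter names are hypothetical, and the append branch simply delegates to the logic shown further down:

public with sharing class ChunkedFileUploadController {

    @AuraEnabled
    public static Id saveChunk(Id existingFileId, String fileName, String base64Data) {
        if (existingFileId == null) {
            // First chunk: create a new document with the decoded data
            ContentVersion cv = new ContentVersion(
                Title = fileName,
                PathOnClient = fileName,
                VersionData = EncodingUtil.base64Decode(base64Data)
            );
            insert cv;
            return cv.Id;
        }
        // Later chunks: retrieve the existing document and append the new data
        appendChunk(existingFileId, base64Data);
        return existingFileId;
    }

    private static void appendChunk(Id existingFileId, String base64Data) {
        // ... append logic as shown in the snippet below ...
    }
}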

Data is received Base64 encoded, and then decoded server-side.

As the document data is stored as a Blob, the original file contents will be read as a String, and then appended with the chunk received. The new contents are then converted back into a Blob to be stored within the ContentVersion object.

The problem I'm having is that strings in Apex have a maximum length of around 6,000,000 characters. Whenever the file size exceeds 6MB, this limit is hit during the concatenation, which halts the file upload.

I have attempted to avoid this limit by converting the Blob to a String only when necessary for the concatenation (as suggested here: https://developer.salesforce.com/forums/?id=906F00000008w9hIAA), but this hasn't worked. I'm guessing it was patched, because it still technically allocates a string larger than the limit.

The appending code is really simple so far:

// Retrieve the document created from the earlier chunks
ContentVersion originalDocument = [SELECT Id, VersionData FROM ContentVersion WHERE Id =: <existing_file_id> LIMIT 1];

// Decode the incoming chunk and append it to the existing contents
Blob originalData = originalDocument.VersionData;
Blob appendedData = EncodingUtil.base64Decode(<base_64_data_input>);
Blob newData = Blob.valueOf(originalData.toString() + appendedData.toString());

originalDocument.VersionData = newData;
update originalDocument;

There are 2 answers below.

Answer 1

You will have a hard time with it.

You could try offloading the concatenation to an asynchronous process (@future, Queueable, Schedulable, Batchable); those run with a 12 MB heap instead of 6 MB. That could buy you some time.
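
A minimal sketch of that idea, assuming a Queueable that reuses the append logic from the question and simply relies on the larger asynchronous heap; the class name is hypothetical:

public with sharing class AppendChunkJob implements Queueable {
    private Id contentVersionId;
    private String base64Chunk;

    public AppendChunkJob(Id contentVersionId, String base64Chunk) {
        this.contentVersionId = contentVersionId;
        this.base64Chunk = base64Chunk;
    }

    public void execute(QueueableContext ctx) {
        // Asynchronous Apex runs with a 12 MB heap limit instead of 6 MB
        ContentVersion doc = [SELECT Id, VersionData FROM ContentVersion WHERE Id = :contentVersionId LIMIT 1];
        Blob chunk = EncodingUtil.base64Decode(base64Chunk);
        // Same String-based concatenation as in the question, just with more headroom
        doc.VersionData = Blob.valueOf(doc.VersionData.toString() + chunk.toString());
        update doc;
    }
}

// Enqueued from the @AuraEnabled method, e.g.:
// System.enqueueJob(new AppendChunkJob(existingFileId, base64DataInput));

One wrinkle: the jobs run independently, so the caller still has to make sure each chunk's job finishes before the next chunk is enqueued, or the appends can land out of order.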

You could try cheating by embedding an iframe (a Visualforce page or a lightning:container tag? Or maybe a "canvas app") that grabs your file and does some manual JavaScript magic, calling the normal REST API for document upload: https://developer.salesforce.com/docs/atlas.en-us.api_rest.meta/api_rest/dome_sobject_insert_update_blob.htm (the last code snippet there is about multiple documents). Maybe jsforce?

Can you upload it somewhere else (SharePoint? Heroku?) and have that system call into SF to push the file (no Apex = no heap size limit)? Or even look up "Files Connect".

Can you send an email with attachments? Crude, but if you write a custom Email-to-Case handler class you'll have 36 MB of RAM.
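
A rough sketch of that route, assuming a custom Email Services handler (the class name is hypothetical); email service invocations get a 36 MB heap, so each attachment can be saved in a single pass:

global class LargeFileEmailHandler implements Messaging.InboundEmailHandler {

    global Messaging.InboundEmailResult handleInboundEmail(
            Messaging.InboundEmail email, Messaging.InboundEnvelope envelope) {
        Messaging.InboundEmailResult result = new Messaging.InboundEmailResult();

        if (email.binaryAttachments != null) {
            List<ContentVersion> versions = new List<ContentVersion>();
            for (Messaging.InboundEmail.BinaryAttachment att : email.binaryAttachments) {
                versions.add(new ContentVersion(
                    Title = att.fileName,
                    PathOnClient = att.fileName,
                    VersionData = att.body // already a Blob, no Base64 round trip needed
                ));
            }
            insert versions;
        }

        result.success = true;
        return result;
    }
}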

You wrote "we needed multiple files to be uploaded and the multi-file-upload component provided doesn't support all extensions". That may be caused by one of these:

  • In Experience Builder sites, the file size limits and types allowed follow the settings determined by site file moderation.
  • lightning-file-upload doesn't support uploading multiple files at once on Android devices.
  • If the Don't allow HTML uploads as attachments or document records security setting is enabled for your organization, the file uploader cannot be used to upload files with the following file extensions: .htm, .html, .htt, .htx, .mhtm, .mhtml, .shtm, .shtml, .acgi, .svg.

Answer 2

What I did (and I'm not sure if this will work in your case): I have an LWC form that creates records, and then another page shown when they click Next, where I nest the file upload component inside the parent and pass the child the created record Id. From there a user can upload using lightning-file-upload, which allows files up to 2GB.