From the start: I'm using XAMPP as the server stack for a test project that works something like Google Drive. The goal is that a user can upload a file of any size and type.
For this I'm using Resumable.js, which splits the file into 1024 KB chunks and sends them to the server (Apache). I assume this means I don't have to change anything in php.ini (upload_max_filesize, post_max_size)?
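For reference, the effective limits can be checked with something like this (just a quick sketch dropped into a test route, not part of the project):

// Quick check of the limits that could matter here. With ~1 MB chunks,
// upload_max_filesize / post_max_size only need to cover a single chunk request,
// but max_execution_time and memory_limit can still bite when the chunks are merged.
var_dump(
    ini_get('upload_max_filesize'),
    ini_get('post_max_size'),
    ini_get('max_execution_time'),
    ini_get('memory_limit')
);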
For testing I've used a 9.6 GB Linux image. There were two problems:
- The file took about 4.5 hours to upload (the TTFB increased minute by minute, from ~300 ms up to ~10 s near the end of the upload). What fix is possible?
- When the upload "finishes", the server responds with Error 500 and the chunks are not merged into a single file. Is it possible to change something in the Apache config? Maybe a response error?
Files smaller than roughly 1 GB upload fine; the 9.6 GB image was a stress test. XAMPP is a blank install, nothing changed in any config file.
JS code handling the upload:
let resumable = new Resumable({
    target: '{{ route('upload.large') }}',
    query: { _token: '{{ csrf_token() }}' }, // CSRF token
    fileType: [],
    headers: {
        'Accept': 'application/json'
    },
    testChunks: false,
    throttleProgressCallbacks: 1,
});
Upload method in the controller:
// use statements at the top of the controller:
use Illuminate\Http\Request;
use Illuminate\Support\Facades\Storage;
use Pion\Laravel\ChunkUpload\Handler\HandlerFactory;
use Pion\Laravel\ChunkUpload\Receiver\FileReceiver;

public function upload(Request $request)
{
    $receiver = new FileReceiver('file', $request, HandlerFactory::classFromRequest($request));

    if (!$receiver->isUploaded()) {
        // file not uploaded, stop with an error response
        abort(400, 'File not uploaded');
    }

    $fileReceived = $receiver->receive(); // receive file

    if ($fileReceived->isFinished()) { // file uploading is complete / all chunks are uploaded
        $file = $fileReceived->getFile(); // get the merged file
        $extension = $file->getClientOriginalExtension();
        $fileName = $file->getClientOriginalName(); // original file name (with extension)

        $disk = Storage::disk(config('filesystems.default'));
        $path = $disk->putFileAs('videos', $file, $fileName);

        // delete the temporary merged file
        unlink($file->getPathname());

        return [
            'path' => asset('storage/' . $path),
            'filename' => $fileName
        ];
    }

    // otherwise return percentage information
    $handler = $fileReceived->handler();

    return [
        'done' => $handler->getPercentageDone(),
        'status' => true
    ];
}
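For completeness, the route that the Blade template points at looks roughly like this (the controller class name here is an assumption):

// routes/web.php – assumed wiring for the named route used in the JS above
use App\Http\Controllers\UploadController;
use Illuminate\Support\Facades\Route;

Route::post('/upload-large', [UploadController::class, 'upload'])->name('upload.large');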
I'm getting close to solving this problem (which I'm having myself). It seems the failure occurs when the chunks are merged back together to recreate the original file: I get a "Failed to open input stream" error once the file size hits 4.4 GB. I suspect the file append is limited or the memory isn't being released. Hopefully this helps narrow it down.
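For anyone debugging the same thing, a memory-friendly way to merge is to stream each chunk into the target file instead of reading it into memory. This is only a rough sketch (the chunk directory layout and file naming are assumptions, not the internals of laravel-chunk-upload):

// Rough sketch: append chunks by streaming, so memory use stays flat.
// Assumption: chunks are stored as ordered .part files in $chunkDir;
// this is NOT the actual internal logic of the chunk-upload package.
function mergeChunks(string $chunkDir, string $targetPath): void
{
    $out = fopen($targetPath, 'ab'); // append, binary mode
    if ($out === false) {
        throw new RuntimeException("Cannot open output file: $targetPath");
    }

    $chunks = glob($chunkDir . '/*.part');
    natsort($chunks); // keep chunk order (file.1.part, file.2.part, ...)

    foreach ($chunks as $chunk) {
        $in = fopen($chunk, 'rb');
        if ($in === false) {
            throw new RuntimeException("Failed to open input stream: $chunk");
        }
        stream_copy_to_stream($in, $out); // copied in small buffers, no big allocations
        fclose($in);
        unlink($chunk); // free disk space as we go
    }

    fclose($out);
}

The ~4.4 GB boundary is also suspiciously close to the 4 GiB limit of 32-bit file offsets, so a 32-bit PHP build (common with older XAMPP installs) or a FAT32 temp partition may be worth ruling out.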