I'm trying to copy a 10GB file to another directory on my local disk using this code: Storage::copy('file/test.txt', 'file2/dest.txt');
But when I check the destination path, only 1.7GB of the 10GB was copied, and no timeout error was shown at all. Is there a workaround for this?
As long as a PHP script is invoked by a web request, malicious code could throw the system into an infinite loop, or an attacker could mount a Denial of Service by making a script run for a very long time.
To prevent this, PHP ships with default settings that stop a script from running indefinitely: execution is aborted once the timeout elapses without a response being returned.
Usually these limits are sufficient; having to copy a 10 GB file within a single script run is an unusual case.
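If editing php.ini is not an option, the same limit can also be raised from inside the script for a single request. This is a sketch: set_time_limit() and ini_set() are standard PHP, but some shared hosts disable them, in which case these calls silently fail.

```php
<?php
// Raise the execution limit for the current request only (in seconds).
// Passing 0 removes the limit entirely; returns false if the host forbids it.
set_time_limit(180);

// Equivalent via the ini setting:
ini_set('max_execution_time', '180');
```

Note that PHP run from the command line (php-cli) defaults to max_execution_time = 0 (no limit), so running the copy as a console command sidesteps the web timeout altogether.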
Temporary fix
10000 MB / 60 MBps ≈ 167 seconds; I rounded up to 180 seconds to leave a margin.
In your php.ini, under the Resource Limits section, there are timeout parameters. Uncomment them and set them to 180 seconds:
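Assuming the stock php.ini layout, the relevant directives look like this (max_execution_time is the wall-clock limit for the script itself; max_input_time bounds how long PHP may spend parsing request data):

```ini
; php.ini — Resource Limits
max_execution_time = 180   ; maximum execution time of each script, in seconds
max_input_time = 180       ; maximum time each script may spend parsing request data
```

Remember to restart your web server (or PHP-FPM) after editing php.ini so the new values take effect.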