I am implementing AMPHP for multithreading, replacing an existing "multicurl callback" solution in which new PHP processes were spawned by making an external call back to the server from within the code.
Given that I have an 8-core CPU, what happens when I submit 50 Tasks (implementing Amp\Parallel\Worker\Task), each of which calls file_get_contents on a URL?
Will 50 threads/processes be created, each hogging a core until its file_get_contents is done? Or will they be suspended and resumed as needed?
From this thread, Running multiple processes in parallel in php, it sounds like they will hog cores...
If I switch from file_get_contents to the AMPHP http-client, will this behaviour change (because it's "non-blocking"), even if I don't use Promises?
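(By "use Promises" I mean driving the event loop directly with the http-client, without a worker pool, roughly like this sketch, assuming amphp/http-client v4 and placeholder URLs:)

```php
<?php
// Sketch of "using Promises" with the http-client directly on the event loop,
// no worker pool involved. Assumes amphp/http-client v4; URLs are placeholders.

require __DIR__ . '/vendor/autoload.php';

use Amp\Http\Client\HttpClientBuilder;
use Amp\Http\Client\Request;
use Amp\Loop;
use Amp\Promise;
use function Amp\call;

Loop::run(function () {
    $client = HttpClientBuilder::buildDefault();

    $promises = [];
    foreach (['https://example.com/a', 'https://example.com/b'] as $url) {
        // Each request yields control back to the event loop while waiting on
        // the network, so a single thread can interleave many requests.
        $promises[] = call(function () use ($client, $url) {
            $response = yield $client->request(new Request($url));

            return yield $response->getBody()->buffer();
        });
    }

    // Await all responses; the requests run concurrently on one thread.
    $bodies = yield Promise\all($promises);

    var_dump(array_map('strlen', $bodies));
});
```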
I don't write StackOverflow questions very much, so please let me know if I need to edit my question for clarification.
This is only a partial answer to this question, but I have run some tests showing that simply replacing file_get_contents with the AMPHP http-client (HttpClientBuilder) gives significant improvements.
My test function is:
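(A minimal sketch, assuming amphp/parallel v1, its enqueue() helper, and the FileReadTask class shown below; the function name testParallelFetch and the timing output are illustrative only.)

```php
<?php
// Hypothetical test harness. Assumes amphp/parallel v1: enqueue() dispatches
// each Task to the default worker pool and returns a Promise for its result.

require __DIR__ . '/vendor/autoload.php';

use Amp\Promise;
use function Amp\Parallel\Worker\enqueue;

function testParallelFetch(array $urls): array
{
    $start = microtime(true);

    $promises = [];
    foreach ($urls as $url) {
        $promises[] = enqueue(new FileReadTask($url));
    }

    // Block the main script until every Task has completed.
    $results = Promise\wait(Promise\all($promises));

    echo 'Fetched ', count($results), ' URLs in ', microtime(true) - $start, "s\n";

    return $results;
}
```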
Where FileReadTask is:
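(A minimal sketch, assuming amphp/parallel v1.4+, where Task::run may return a coroutine, and amphp/http-client v4; in the blocking variant, run() simply returns file_get_contents($this->url) instead.)

```php
<?php
// Hypothetical FileReadTask. Assumes amphp/parallel v1.4+ (Task::run may
// return a promise or coroutine) and amphp/http-client v4. The blocking
// variant would just `return file_get_contents($this->url);` here instead.

use Amp\Http\Client\HttpClientBuilder;
use Amp\Http\Client\Request;
use Amp\Parallel\Worker\Environment;
use Amp\Parallel\Worker\Task;
use function Amp\call;

class FileReadTask implements Task
{
    /** @var string */
    private $url;

    public function __construct(string $url)
    {
        $this->url = $url;
    }

    public function run(Environment $environment)
    {
        // Returning a coroutine lets the worker's own event loop drive the
        // non-blocking HTTP request instead of blocking on file_get_contents().
        return call(function () {
            $client = HttpClientBuilder::buildDefault();

            $response = yield $client->request(new Request($this->url));

            return yield $response->getBody()->buffer();
        });
    }
}
```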
This suggests that Amp http-client is "less blocking" than file_get_contents, but only when you use it in the environment of a worker.