Run 20K cURL requests in sequence and save the data to Excel: fails


I need to run around 20K cURL requests, fetch the data from each response, and save it to an Excel file using Spout. Importantly, the requests must run in sequence and the data must be saved in the same sequence.

I have set the PHP timeout to 0, but it still fails: the script shows an empty page with no warning and no success message, and no products are saved in the xlsx file. If I run it for 1,000 products or so, it works fine. How can I fix this?
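A blank page with no warning usually means PHP hit a fatal error (often the memory limit) while display_errors is off, so my first step is to surface it. A minimal sketch of what I am adding at the top of the script; lifting the memory cap entirely is an assumption for testing, not a fix:

// Sketch: surface the suppressed fatal error and lift limits for testing.
ini_set('display_errors', '1');
error_reporting(E_ALL);
set_time_limit(0);             // no execution time limit
ini_set('memory_limit', '-1'); // assumption: remove the memory cap while debugging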

After getting the XML back from each cURL request, I run a foreach to extract the required data:

use Box\Spout\Writer\Common\Creator\WriterEntityFactory;

$writer = WriterEntityFactory::createXLSXWriter();
$writer->openToFile('products.xlsx');

foreach ($xml->Body->product as $product) {
    // Cast the SimpleXMLElement to a string so Spout receives a scalar cell value
    $values = [(string) $product];
    $rowFromValues = WriterEntityFactory::createRowFromArray($values);
    $writer->addRow($rowFromValues);
}

$writer->close();
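Since order matters, my idea is to open the writer once and write each row as soon as its response arrives, so rows stay in request order and memory stays flat. A sketch of the outer loop I have in mind; $productUrls and buildSoapBody() are hypothetical stand-ins for my real inputs:

$writer = WriterEntityFactory::createXLSXWriter();
$writer->openToFile('products.xlsx');

// $productUrls and buildSoapBody() are hypothetical stand-ins for the real inputs.
foreach ($productUrls as $url) {
    $response = getCurl($url, buildSoapBody($url));
    if ($response === false) {
        continue; // skip failed requests; the remaining rows keep their order
    }
    $xml = simplexml_load_string($response);
    if ($xml === false) {
        continue; // skip unparseable responses
    }
    foreach ($xml->Body->product as $product) {
        $writer->addRow(WriterEntityFactory::createRowFromArray([(string) $product]));
    }
    unset($xml, $response); // free each response before the next request
}

$writer->close();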

function getCurl($url, $soapBody) {
    $curl = curl_init();
    curl_setopt_array($curl, array(
        CURLOPT_URL => $url,
        CURLOPT_RETURNTRANSFER => true,   // return the body instead of printing it
        CURLOPT_ENCODING => "",
        CURLOPT_MAXREDIRS => 10,
        CURLOPT_TIMEOUT => 30,            // per-request timeout in seconds
        CURLOPT_HTTP_VERSION => CURL_HTTP_VERSION_1_1,
        CURLOPT_CUSTOMREQUEST => "POST",
        CURLOPT_POSTFIELDS => $soapBody,
        CURLOPT_HTTPHEADER => array(
            "Content-Type: text/xml",
            "cache-control: no-cache"
        ),
    ));
    $response = curl_exec($curl);
    $err = curl_error($curl);
    curl_close($curl); // always release the handle, even when the request failed

    if ($err) {
        echo "cURL Error #: " . $err;
        return false;
    }
    return $response;
}
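Opening and closing a fresh handle 20K times also adds overhead, so I may rework the function around one reused cURL handle to keep the connection alive. A sketch; getCurlReusable is a hypothetical rename, not code I already run:

// Sketch: reuse one cURL handle across all requests (connection keep-alive)
// instead of curl_init()/curl_close() per request.
function getCurlReusable($curl, $url, $soapBody) {
    curl_setopt_array($curl, array(
        CURLOPT_URL => $url,
        CURLOPT_RETURNTRANSFER => true,
        CURLOPT_TIMEOUT => 30,
        CURLOPT_CUSTOMREQUEST => "POST",
        CURLOPT_POSTFIELDS => $soapBody,
        CURLOPT_HTTPHEADER => array("Content-Type: text/xml"),
    ));
    $response = curl_exec($curl);
    return curl_errno($curl) ? false : $response;
}

$curl = curl_init();
// ... call getCurlReusable($curl, $url, $soapBody) inside the loop ...
curl_close($curl);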