PHP: iterating through a huge array from MySQL


I am processing a large database move for a client. We have a query which pulls a large amount of data, which needs to be processed and then inserted into another table. The original query joins several tables together to get the data into the required format.
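To give a sense of its shape, the source query looks something like this (the table and column names here are made up purely for illustration):

$sql = "SELECT a.col1, b.col2, c.col3 /* ...around 30 columns in total... */
        FROM table_a a
        JOIN table_b b ON b.a_id = a.id
        JOIN table_c c ON c.a_id = a.id";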

We are using our own database class to load the entire result of the query (about 600,000 rows with 30 columns of data, mainly strings) into an array. This array is passed into another class (which extends ArrayIterator) to do some manipulation and then insert each row into the target database table.
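Conceptually, getArray() buffers the full result set in PHP memory in one go, roughly like this (shown with PDO purely for illustration; our actual class is more involved):

$pdo  = new PDO('mysql:host=localhost;dbname=source_db', 'user', 'pass');
$stmt = $pdo->query($sql);
$rows = $stmt->fetchAll(PDO::FETCH_ASSOC); // all ~600,000 rows held in memory at once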

What we are finding is that it can process small data sets very well and is reasonably quick, but with this import of over half a million rows it seems to be getting increasingly slower. We have calculated an estimated time remaining using the time it took to process past imports, but it has been estimating 2 hours remaining for the past 3 hours.
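For reference, the remaining-time estimate is calculated along these lines (variable names are illustrative):

$elapsed   = microtime(true) - $startTime;                   // seconds since the import began
$rate      = $processed / max($elapsed, 1);                  // rows processed per second so far
$remaining = ($totalRows - $processed) / max($rate, 0.001);  // projected seconds left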

Initially it was processing a hundred or so rows per second; now it is only managing 2-3.

Here is an example of our code:

$sql = "some query";
$array = new Import($sql);

foreach($array as $data){
    $db->insert('table', $data);
}
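For context, $db->insert() performs one single-row INSERT (one round trip to MySQL) per iteration; it behaves roughly like this simplified stand-in (insertRow and the PDO usage are illustrative, not our actual class):

function insertRow(PDO $pdo, $table, array $data) {
    $columns      = implode(', ', array_keys($data));
    $placeholders = implode(', ', array_fill(0, count($data), '?'));
    $stmt = $pdo->prepare("INSERT INTO $table ($columns) VALUES ($placeholders)");
    $stmt->execute(array_values($data));
}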

A demo of our iterator class:

class Import extends ArrayIterator {

    public function __construct($db, $sql) {
        // the database class runs the query and returns the full result set as an array
        $array = $db->getArray($sql);

        parent::__construct($array);
    }

    public function current() {
        $data = parent::current();

        // ...some data processing...

        return $data;
    }

}

Any ideas as to what is causing the gradual slowdown? Any suggestions for improving its efficiency?
