I'm using Spout to read an Excel file with over 500,000 records (7 columns each, not too much data per row).
The problem is that my script keeps timing out. I've tried raising the limits and it gets better, but so far I haven't managed a complete insert, only partial ones of around 50,000 rows.
This is not an option for me. Is there any way to split the Excel file in code? From what I can see, just processing the file, even without inserting into the database, is already slow and times out.
So... any advice?
Thanks!
You can try calling set_time_limit() repeatedly, for example after every row you insert. It resets the time limit each time you call it. If your server administrator has set up a global time limit, though, this won't let you exceed it.

But inserting half a million rows one by one into an InnoDB table in MySQL is inherently slow, because MySQL has to do an autocommit after every row.
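A minimal sketch of that first point, resetting the timer on every row while streaming the file with Spout (this assumes the Spout 3.x reader API; big-file.xlsx and insertRow() are placeholders for your own file and insert logic):

```php
<?php
require 'vendor/autoload.php';

use Box\Spout\Reader\Common\Creator\ReaderEntityFactory;

$reader = ReaderEntityFactory::createXLSXReader();
$reader->open('big-file.xlsx'); // placeholder path

foreach ($reader->getSheetIterator() as $sheet) {
    foreach ($sheet->getRowIterator() as $row) {
        set_time_limit(30);         // restart the 30-second clock for each row
        insertRow($row->toArray()); // placeholder for your insert logic
    }
}

$reader->close();
```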
If you do the inserts in batches you'll gain a lot of speed. For example, you are probably doing something like this now:
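Something along these lines, assuming PDO and a hypothetical table my_table with three of your columns:

```php
// One INSERT per row: MySQL autocommits after each one, which is what hurts.
$stmt = $pdo->prepare('INSERT INTO my_table (col1, col2, col3) VALUES (?, ?, ?)');
foreach ($rows as $row) {
    $stmt->execute($row);
}
```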
Instead do this:
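For example, something like this (again assuming PDO and the same hypothetical my_table; array_chunk() groups the rows five at a time so each INSERT carries five rows):

```php
// One multi-row INSERT per batch of five: one commit per five rows.
foreach (array_chunk($rows, 5) as $batch) {
    $placeholders = implode(', ', array_fill(0, count($batch), '(?, ?, ?)'));
    $stmt = $pdo->prepare("INSERT INTO my_table (col1, col2, col3) VALUES $placeholders");
    $stmt->execute(array_merge(...$batch)); // flatten the batch into one flat parameter list
}
```

The batch size doesn't have to be five; larger batches amortise the commit cost further, as long as each query stays under the packet limit mentioned next.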
That way you'll incur the commit overhead on MySQL once for every five rows rather than once for every single one. Notice that you can put many rows into a single INSERT, not just five; MySQL's only limit on query length is the max_allowed_packet size, which you can check with SHOW VARIABLES LIKE 'max_allowed_packet';

Of course this is a little more complex to program, but it's much faster.
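If you want to check that limit from PHP before picking a batch size, a quick sketch (again assuming a PDO connection in $pdo):

```php
// Read the server's maximum allowed query size in bytes.
$row = $pdo->query("SHOW VARIABLES LIKE 'max_allowed_packet'")->fetch(PDO::FETCH_ASSOC);
echo $row['Value'], " bytes\n";
```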