Laravel Excel queue consumes too much RAM


I have set up a Laravel queue to read Excel files using the Laravel Excel package, and it works great for small files.

But for large files (100+ MB, 400k+ records), it takes far too long and consumes almost 40 GB of the server's RAM.

I have set up Supervisor to run the queue:work command. My server has 60 GB of RAM. For small files it all works fine, but it does not work for large files.
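For reference, a Supervisor program for Laravel workers typically looks roughly like the sketch below (paths, process count, and limits are illustrative). The --memory and --max-jobs flags are worth noting here: queue:work is a long-running process, so restarting workers periodically keeps leaked memory from accumulating across jobs.

```ini
[program:laravel-worker]
process_name=%(program_name)s_%(process_num)02d
; Application path is illustrative. --memory stops the worker past 512 MB
; and --max-jobs after 500 jobs; Supervisor then restarts it cleanly.
command=php /var/www/app/artisan queue:work --sleep=3 --tries=3 --timeout=3600 --memory=512 --max-jobs=500
autostart=true
autorestart=true
numprocs=2
user=www-data
redirect_stderr=true
stdout_logfile=/var/www/app/storage/logs/worker.log
```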

I have also checked query times using Telescope, but no single query is taking a long time.

2 Answers

Accepted Answer

For all those who are facing this issue, I suggest using Spout. It works like a charm. I tried three PHP libraries for this, and in the end only Spout worked.

https://opensource.box.com/spout/

https://github.com/box/spout
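A minimal sketch of the streaming approach, assuming box/spout ^3.0 installed via Composer (the file path and row handling are illustrative). Spout reads one row at a time, so memory use stays flat regardless of file size:

```php
<?php

require 'vendor/autoload.php';

use Box\Spout\Reader\Common\Creator\ReaderEntityFactory;

// Streaming XLSX reader: rows are pulled lazily, one at a time.
$reader = ReaderEntityFactory::createXLSXReader();
$reader->open('/path/to/large-file.xlsx'); // illustrative path

foreach ($reader->getSheetIterator() as $sheet) {
    foreach ($sheet->getRowIterator() as $row) {
        $values = $row->toArray(); // plain array of cell values

        // Handle one row at a time here, e.g. buffer rows
        // and bulk-insert into the database every N rows.
    }
}

$reader->close();
```

Note that box/spout has since been archived; the maintained community fork is openspout/openspout, which keeps essentially the same API.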

Second Answer

At this point, there is no clear-cut answer to your problem. It depends heavily on the result you are after, and you will have to engineer your way through it.

One thing, off the top of my head, is to chunk or partition your large Excel files and feed the pieces to your queue. You could also make use of Laravel's job batching.
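A concrete way to do this with Laravel Excel itself is the WithChunkReading concern combined with ShouldQueue, which makes the package read the file in fixed-size chunks and push each chunk onto the queue as its own job. A minimal sketch (the Record model and column indexes are hypothetical):

```php
<?php

namespace App\Imports;

use App\Models\Record; // hypothetical model
use Illuminate\Contracts\Queue\ShouldQueue;
use Maatwebsite\Excel\Concerns\ToModel;
use Maatwebsite\Excel\Concerns\WithChunkReading;

class RecordsImport implements ToModel, WithChunkReading, ShouldQueue
{
    // Called once per spreadsheet row.
    public function model(array $row)
    {
        return new Record([
            'name'  => $row[0],
            'email' => $row[1],
        ]);
    }

    // Read the file 1,000 rows at a time so the whole
    // spreadsheet is never held in memory at once.
    public function chunkSize(): int
    {
        return 1000;
    }
}
```

Dispatching is then just Excel::import(new RecordsImport, $path); each chunk becomes a queued job. If you want progress tracking on top, the per-chunk jobs could be wrapped in Bus::batch() instead.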

Another thing you can introduce is a microservice setup, where this kind of heavy lifting is handed off to a separate, more powerful machine.
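A lightweight first step toward this, using only Laravel's own queue routing (ProcessLargeImport is a hypothetical job class), is to push heavy imports onto a dedicated queue that only the bigger machine consumes:

```php
<?php

use App\Jobs\ProcessLargeImport; // hypothetical job class

// Route the heavy import to a dedicated "heavy" queue.
// Only the more powerful machine runs a worker for it:
//   php artisan queue:work --queue=heavy
ProcessLargeImport::dispatch($filePath)->onQueue('heavy');
```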

But like I said, there is no single solution to a problem like this. You will have to measure and work it out for yourself.