What can I do to prevent running out of memory when working with large datasets (around 11,000 records)?
The problem
When using mPDF with PHP, I am trying to create PDF files from large datasets (around 11,000 records), which results in errors like the following (the memory values fluctuate):
Fatal error: Out of memory (allocated 1197211648) (tried to allocate 44 bytes) in projectfolder\mpdf\mpdf.php on line 24132
What I tried
It works fine on smaller datasets. I have searched Stack Overflow and other Google results, which led me to make the following changes to my php.ini file:
memory_limit=-1
max_execution_time=0
post_max_size=0
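As a sanity check (just a sketch), I also verified at runtime that the script actually sees these values, since different PHP SAPIs can load different php.ini files (e.g. XAMPP's Apache module vs. the PHP CLI):

```php
<?php
// Confirm the php.ini changes took effect for the SAPI running this script.
var_dump(ini_get('memory_limit'));       // expect string "-1"
var_dump(ini_get('max_execution_time')); // expect string "0"
var_dump(ini_get('post_max_size'));      // expect string "0"
```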
My laptop specifications (where the script is being run):
- 8GB RAM
- i7 processor
- 64bit OS
- XAMPP
mPDF is, unfortunately, not optimized to work with large datasets that result in large HTML documents.
If you can, I would recommend generating multiple smaller PDF documents and later concatenating them with an external tool such as Ghostscript; see the sketch below.
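Here is a minimal sketch of that approach. It assumes the legacy include-based mPDF visible in your error path (mpdf/mpdf.php) and a hypothetical fetchRecords($offset, $limit) helper standing in for your own data access; adjust both to your setup:

```php
<?php
include 'mpdf/mpdf.php';

$chunkSize = 1000;   // records per PDF; tune this to what fits in memory
$total     = 11000;  // total record count (from the question)
$parts     = array();

for ($offset = 0; $offset < $total; $offset += $chunkSize) {
    $rows = fetchRecords($offset, $chunkSize); // hypothetical data-access helper

    // Build HTML for this chunk only, never for the whole dataset
    $html = '<table>';
    foreach ($rows as $row) {
        $html .= '<tr><td>' . htmlspecialchars($row['name']) . '</td></tr>';
    }
    $html .= '</table>';

    // A fresh mPDF instance per chunk keeps its internal buffers small
    $mpdf = new mPDF();
    $mpdf->WriteHTML($html);

    $file = sprintf('part_%04d.pdf', $offset / $chunkSize);
    $mpdf->Output($file, 'F'); // destination 'F' writes the PDF to disk
    $parts[] = $file;

    unset($mpdf, $html, $rows); // release memory before the next chunk
}
```

You can then merge the parts with Ghostscript, for example `gs -dBATCH -dNOPAUSE -q -sDEVICE=pdfwrite -sOutputFile=merged.pdf part_0000.pdf part_0001.pdf …` (on Windows the console executable is gswin64c or gswin32c rather than gs).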