I currently have a large array of data that I need to write to a new Excel file; however, every time I close the stream to the file, the process fails with the error
FATAL ERROR: Reached heap limit Allocation failed - JavaScript heap out of memory
I have been able to perform this operation with a smaller set of data, but I have been having trouble with what is probably around 1M+ rows of data. The code below is what I previously got working with the smaller data set:
return new Promise(async (resolve, reject) => {
    const file = new Excel.Workbook();
    file.addWorksheet("Other");
    file.addWorksheet("Apparel & Merchandise");
    file.addWorksheet("ATV, Side-by-Side & UTV Parts");
    file.addWorksheet("Motorcycle & Scooter Parts");
    file.addWorksheet("Performance & Racing Parts");
    file.addWorksheet("RV, Trailer, & Camper Parts");
    file.addWorksheet("Car & Truck Parts & Accessories");
    file.addWorksheet("Automotive Tools & Supplies");
    file.addWorksheet("Safety & Security Accessories");

    sheetData.forEach((aspectArray) => {
        const workSheet = file.getWorksheet(aspectArray.sheetIdentifier);
        workSheet.addRow(["CategoryID", "Parent Category", "Leaf of the Parent Category", "Category Name", "Aspect Constraint", "Aspect Value", "Aspect Cardinality", "Aspect Mode", "Required?"]);
        workSheet.addRows(aspectArray.data);
    });

    await file.xlsx.writeFile(os.tmpdir() + "/eBayUSMotorsItemAspects.xlsx").then((err, resolution) => {
        if (err) {
            reject(console.log(err));
        } else {
            delayFunctionCall(3000).then(() => {
                email.emailWorkBook();
            });
            resolve("The workbook is being emailed");
        }
    });
})
The above code works fine, but it does not run with the larger set of data. Like I said, the "JavaScript heap out of memory" error occurs when working with more data. From my testing with some console.log() calls, the process seems to fail at the file.xlsx.writeFile() call, or in the example below at writeableStream.end(). I have since tried writing the data to a stream and then closing the stream once I am finished, like so:
const writeableStream = fs.createWriteStream(os.tmpdir() + "/eBayUSItemAspects.xlsx");

return new Promise(async (resolve, reject) => {
    const file = new Excel.Workbook();
    file.addWorksheet("Business & Industrial");
    file.addWorksheet("Consumer Electronics");
    file.addWorksheet("Home & Garden");
    file.addWorksheet("Sporting Goods");
    file.addWorksheet("Toys & Hobbies");

    sheetData.forEach((aspectArray) => {
        const workSheet = file.getWorksheet(aspectArray.sheetIdentifier);
        workSheet.addRow(["CategoryID", "Parent Category", "Leaf of the Parent Category", "Category Name", "Aspect Constraint", "Aspect Value", "Aspect Cardinality", "Aspect Mode", "Required?"]);
        aspectArray.data.forEach((value) => {
            file.xlsx.write(writeableStream);
            workSheet.addRow(value).commit();
        });
    });

    writeableStream.end();
    writeableStream.on('finish', (err, results) => {
        if (err) {
            reject(console.log(err));
        } else {
            delayFunctionCall(3000).then(() => {
                email.emailWorkBook();
            });
            resolve("The workbook is being emailed");
        }
    });
})
sheetData is an array of objects. Each object consists of a sheetIdentifier property and an array of data values to be added to the sheet matching that sheetIdentifier. Unfortunately, the data cannot be streamed directly from the API I am getting the information from, because the API response data needs to be manipulated for our business needs first.
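For reference, sheetData has roughly the following shape. The sheet names correspond to the worksheets created above; the row values here are made up, and each data entry is whatever addRow() accepts (in my case an array of cell values matching the header row):

const sheetData = [
    {
        sheetIdentifier: "Other",
        data: [
            // One array per row, in the same column order as the header row
            ["12345", "eBay Motors", "No", "Other", "...", "...", "...", "...", "Yes"],
            // ...many thousands more rows
        ]
    },
    {
        sheetIdentifier: "Apparel & Merchandise",
        data: [ /* more rows */ ]
    }
];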
How can I write the data to an Excel file without the process running out of memory? Is there something I am missing about file streams that is using an unnecessary amount of memory?
Instead of building the whole workbook in memory and then pushing its contents to the file's stream at the end, I needed to create a writable stream first and then send the data to that stream as it is produced.
Each row of data is "committed" to the file stream right after it is added, so it does not have to stay on the heap. Once all of the data has been sent to the writable stream, the workbook itself is committed and the file is finalized.
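Below is a minimal sketch of that commit-per-row pattern using ExcelJS's streaming WorkbookWriter. The sheet identifiers, header row, and file path are reused from the question; the option values and the writeAspectWorkbook wrapper are my own assumptions, not a drop-in copy of the final production code:

const Excel = require("exceljs");
const fs = require("fs");
const os = require("os");

async function writeAspectWorkbook(sheetData) {
    const writeableStream = fs.createWriteStream(os.tmpdir() + "/eBayUSItemAspects.xlsx");

    // The streaming writer flushes rows to the stream as they are committed
    // instead of keeping the entire workbook in memory.
    const file = new Excel.stream.xlsx.WorkbookWriter({
        stream: writeableStream,
        useStyles: false,
        useSharedStrings: false
    });

    sheetData.forEach((aspectArray) => {
        const workSheet = file.addWorksheet(aspectArray.sheetIdentifier);
        workSheet.addRow(["CategoryID", "Parent Category", "Leaf of the Parent Category", "Category Name", "Aspect Constraint", "Aspect Value", "Aspect Cardinality", "Aspect Mode", "Required?"]).commit();

        aspectArray.data.forEach((value) => {
            // Commit each row so it is written out and released from memory
            workSheet.addRow(value).commit();
        });

        // Finish the worksheet once all of its rows have been added
        workSheet.commit();
    });

    // Finalize the workbook; this writes the remaining parts and ends the
    // underlying stream, so there is no separate writeableStream.end() call.
    await file.commit();

    return "The workbook is being emailed";
}

Once file.commit() resolves, the .xlsx on disk is complete, so email.emailWorkBook() can be called from there just as in the original code.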