I am using the json2csv package to let users download my data. It works fine for small data sets, but if there are more than 300 records it crashes. Here is my code:
const csvString = json2csv.parse(responseData);
res.setHeader('Content-disposition', 'attachment; filename=shifts-report.csv');
res.set('Content-Type', 'text/csv');
res.status(200).send(csvString);
This code works perfectly on small data sets. How can I stream the data when there is a large amount, using the same approach I followed above?
I tried something like the following, but it gives me an error saying that the headers cannot be set.
const headers = {
  'Content-Type': 'text/csv',
  'Transfer-Encoding': 'chunked',
  'Content-Disposition': 'attachment; filename="file.csv"'
};
res.writeHead(200, headers);
res.flushHeaders();

const stream = new Writable({
  write(chunk, encoding, callback) {
    res.write(chunk);
    callback();
  }
});

try {
  stream.write(file, 'utf-8');
} catch (error) {
  console.log('error', error);
}
res.end();
You should use the json2csv streaming API (its Transform stream) instead of building the whole CSV string in memory with json2csv.parse. For more details, check the json2csv documentation on streaming.