Download large csv file using json2csv in node js


I am using the json2csv package to download my data. It works fine for small data sets, but once there are more than roughly 300 records it crashes. Here is my code:

const csvString = json2csv.parse(responseData);
res.setHeader('Content-disposition', 'attachment; filename=shifts-report.csv');
res.set('Content-Type', 'text/csv');
res.status(200).send(csvString);

This code works perfectly fine on small data sets. How can I stream the data when there is a large amount of it, using the same approach I followed?


I am trying something like this, but it gives me an error saying it cannot set the headers:

const { Writable } = require('stream');

const headers = {
  'Content-Type': 'text/csv',
  'Transfer-Encoding': 'chunked',
  'Content-Disposition': 'attachment; filename="file.csv"'
};
res.writeHead(200, headers);
res.flushHeaders();

const stream = new Writable({
  write(chunk, encoding, callback) {
    res.write(chunk);
    callback();
  }
});

try {
  stream.write(file, 'utf-8'); // `file` holds the CSV contents
} catch (error) {
  console.log('error', error);
}
res.end();

There is 1 answer below.


You should use the json2csv-stream package instead.

npm install json2csv-stream

const fs = require('fs');
const MyStream = require('json2csv-stream');


// create the one-time-use transform stream
const parser = new MyStream();

// create the read and write streams
const reader = fs.createReadStream('data.json');
const writer = fs.createWriteStream('out.csv');

// You can use writer to write it to the FS,
// or stream the content to the response object.
// Note that this code needs to run inside the route handler
// that sends the CSV data back to the user.
reader.pipe(parser).pipe(res);

//reader.pipe(parser).pipe(writer);

For more details, check here.