NodeJS and CSVtoJSON file stream


Expected functionality of the code:

  1. receive CSV file
  2. write it to a temporary directory
  3. convert that file to JSON by using csvtojson library
  4. add prefix data = to the file
  5. create JS file as a result, and write it to a different directory (example output sketched below)
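
For illustration, with downstreamFormat: "array" the generated data.js should hold a single assignment of a JSON array. A minimal sketch of the expected file (the rows and column names here are made up; the real ones depend on the uploaded CSV):

data = [
{"sku":"A-100","name":"Widget","price":"9.99"},
{"sku":"A-101","name":"Gadget","price":"14.50"}
]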

The process works fine with short CSV files. When larger files are sent (CSV files with more than 20,000 rows), the resulting data is truncated to roughly 1,800-2,500 rows. When I start the app locally and upload a file for the first time, everything works fine even with large files. Everything is also fine if I change anything in the readStream part of the code and the server restarts, so my guess is that the read stream needs to be reset somehow. I used this documentation to implement that part.
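
For reference, a minimal way to check whether both streams actually complete is to attach listeners for the standard Node.js stream events (the variable names match the handler below):

readStream.on("error", (err) => console.error("read stream error:", err));
readStream.on("close", () => console.log("read stream closed"));
writeStream.on("error", (err) => console.error("write stream error:", err));
writeStream.on("finish", () => console.log("write stream finished"));

If "finish" never fires for large uploads, something is ending the pipeline early rather than silently dropping rows.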

EDIT 03-09-2020: Once deployed to the server, the app manages to handle even large CSV files.

router.post("/", upload.single("productData"), (req, res, next) => {
  const product = new Product({
    _id: new mongoose.Types.ObjectId(),
    productData: req.file.path,
  });
  product
    .save()
    .then(() => {
      const nameOfNewDirectory = req.body.dirName;
      const dir = `./someDirectory/${nameOfNewDirectory}`;

      if (!fs.existsSync(dir)) {
        fs.mkdirSync(dir);
      }

      const readStream = fs.createReadStream(req.file.path);
      // dir already starts with "./", so no extra "./" prefix here
      const writeStream = fs.createWriteStream(`${dir}/data.js`, {
        flags: "w",
      });

      // this write is queued before the piped data, so the prefix lands first
      writeStream.write("data = ", () => {
        console.log("prefix to data.js is written");
      });

      readStream
        .pipe(
          CSVToJSON({
            maxRowLength: 65535,
            checkColumn: true,
            downstreamFormat: "array",
            ignoreEmpty: false,
            delimiter: "auto",
          })
        )
        .pipe(writeStream);

      // reply once the output file has been fully flushed
      writeStream.on("finish", () => {
        res.status(201).json({ message: "data.js created" });
      });
    })
    .catch(next); // forward save errors instead of hanging the request
});
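
A more defensive variant of the streaming part, sketched with Node's built-in stream.pipeline (available since Node 10): it wires the same stages together but reports an error from any stage in a single callback, which .pipe() does not do on its own. The csvtojson options are unchanged from the handler above:

const { pipeline } = require("stream");

const writeStream = fs.createWriteStream(`${dir}/data.js`, { flags: "w" });
writeStream.write("data = "); // queued ahead of the converted rows

pipeline(
  fs.createReadStream(req.file.path),
  CSVToJSON({
    maxRowLength: 65535,
    checkColumn: true,
    downstreamFormat: "array",
    ignoreEmpty: false,
    delimiter: "auto",
  }),
  writeStream,
  (err) => {
    if (err) {
      console.error("CSV-to-JS pipeline failed:", err);
    } else {
      console.log("data.js written completely");
    }
  }
);

If the truncation comes from an error that .pipe() swallows, this callback should surface it.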