I'm trying to parse multiple CSV files into one JSON file.
My problem is that I cannot reuse the read stream, as it does not pick up the next file it should be reading from. On each iteration of the loop, I keep getting the rows of the first CSV file passed in.
Here is the code I'm using: it reads all CSV files in a directory, wraps each createReadStream in a Promise, awaits the result, and then moves on to parse the next CSV file. (It does not read the values of the next CSV file, though; it just returns the result of the first CSV file each time.)
const path = require("path");
const fs = require("fs");
const parse = require("csv-parse");

const parser = parse({
  delimiter: ","
});

function parseCSVFile(file) {
  return new Promise((resolve, reject) =>
    fs
      .createReadStream(file)
      .pipe(parser)
      .on("data", (data) => {
        console.log(data);
      })
      .on("end", () => {
        resolve();
      })
      .on("error", (error) => reject(error))
  );
}

(async () => {
  const csvFiles = fs.readdirSync(path.join(__dirname, "csvFiles"));
  for (const file of csvFiles) {
    await parseCSVFile(path.join(__dirname, "csvFiles", file));
  }
})();
Any help/advice is much appreciated!