Processing a CSV file with an unequal number of columns in Node.js


I have the following function, which processes my CSV file. Unfortunately, one column of the file also uses a comma as a thousands separator (I have no influence over the exported CSV file or its structure). So in every file, from a certain row onward, there is one extra column.

In the on('data', ...) handler I already fix this value by joining the two fields together and deleting the redundant one. But in the end this still produces rows with an extra column: deleting the field just leaves the 4th column empty.

I would like every field to shift one position to the left when a field is deleted. Is it possible to manipulate this? Or do I need an extra function that post-processes the output and ignores all null fields?

const fs = require('fs');
const csv = require('csv-parser'); // assuming the csv-parser package

function readLines(file, options, cb) {
  const results = [];
  fs.createReadStream(file)
    .pipe(csv(options))
    .on('data', (data) => {
      // A split thousands value produces one extra column (59 instead of 58)
      if (Object.keys(data).length === 59) {
        data['2'] = data['2'] + data['3'];
        delete data['3']; // leaves a gap: keys jump from '2' to '4'
      }
      results.push(data);
    })
    .on('end', () => {
      cb(results);
    });
}
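One way to get the "shift left" behaviour is to rebuild each row from its values, so the keys are re-numbered without gaps. This is a minimal sketch, assuming csv-parser-style rows keyed '0', '1', …; normalizeRow and the expectedColumns parameter are illustrative names, not part of the original code:

```javascript
// Merge the split thousands field and re-number the keys so that a
// row with one extra column collapses back to the expected width.
// The merge position (index 2) matches the question's data['2']/data['3'].
function normalizeRow(data, expectedColumns) {
  const values = Object.values(data);
  if (values.length === expectedColumns + 1) {
    // Re-join the split value: '1' + ',234' -> '1,234'
    values.splice(2, 2, values[2] + values[3]);
  }
  // Rebuild the object so keys run 0..expectedColumns-1 with no gap
  return Object.fromEntries(values.map((v, i) => [String(i), v]));
}
```

Calling this from the on('data') handler instead of the delete would avoid the empty 4th column entirely.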

Answer:

I've fixed it by filtering the returned rows in the callback function:

// Each row object becomes a compact array of its non-empty values
cb(results.map((r) => {
  return Object.values(r).filter((x) => {
    return x != null && x !== "";
  });
}));

Probably not the most efficient, but the best I could come up with so far.
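For illustration, the filtering step can be isolated into a small helper (compactRows is a hypothetical name, not from the original answer) that turns each row object into a compact array of its non-empty values, assuming rows may contain empty strings or null placeholders:

```javascript
// Turn each row object into an array, dropping null and empty-string
// fields so downstream consumers see no gaps from deleted columns.
function compactRows(results) {
  return results.map((r) =>
    Object.values(r).filter((x) => x != null && x !== '')
  );
}
```

Note that this drops legitimately empty fields too, which shifts later columns left; if the data can contain genuinely blank cells, re-numbering the keys in the data handler is the safer approach.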