how to make formidable not save to var/folders on nodejs and express app


I'm using formidable to parse incoming files and store them on AWS S3.

While debugging the code I found out that formidable first saves each file to disk under /var/folders/, and over time these leftover files pile up on disk, which could become a big problem later.

It was careless of me to use code without fully understanding it, and now I have to figure out how to either remove the parsed file after saving it to S3, or upload it to S3 without writing it to disk at all. But the question is: how do I do that?

I would appreciate it if someone could point me in the right direction.

This is how I handle the files:

import formidable, { Files, Fields } from 'formidable';

const form = new formidable.IncomingForm();
form.parse(req, async (err: any, fields: Fields, files: Files) => {
    const uploadUrl = await util
        .uploadToS3({
            file: files.uploadFile,
            pathName: 'myPathName/inS3',
            fileKeyName: 'file',
        })
        .catch((err) => console.log('S3 error =>', err));
});

There are 2 answers below.

BEST ANSWER

This is how I solved the problem:

When formidable parses the incoming multipart form data, I get access to all the details of the files, because each file has already been parsed and saved to local disk on the server/my computer. So, using the path formidable gives me, I unlink/remove that file with Node's built-in fs.unlink function. Of course, I remove the file only after it has been saved to AWS S3.

This is the code:

import fs from 'fs';
import formidable, { Files, Fields } from 'formidable';

const form = new formidable.IncomingForm();
form.multiples = true;

form.parse(req, async (err: any, fields: Fields, files: Files) => {
    const pathArray: string[] = [];
    try {
        // Record the temp path first, so the file is cleaned up
        // even if the S3 upload throws.
        pathArray.push(files.uploadFileName.path);
        const s3Url = await util.uploadToS3(files);
        // do something with the s3Url
    } catch (error) {
        console.log(error);
    } finally {
        // Remove the temp files formidable wrote to disk.
        pathArray.forEach((element: string) => {
            fs.unlink(element, (err: any) => {
                if (err) console.error('error:', err);
            });
        });
    }
});

I also found another solution, which you can take a look at here, but due to my architecture I found it slightly hard to implement without changing my original code (or, let's just say, I didn't fully understand the given implementation).


I think I found it. According to the docs (see options.fileWriteStreamHandler): "you need to have a function that will return an instance of a Writable stream that will receive the uploaded file data. With this option, you can have any custom behavior regarding where the uploaded file data will be streamed for. If you are looking to write the file uploaded in other types of cloud storages (AWS S3, Azure blob storage, Google cloud storage) or private file storage, this is the option you're looking for. When this option is defined the default behavior of writing the file in the host machine file system is lost."

const form = formidable({
    fileWriteStreamHandler: someFunction,
});

EDIT: My whole code

import formidable from "formidable";
import { Writable } from "stream";
import { Buffer } from "buffer";
import { v4 as uuidv4 } from "uuid";


export const config = {
  api: {
    bodyParser: false,
  },
};

const formidableConfig = {
  keepExtensions: true,
  maxFileSize: 10_000_000,
  maxFieldsSize: 10_000_000,
  maxFields: 2,
  allowEmptyFiles: false,
  multiples: false,
};

// promisify formidable
function formidablePromise(req, opts) {
  return new Promise((accept, reject) => {
    const form = formidable(opts);

    form.parse(req, (err, fields, files) => {
      if (err) {
        return reject(err);
      }
      return accept({ fields, files });
    });
  });
}

const fileConsumer = (acc) => {
  const writable = new Writable({
    write: (chunk, _enc, next) => {
      acc.push(chunk);
      next();
    },
  });

  return writable;
};

// inside the handler

export default async function handler(req, res) {
  const token = uuidv4();
  try {
    const chunks = [];
    const { fields, files } = await formidablePromise(req, {
      ...formidableConfig,
      // consume this, otherwise formidable tries to save the file to disk
      fileWriteStreamHandler: () => fileConsumer(chunks),
    });

    // do something with the files
  
    const contents = Buffer.concat(chunks);

    // NOTE: `storage` must be an initialized storage client
    // (e.g. firebase-admin's admin.storage() or @google-cloud/storage).
    const bucketRef = storage.bucket("your bucket");

    const file = bucketRef.file(files.mediaFile.originalFilename);

    await file.save(contents, {
      public: true,
      metadata: {
        contentType: files.mediaFile.mimetype,
        metadata: { firebaseStorageDownloadTokens: token },
      },
    });

    const [metadata] = await file.getMetadata();
    const fileName = metadata.name;

    const media_path = `https://firebasestorage.googleapis.com/v0/b/${bucketRef?.id}/o/${fileName}?alt=media&token=${token}`;
    console.log("File link", media_path);
  } catch (e) {
    // handle errors
    console.log("ERR PREJ ...", e);
  }
}