AWS S3 Angular 14 with Nodejs - Multi Part Upload sending the same ETag for every part


The backend Node.js controller looks like this:

const AWS = require('aws-sdk');

const S3 = new AWS.S3({
    // endpoint: "http://bucket.analysts24x7.com.s3-website-us-west-1.amazonaws.com",
    // accessKeyId: S3_KEY,
    // secretAccessKey: S3_SECRET,
    // region: process.env.POOL_REGION,
    apiVersion: '2006-03-01',
    signatureVersion: 'v4',
    // maxRetries: 10
});


exports.startUpload = (req, res) => {
    try {
        const filesData = JSON.parse(JSON.stringify(req.files));
        const eachFiles = Object.keys(filesData)[0];
        console.log(filesData[eachFiles]);
        let params = {
            Bucket: process.env.STORE_BUCKET_NAME,
            Key: filesData[eachFiles].name,
            // Body: Buffer.from(filesData[eachFiles].data.data, "binary"),
            ContentType: filesData[eachFiles].mimetype
            // ContentType: filesData[eachFiles].data.type
        };
        return new Promise((resolve, reject) => {
            S3.createMultipartUpload(params, (err, uploadData) => {
                if (err) {
                    reject(res.send({
                        error: err
                    }));
                } else {
                    resolve(res.send({ uploadId: uploadData.UploadId }));
                }
            });
        });
    } catch(err) {
        res.status(400).send({
            error: err
        })
    }
}

exports.getUploadUrl = async(req, res) => {
   try {
        let params = {
            Bucket: process.env.STORE_BUCKET_NAME,
            Key: req.body.fileName,
            PartNumber: req.body.partNumber,
            UploadId: req.body.uploadId
        }
        return new Promise((resolve, reject) => {
            S3.getSignedUrl('uploadPart', params, (err, presignedUrl) => {
                if (err) {
                    reject(res.send({
                        error: err
                    }));
                } else {
                    resolve(res.send({ presignedUrl }));
                }
            });
        })
   } catch(err) {
        res.status(400).send({
            error: err
        })
   }
}

exports.completeUpload = async(req, res) => {
   try {
        let params = {
            Bucket: process.env.STORE_BUCKET_NAME,
            Key: req.body.fileName,
            MultipartUpload: {
                Parts: req.body.parts
            },
            UploadId: req.body.uploadId
        }
        // console.log("-----------------")
        // console.log(params)
        // console.log("-----------------")
        return new Promise((resolve, reject) => {
            S3.completeMultipartUpload(params, (err, data) => {
                if (err) {
                    reject(res.send({
                        error: err
                    }));
                } else {
                    resolve(res.send({ data }));
                }
            })
        })
    } catch(err)  {
        res.status(400).send({
            error: err
        })
    };
}

Frontend Angular 14 code:

uploadSpecificFile(index) {
      const fileToUpload = this.fileInfo[index];
      const formData: FormData = new FormData();
      formData.append('file', fileToUpload);

      this.shared.startUpload(formData).subscribe({
        next: (response) => {
           const result = JSON.parse(JSON.stringify(response));
           this.multiPartUpload(result.uploadId, fileToUpload).then((resp) => {
              return this.completeUpload(result.uploadId, fileToUpload, resp);
           }).then((resp) =>  {
              console.log(resp);
           }).catch((err) => {
              console.error(err);
           })
        },
        error: (error) => {
           console.log(error);
        }
      })
    }

    multiPartUpload(uploadId, fileToUpload) {
      return new Promise((resolve, reject) => {
        const CHUNKS_COUNT = Math.floor(fileToUpload.size / CONSTANTS.CHUNK_SIZE) + 1;
        let promisesArray = [];
        let params = {};
        let start, end, blob;
        for (let index = 1; index < CHUNKS_COUNT + 1; index++) {
          start = (index - 1) * CONSTANTS.CHUNK_SIZE
          end = (index) * CONSTANTS.CHUNK_SIZE
          blob = (index < CHUNKS_COUNT) ? fileToUpload.slice(start, end) : fileToUpload.slice(start);
          // blob.type = fileToUpload.type;

          params = {
            fileName: fileToUpload.name,
            partNumber: index,
            uploadId: uploadId
          }

          console.log("Start:", start);
          console.log("End:", end);
          console.log("Blob:", blob);
          

          this.shared.getUploadUrl(params).subscribe({
            next: (response) => {
              const result = JSON.parse(JSON.stringify(response));
              
              // Send part aws server
              const options = {
                headers: { 'Content-Type': fileToUpload.type }
              }
  
              let uploadResp = axios.put(result.presignedUrl, blob, options); 
             
              promisesArray.push(uploadResp);

              if(promisesArray.length == CHUNKS_COUNT) {
                resolve(promisesArray)
              }
            },
            error: (error) => {
              console.log(error);
              reject(error);
            }
          })
        }
      })
    }

    async completeUpload(uploadId, fileToUpload, resp) {
      let resolvedArray = await Promise.all(resp)
     
      let uploadPartsArray = [];

      console.log("I am etag -----");
      console.log(resolvedArray);

      resolvedArray.forEach((resolvedPromise, index) => {
        uploadPartsArray.push({
          ETag: resolvedPromise.headers.etag,
          PartNumber: index + 1
        })
      })

      // Complete upload here
      let params = {
        fileName: fileToUpload.name,
        parts: uploadPartsArray,
        uploadId: uploadId
      }
      return new Promise((resolve, reject) => {
        this.shared.completeUpload(params).subscribe({
          next: (response) => {
            resolve(response);
          },
          error: (error) => {
            reject(error);
          }
        })
      })

    }

What I am trying to do:

  1. Initiate a multipart upload (API: /start-upload) to get the uploadId.
  2. Request a presigned URL for each of the object's parts (API: /get-upload-url).
  3. PUT each blob to its presigned URL to get back an ETag.
  4. Complete the multipart upload (API: /complete-upload) by sending the full list of parts.
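Step 4 depends on collecting one distinct ETag per uploaded part. As a minimal sketch (the `responses` array below is hypothetical, standing in for the resolved axios PUT results), the Parts list for completeMultipartUpload can be assembled like this:

```javascript
// Build the Parts array that S3's completeMultipartUpload expects.
// In a healthy upload, each response carries a DIFFERENT ETag header.
function buildPartsList(responses) {
  return responses.map((resp, index) => ({
    ETag: resp.headers.etag,
    PartNumber: index + 1, // S3 part numbers are 1-based
  }));
}

// Hypothetical resolved axios responses for a 3-part upload:
const responses = [
  { headers: { etag: '"a54357aff0632cce46d942af68356b38"' } },
  { headers: { etag: '"0c78aef83f66abc1fa1e8477f296d394"' } },
  { headers: { etag: '"acbd18db4cc2f85cedef654fccc4a4d8"' } },
];

console.log(buildPartsList(responses));
```

If every entry in the resulting array shows the same ETag, S3 was sent the same bytes for every part, which is exactly the symptom described below.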

**Sample code I followed:**

Frontend: https://github.com/abhishekbajpai/aws-s3-multipart-upload/blob/master/frontend/pages/index.js

Backend: https://github.com/abhishekbajpai/aws-s3-multipart-upload/blob/master/backend/server.js

Screenshot of how the API calls look:

[screenshot]

Now the problem: every time I call the presigned URL with axios (step 3 above), I get back the same ETag for every part. Because of that, the final step fails with this error:

Your proposed upload is smaller than the minimum allowed size

[error screenshot]

**Note:**

Every chunk I upload (apart from the last part) has the same size:

CHUNK_SIZE: 5 * 1024 * 1024, // 5 MiB (5,242,880 bytes)
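To make the sizing concrete, here is a small sketch (my own helper, not part of the code above) of how part boundaries fall out of that chunk size. S3 requires every part except the last to be at least 5 MiB, which is why CHUNK_SIZE is set to 5 * 1024 * 1024:

```javascript
const CHUNK_SIZE = 5 * 1024 * 1024; // 5 MiB = 5,242,880 bytes

// Compute [start, end) byte ranges for each part of a file.
function partBoundaries(fileSize) {
  const parts = [];
  for (let start = 0; start < fileSize; start += CHUNK_SIZE) {
    parts.push({ start, end: Math.min(start + CHUNK_SIZE, fileSize) });
  }
  return parts;
}

// e.g. a 12 MiB file splits into three parts: 5 MiB, 5 MiB, 2 MiB
const parts = partBoundaries(12 * 1024 * 1024);
console.log(parts.map(p => p.end - p.start)); // [5242880, 5242880, 2097152]
```

Only the final part is allowed to be smaller than 5 MiB, so the "smaller than the minimum allowed size" error indicates S3 thinks some non-final part is undersized.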

Also, all the APIs return success responses except /complete-upload, which fails because every part has the same ETag.

The same question was asked here, but there is no solution: https://github.com/aws/aws-sdk-java/issues/2615

Any idea what is causing this, and how to resolve it? It seems like quite an uncommon problem.
