I am trying to multipart-upload files (which can get very large) to AWS S3 using the Upload class from the @aws-sdk/lib-storage library. Unfortunately, I am receiving the error below. The S3 bucket is accessed with the credentials of an IAM user.

TypeError: Cannot read properties of undefined (reading 'done')
    at getChunkStream.js:5:5
    at Generator.next (<anonymous>)
    at resume (chunk-Y34XHC4O.js?v=a9de758f:93:27)
    at chunk-Y34XHC4O.js?v=a9de758f:99:63
    at new ZoneAwarePromise (zone.js:1422:21)
    at it.<computed> [as next] (chunk-Y34XHC4O.js?v=a9de758f:99:38)
    at Upload.<anonymous> (Upload.js:114:9)
    at Generator.next (<anonymous>)
    at chunk-Y34XHC4O.js?v=a9de758f:83:61
    at new ZoneAwarePromise (zone.js:1422:21)

Here is my code, which is more or less copied from here

  // At the top of the component file:
  import { S3Client } from "@aws-sdk/client-s3";
  import { Upload } from "@aws-sdk/lib-storage";

  async multipartUpload(event: any) {
    const file = event.target[0].files[0];

    const creds = { accessKeyId: this.ACCESS_KEY, secretAccessKey: this.SECRET_ACCESS_KEY };
    const target = { Bucket: this.BUCKET_NAME, Key: file.name, Body: file };

    const s3Client = new S3Client({ region: "ap-south-1", credentials: creds });

    try {
      const parallelUploads3 = new Upload({
        client: s3Client,
        params: target,
      });

      parallelUploads3.on("httpUploadProgress", (progress) => {
        console.log(progress);
      });
      await parallelUploads3.done();
    } catch (e) {
      console.log(e);
    }
  }
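
For context, I left the optional tuning parameters at their defaults. As I understand it, Upload also accepts queueSize, partSize, and leavePartsOnError; a minimal sketch of where they would go (the values shown are just the documented defaults):

      // Same construction as above, with the optional knobs spelled out.
      const parallelUploads3 = new Upload({
        client: s3Client,
        params: target,
        queueSize: 4,              // number of parts uploaded concurrently
        partSize: 5 * 1024 * 1024, // part size in bytes (5 MiB is the minimum)
        leavePartsOnError: false,  // clean up already-uploaded parts on failure
      });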

I verified that the s3Client object works by uploading a small file with PutObjectCommand, so this is not an issue with the credentials (access keys), bucket name, etc.
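
For reference, this is roughly the sanity check I ran, inside the same component (a minimal sketch; smallFile stands in for any small test File object):

  import { S3Client, PutObjectCommand } from "@aws-sdk/client-s3";

  // Single-request (non-multipart) upload with the same client configuration.
  // This succeeded, so the client, credentials, and bucket are fine.
  const s3Client = new S3Client({
    region: "ap-south-1",
    credentials: { accessKeyId: this.ACCESS_KEY, secretAccessKey: this.SECRET_ACCESS_KEY },
  });
  await s3Client.send(
    new PutObjectCommand({ Bucket: this.BUCKET_NAME, Key: smallFile.name, Body: smallFile })
  );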
