Why do I get a 416 error when uploading 100 MB+ files to the Objects API?


This is my first time trying to upload a 3D model larger than 100 MB with Autodesk, and I'm getting a 416 error: Chunk-Length mismatch (found: 52566915, required: 52428801).

I'm using Node.js and the ObjectsApi from forge-apis.

I'm pretty sure there's something wrong in my code. I don't fully understand how this should work, even after reading the docs and other similar questions.

My code:

/routes/upload/uploadLarge.js

import { UploadLarge } from '../../TS/routes/upload'
import fs from 'fs'
import { twoLeggedOauth } from '../../auth/forgeAuth'
import { ObjectsApi } from 'forge-apis'
const objectApi = new ObjectsApi()

const uploadLarge: UploadLarge = (req, res) => {
  console.log('bucket key', req.params.bucket)

  const totalBytes = fs.statSync(req.file.path).size
  const bucketKey = req.params.bucket
  const fileName = req.file.originalname
  const sessionId = 'test-session-id'

  fs.readFile(req.file.path, async (err, filecontent) => {
    try {
      const contentRange = `bytes 0-${totalBytes / 2}/${totalBytes}`

      const { fToken, oauth } = await twoLeggedOauth()

      console.log('chunks', contentRange)
      const upload = await objectApi.uploadChunk(
        bucketKey,
        fileName,
        totalBytes,
        contentRange,
        sessionId,
        filecontent,
        {},
        oauth,
        fToken
      )
      res.send(upload.body)
    } catch (error) {
      res.send('Failed to create a new object in the bucket')
    }
  })
}

routes/upload/index.js

import multer from 'multer' // https://github.com/axios/axios/issues/1221
import { Router } from 'express'
import { uploadLarge } from './uploadLarge'
import { uploadSmall } from './uploadSmall'

const uploadRouter = Router()
const upload = multer({ dest: 'tmp/' })


uploadRouter.post(
  '/upload-large/:token/:bucket',
  upload.single('fileToUpload'),
  uploadLarge
)

Answer:

You just have to read different sections of the file ...

const readStream = fs.createReadStream(
  file.path, { start, end }
)

... and upload each of those sections with the uploadChunk() function, reusing the same sessionId and building the range according to the rule "bytes [start]-[end]/[full file size]", e.g. "bytes 0-9/20" and "bytes 10-19/20".
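
For reference, here is a minimal sketch of that loop, reusing the twoLeggedOauth helper and the positional uploadChunk arguments from the question. The 5 MB chunk size, the uploadInChunks name, and the assumption that the length argument should be the size of the current chunk (rather than the whole file) are mine, so treat it as a starting point rather than a drop-in implementation:

import fs from 'fs'
import { ObjectsApi } from 'forge-apis'
import { twoLeggedOauth } from '../../auth/forgeAuth'

const objectsApi = new ObjectsApi()
const CHUNK_SIZE = 5 * 1024 * 1024 // assumed chunk size, 5 MB

async function uploadInChunks (bucketKey: string, objectName: string, filePath: string) {
  const totalBytes = fs.statSync(filePath).size
  const sessionId = `upload-${Date.now()}` // any id, reused for every chunk of this file
  const { fToken, oauth } = await twoLeggedOauth()

  const fileHandle = await fs.promises.open(filePath, 'r')
  try {
    let lastResponse: any
    for (let start = 0; start < totalBytes; start += CHUNK_SIZE) {
      const end = Math.min(start + CHUNK_SIZE, totalBytes) - 1 // inclusive end, per "bytes 0-9/20"
      const chunk = Buffer.alloc(end - start + 1)
      await fileHandle.read(chunk, 0, chunk.length, start)

      // Content-Range follows "bytes [start]-[end]/[full file size]";
      // the length argument matches this chunk, not the whole file.
      const contentRange = `bytes ${start}-${end}/${totalBytes}`
      lastResponse = await objectsApi.uploadChunk(
        bucketKey,
        objectName,
        chunk.length,
        contentRange,
        sessionId,
        chunk,
        {},
        oauth,
        fToken
      )
    }
    return lastResponse // the response to the final chunk describes the created object
  } finally {
    await fileHandle.close()
  }
}

In the question's route handler this would replace the single uploadChunk call, e.g. res.send((await uploadInChunks(bucketKey, fileName, req.file.path)).body).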

Here is an article on this topic, also talking about the error you're getting: https://forge.autodesk.com/blog/nailing-large-files-uploads-forge-resumable-api