`concurrent_session_invalid_data_size` when uploading a file with the Dropbox API


When uploading files to the Dropbox API with the `filesUploadSessionAppendV2` endpoint, I get a `concurrent_session_invalid_data_size` error if I upload files larger than my `CHUNK_SIZE` (i.e. anything that takes more than one chunk).

Here's roughly what my Node code looks like for uploading files:

const fs = require("fs");

// readBytes and generateChunks adapted from https://betterprogramming.pub/a-memory-friendly-way-of-reading-files-in-node-js-a45ad0cc7bb6
function readBytes(fd, sharedBuffer) {
  return new Promise((resolve, reject) => {
    fs.read(fd, sharedBuffer, 0, sharedBuffer.length, null, (err) => {
      if (err) return reject(err);
      resolve();
    });
  });
}

async function* generateChunks(filePath, size) {
  const sharedBuffer = Buffer.alloc(size);
  const stats = fs.statSync(filePath); // file details (we need the total size)
  const fd = fs.openSync(filePath, "r"); // file descriptor
  let bytesRead = 0;
  let end = size;

  for (let i = 0; i < Math.ceil(stats.size / size); i++) {
    await readBytes(fd, sharedBuffer);
    bytesRead = (i + 1) * size;

    if (bytesRead > stats.size) end = size - (bytesRead - stats.size);
    yield [
      sharedBuffer.slice(0, end),
      { bytesRead, size: stats.size, done: bytesRead >= stats.size },
    ];
  }

  fs.closeSync(fd); // release the file descriptor once every chunk is read
}

// make sure this is a multiple of 4MB
const CHUNK_SIZE = 40 * 1000000;

// get session_ids
const { result: { session_ids } } = await client.filesUploadSessionStartBatch({
  num_sessions: 1
});
const session_id = session_ids[0];

// upload chunks
const src = "file.zip";
let offset = 0;
for await (const [chunk, progress] of generateChunks(src, CHUNK_SIZE)) {
  await client.filesUploadSessionAppendV2({
    contents: chunk,
    cursor: {
      session_id,
      offset,
    },
    close: progress.done,
  });
  if (progress.done) offset = progress.size;
  else offset += CHUNK_SIZE;
}

// finalize
await client.filesUploadSessionFinishBatchV2({
  entries: [
    { cursor: { session_id, offset }, commit: { path: "/file.zip", mode: "add", autorename: true } }
  ]
});

There is 1 answer below

Answered by cgenco:

Turns out `concurrent_session_invalid_data_size` means "hey, you didn't send this chunk in a multiple of 4 MiB (4,194,304 bytes)".

Also, a megabyte here isn't 1000000 bytes: Dropbox counts binary megabytes (MiB), and 1 MiB === Math.pow(2, 20) === 1048576 bytes.
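
To spell out the arithmetic (4 MiB === 4194304 bytes):

40 * 1000000;         // 40000000
40000000 % 4194304;   // 2251264 — not aligned, so Dropbox rejects the chunk

40 * Math.pow(2, 20); // 41943040
41943040 % 4194304;   // 0 — a clean multiple of 4 MiB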

The only thing I needed to change was:

- const CHUNK_SIZE = 40 * 1000000; 
+ const CHUNK_SIZE = 40 * Math.pow(2, 20);
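
If you'd rather catch this locally than wait for the API to reject a chunk, a guard like the following works. This is just a sketch — assertChunkAligned is an illustrative helper, not part of the Dropbox SDK; only the final chunk (the one sent with close: true) is allowed to be unaligned:

const FOUR_MIB = 4 * Math.pow(2, 20); // 4194304 bytes

// Hypothetical helper: every chunk except the final one must be a
// multiple of 4 MiB, or Dropbox replies concurrent_session_invalid_data_size.
function assertChunkAligned(chunk, isFinal) {
  if (!isFinal && chunk.length % FOUR_MIB !== 0) {
    throw new Error(
      `chunk is ${chunk.length} bytes, not a multiple of ${FOUR_MIB}`
    );
  }
}

// e.g. in the upload loop, right before filesUploadSessionAppendV2:
// assertChunkAligned(chunk, progress.done);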
