Is it possible to put an Infinispan cache entry in chunks in Scala instead of all at once?


With large files, my nodes become unresponsive, and looking at the Splunk logs I see inter-node communication timing out. I am looking to upload the bytes to the cache in chunks.

That is, uploading to the cache in small chunks against the same key, using Infinispan in Scala. How can I do that? The cached value is a byte array, and here's how I am splitting the bytes:

def splitArray(data: Array[Byte], chunkSize: Int): Seq[Array[Byte]] = {
  val numOfChunks = (data.length + chunkSize - 1) / chunkSize
  (0 until numOfChunks).map { i =>
    data.slice(i * chunkSize, Math.min(data.length, (i + 1) * chunkSize))
  }
}
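For reference, here is a quick check of the helper above with an arbitrary 10-byte payload and a chunk size of 4 (the sample values are my own, just for illustration):

```scala
// Same helper as above, repeated so the snippet is self-contained.
def splitArray(data: Array[Byte], chunkSize: Int): Seq[Array[Byte]] = {
  val numOfChunks = (data.length + chunkSize - 1) / chunkSize
  (0 until numOfChunks).map { i =>
    data.slice(i * chunkSize, Math.min(data.length, (i + 1) * chunkSize))
  }
}

val payload = Array.tabulate[Byte](10)(_.toByte)
val chunks  = splitArray(payload, chunkSize = 4)
// 10 bytes with chunk size 4 -> chunks of 4, 4, and 2 bytes,
// and concatenating the chunks gives back the original payload
```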

I have turned the FORCE_SYNCHRONOUS flag on, and I want every chunk of a given payload to be uploaded under the same key, so that the entire byte array ends up stored against that one key.

Here's what my function looks like:

def cacheBytes(): Future[CacheResult] = {
  if (objStorage)
    cache.putForExternalRead(key, pack.copy(source = Source.Cache))
  else
    cache.putForExternalRead(key, pack.toByteArray)
}

Ideally, for a given payload (an Array[Byte]), I need to perform a multipart upload in a synchronous fashion to the same key.
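One thing worth noting: an Infinispan cache has map semantics, so each put to a key replaces the previous value; there is no append, and writing chunks "to the same key" would just leave the last chunk. A common workaround is to write each chunk under a derived key plus a small manifest entry. Below is a minimal sketch of that idea; the `putInChunks`/`getChunked` names and the `TrieMap` stand-in for the cache are my own for illustration, not Infinispan API (in the real application these puts would go through your replicated cache):

```scala
import scala.collection.concurrent.TrieMap

// Stand-in for an Infinispan Cache[String, Array[Byte]].
val cache = TrieMap.empty[String, Array[Byte]]

def splitArray(data: Array[Byte], chunkSize: Int): Seq[Array[Byte]] = {
  val numOfChunks = (data.length + chunkSize - 1) / chunkSize
  (0 until numOfChunks).map { i =>
    data.slice(i * chunkSize, Math.min(data.length, (i + 1) * chunkSize))
  }
}

// Write each chunk under a derived key, then a manifest recording the
// chunk count, so each individual put stays small and bounded.
def putInChunks(key: String, data: Array[Byte], chunkSize: Int): Int = {
  val chunks = splitArray(data, chunkSize)
  chunks.zipWithIndex.foreach { case (chunk, i) =>
    cache.put(s"$key#chunk-$i", chunk) // one bounded, synchronous put per chunk
  }
  cache.put(s"$key#meta", chunks.length.toString.getBytes("UTF-8"))
  chunks.length
}

// Reassemble by reading the manifest, then each chunk in order.
def getChunked(key: String): Option[Array[Byte]] =
  cache.get(s"$key#meta").map { meta =>
    val n = new String(meta, "UTF-8").toInt
    (0 until n).flatMap(i => cache(s"$key#chunk-$i")).toArray
  }
```

Writing the manifest last also gives readers a cheap completeness signal: if the meta entry is absent, the payload is not fully uploaded yet.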
