OpenCV imwrite function issue with very large, memory mapped arrays

I am generating a huge image algorithmically, around 57743x547583 pixels. To be able to generate an image of that size, I am using NumPy's memmap functionality: I calculate 500x500 blocks and write them into the disk-mapped array as the computation progresses.
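The block-wise writing scheme described above can be sketched roughly as follows (the file name, dimensions, and the per-block "computation" are illustrative placeholders, not the actual algorithm):

```python
import numpy as np

# Illustrative, much smaller dimensions; the real array is (57743, 547583, 3).
height, width, block = 1000, 1500, 500

# Disk-backed array: blocks are flushed to the file instead of held in RAM.
world = np.memmap("world.dat", dtype="uint8", mode="w+",
                  shape=(height, width, 3))

for y in range(0, height, block):
    for x in range(0, width, block):
        # Placeholder for the real per-block computation.
        h = min(block, height - y)
        w = min(block, width - x)
        tile = np.full((h, w, 3), (x + y) % 256, dtype="uint8")
        world[y:y + h, x:x + w, :] = tile

world.flush()  # ensure all written blocks reach the disk
```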

In the next step I want to convert this huge array into a JPG file (the raw file is around 92 GB). However, when I naively call OpenCV's imwrite function, it immediately returns False and does not even process the memmapped image. For example, the code snippet below:

import os
import cv2
import numpy as np

# dat_file_path points to the very large numpy memmap image on the disk.
world_image = np.memmap(dat_file_path, dtype='uint8', mode='r', shape=(57743, 547583, 3))
res1 = cv2.imwrite(os.path.join(root_folder, "world_image2.JPG"), world_image[0:10000, 0:10000, :])
print(res1)
res2 = cv2.imwrite(os.path.join(root_folder, "world_image2.JPG"), world_image)
print(res2)

The code above prints

True
False

and the first imwrite call does actually generate a JPG file. What is the correct way to deal with images this large? If OpenCV is not capable of handling them, are there suitable tools for this kind of very large data, or is there a workaround within OpenCV itself?
