File Size Limit on Google Colab


I am working with the APTOS Blindness Detection challenge datasets from Kaggle. After uploading the files, when I try to unzip the train images folder, I get a file size limit error saying that only limited RAM and disk space is available. Could anyone please suggest an alternative for working with a large image dataset?
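For reference, the failing step is typically a cell along these lines (a minimal sketch; the archive name and paths are hypothetical and depend on how the data was uploaded):

```python
# Minimal sketch of the unzip step that can exhaust Colab's disk;
# the archive name and target directory below are hypothetical.
import zipfile

with zipfile.ZipFile("/content/train_images.zip") as archive:
    archive.extractall("/content/train_images")  # fails once the VM's disk fills up
```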


There are 2 answers below.

Answer 1 (0 votes):

If you need more disk space, Colab now offers a Pro version of the service with double the disk space available in the free version.

Answer 2 (1 vote):

If you get that error while unzipping the archive, it is a disk space problem. Colab gives you about 80 GB of disk by default. Try switching the runtime to GPU acceleration: aside from better performance on certain tasks, such as training with TensorFlow, you will get about 350 GB of available disk space.
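A quick way to check how much disk space the current runtime actually provides is a sketch like the one below (the exact figures vary by runtime type and may change over time):

```python
# Report total/used/free disk space on the Colab VM's root filesystem.
import shutil

total, used, free = shutil.disk_usage("/")
print(f"Total: {total / 2**30:.1f} GiB")
print(f"Used:  {used / 2**30:.1f} GiB")
print(f"Free:  {free / 2**30:.1f} GiB")
```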

From Colab, go to Runtime -> Change runtime type, and in the Hardware accelerator menu select GPU.
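After the runtime restarts, you can confirm that a GPU is attached and re-check the disk from a notebook cell, for example (using Colab's standard `!` shell escape):

```python
# In a Colab cell, '!' runs a shell command on the VM.
!nvidia-smi   # lists the attached GPU on GPU runtimes
!df -h /      # shows the disk space available after switching
```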