How to feed large NumPy arrays to tf.fit()


I have two NumPy arrays saved with the .npy file extension: one contains the x_train data and the other the y_train data.

The x_train.npy file is 5.7 GB in size, so I can't feed it to training by loading the whole array into memory.

Every time I try to load it into RAM and train the model, Colab crashes before training starts.

Is there a way to feed large NumPy files to tf.fit()?

Files I have:

  • "x_train.npy" 5.7GB
  • "y_train.npy"
1

1 Answer

Depending on how much RAM your device has, loading the full 5.7 GB array at once may simply be impossible from a hardware point of view.