Jupyter Notebook (only) MemoryError: the same code runs fine in a conventional .py file


I have an assignment for a Deep Learning class, and they provide a Jupyter notebook as base code. The thing is that after running the data import and reshape, the Jupyter notebook throws a "MemoryError". After some analysis I tried running the same code in a normal .py file, and everything runs well.

However, I'm required (preferably) to use the Jupyter notebook as the base for development, since it is more interactive for this kind of task.

<ipython-input-2-846f80a40ce2> in <module>()
  2 # Load the raw CIFAR-10 data
  3 cifar10_dir = 'datasets\\'
----> 4 X, y = load_CIFAR10(cifar10_dir)

C:\path\data_utils.pyc in load_CIFAR10(ROOT)
     18     f = os.path.join(ROOT, 'cifar10_train.p')
     19     print('Path:  ' + f );
---> 20     Xtr, Ytr = load_CIFAR_batch(f)
     21     return Xtr, Ytr
     22 

C:\path\data_utils.pyc in load_CIFAR_batch(filename)
     10         X = np.array(datadict['data'])
     11         Y = np.array(datadict['labels'])
---> 12         X = X.reshape(-1, 3, 32, 32).transpose(0,2,3,1).astype("float")
     13         return X, Y
     14 

MemoryError: 

The error occurs at line 12 of the traceback. I know this is a memory-consuming assignment, but that doesn't mean 4 GB of RAM won't suffice, and that was confirmed when the code ran without problems outside Jupyter.
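For a sense of scale, here is a rough sketch of what that line allocates, assuming the standard CIFAR-10 training set of 50,000 32x32x3 images (the pickle in the question is custom, so the exact size is an assumption). Note that astype("float") in NumPy means float64, i.e. 8 bytes per value:

    import numpy as np

    # Assumed shape: the standard CIFAR-10 training set, 50,000 rows of
    # 3,072 bytes (32 * 32 * 3 uint8 values) each.
    X = np.zeros((50000, 3072), dtype=np.uint8)
    print(X.nbytes / 2**20)    # ~146 MiB as raw bytes

    # reshape and transpose return views, but astype makes a copy, and
    # "float" means float64: 50000 * 3072 * 8 bytes ~= 1.14 GiB on top of X.
    X64 = X.reshape(-1, 3, 32, 32).transpose(0, 2, 3, 1).astype("float")
    print(X64.nbytes / 2**30)  # ~1.14 GiB

    # astype(np.float32) would halve that copy to ~0.57 GiB.

That fits comfortably in 4 GB of physical RAM, but not in the roughly 2 GB of address space available to a 32-bit Python process on Windows, which may be relevant here (see the answers below).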

My guess is that it has something to do with a memory limit imposed either by Jupyter or by Chrome, but I'm not sure, and I also don't know how to solve it.

By the way:

  • I have a Windows 10 laptop with 4GB of RAM
  • and Chrome Version 57.0.2987.133 (64-bit)

There are 7 answers below.

Answer (score: 1)

Apparently this happens when the Python installation is not set up well.

As a matter of fact, before solving the problem I had manually installed Python 2.7 on Windows along with the packages I needed. After spending almost two days trying to figure out what the problem was, I reinstalled everything with Conda and the problem was solved.

I guess Conda installs packages with better memory management, and that was the main reason.

Answer (score: 0)

A similar thing happened to me while loading a .npy file: there wasn't enough memory to load the file into variables, and freeing up RAM solved the issue. In my case, both Firefox and Chrome were running on my system, and closing Firefox solved the problem.

Useful command: free -h. A note of precaution before interpreting its output on your own: it is highly recommended to go through this page first: https://www.linuxatemyram.com/

Answer (score: 2)

I am only a year and two months late to this question. The technical reason why this happens is explained really nicely here: https://superuser.com/questions/372881/is-there-a-technical-reason-why-32-bit-windows-is-limited-to-4gb-of-ram

It also implies why the Conda solution works.
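If you want to confirm whether the notebook kernel is a 32-bit or 64-bit build, here is a minimal check using only the standard library:

    import platform
    import sys

    # A 32-bit interpreter is capped well below 4 GB of addressable
    # memory; a 64-bit build reports maxsize far above 2**32.
    print(platform.architecture()[0])  # '32bit' or '64bit'
    print(sys.maxsize > 2**32)         # True on a 64-bit build

Run this both in the notebook and in the working .py environment; if the answers differ, that is the culprit.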

But for a lazy engineer's no-change workaround: close the Chrome tabs that are not absolutely necessary and restart your kernel so it starts afresh.

Kernel > Restart (& Run All)

Answer (score: 0)

I had the same issue with Jupyter when the dataset in use contained millions of rows. I tried multiple options.

Option 1: Delete the unwanted variables and trigger garbage collection with gc.collect(). For example, if there are unused variables v1, v2, v3, run del v1, del v2, del v3, then import gc and call gc.collect(). This helped speed things up a little, but the memory leak issue persisted; see the sketch below.
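A minimal sketch of option 1 (v1, v2 and v3 are hypothetical stand-ins for whatever large objects are no longer needed):

    import gc
    import numpy as np

    # Hypothetical large intermediates you are done with.
    v1 = np.zeros((1000, 1000))
    v2 = np.zeros((1000, 1000))
    v3 = np.zeros((1000, 1000))

    del v1, v2, v3   # drop the references...
    gc.collect()     # ...and ask the collector to reclaim the memory now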

Option 2, which worked for me: enable the Jupyter add-in in VS Code and open the .ipynb file there. VS Code runs standalone and was able to avoid the memory leak.

Hope that helps.

Answer (score: 0)

If you use Chrome for your notebooks, go to Settings > Performance > Memory Saver and turn that setting on. It solved my problem. You can also try Firefox if that doesn't work.

Answer (score: 0)

Try running with Administrator privileges. That worked for me.

Answer (score: 1)

You can also reduce your dataset for training and testing; this can solve your memory error problem.
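For example, a minimal subsampling sketch; X and y here are dummy stand-ins with the shapes that load_CIFAR10 in the question would produce:

    import numpy as np

    # Stand-ins for the arrays returned by load_CIFAR10.
    X = np.zeros((50000, 32, 32, 3), dtype=np.uint8)
    y = np.zeros(50000, dtype=np.int64)

    num_train = 5000   # keep a small subset while developing
    idx = np.random.choice(len(X), num_train, replace=False)
    X_small, y_small = X[idx], y[idx]
    print(X_small.shape, y_small.shape)  # (5000, 32, 32, 3) (5000,)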