Jupyter notebook takes forever to open and then the page goes unresponsive - [MathJax] issue


I'm trying to open a Jupyter notebook and it takes a long time. I can see at the bottom that it's trying to load various [MathJax] extensions; e.g., the bottom left of the Chrome browser says:

Loading [MathJax]/extensions/safe.js

Eventually the notebook loads, but it's frozen, and the bottom left keeps showing that it's trying to load other [MathJax] .js files.

Meanwhile, the "pages unresponsive, do you want to kill them?" pop-up keeps appearing.

I have no equations or plots in my notebook so I can't understand what is going on. My notebook never did this before.

I googled this and some people said to delete the IPython checkpoints. Where would those be? I'm on macOS and using Anaconda.


There are 6 best solutions below

profhoff (BEST ANSWER)

I had a feeling that the program in my Jupyter notebook was stuck trying to produce some output, so I restarted the kernel and cleared the output, and that seemed to do the trick!

If Jupyter crashes while opening the .ipynb file, try "using nbstripout to clear output directly from the .ipynb file via the command line" (suggested by bndwang). Install it with pip install nbstripout.
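
For reference, here is a minimal sketch of what that output stripping amounts to, done directly with the nbformat library (the file path is a placeholder, and nbstripout itself is the more robust choice). Note that it overwrites the file, so keep a backup:

    import nbformat

    path = "notebook.ipynb"  # placeholder: path to the oversized notebook

    # Read the notebook, drop every code cell's outputs and execution counts,
    # then write it back. This is essentially what nbstripout automates.
    nb = nbformat.read(path, as_version=4)
    for cell in nb.cells:
        if cell.cell_type == "code":
            cell.outputs = []
            cell.execution_count = None
    nbformat.write(nb, path)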

user9106677

Here restarting your kernel will not help. Instead, use nbstripout to strip the output from the command line. Run this command: nbstripout FILE.ipynb. Install nbstripout if it is not already there: https://pypi.org/project/nbstripout/

baligoyem

I was having the same problem with Jupyter Notebook. My recommendations to you are as follows:

First, check the size of the .ipynb file you are trying to open. It is probably several MB or more. One common reason for this is the output of a dataset for which you previously displayed all rows.

For example, to inspect a dataset I sometimes use pd.set_option('display.max_rows', None) instead of the .head() function, so that all rows of the data set are shown. That much output inflates the file size and makes the notebook slower. Try to delete such outputs.
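
As an illustrative sketch (the DataFrame name df and the file data.csv are placeholders), keeping the display option at its default and previewing with .head() keeps the stored output small:

    import pandas as pd

    df = pd.read_csv("data.csv")  # placeholder dataset

    # Undo an earlier pd.set_option('display.max_rows', None) so output
    # is truncated again instead of rendering every row into the notebook.
    pd.reset_option("display.max_rows")

    # Preview only a handful of rows instead of the whole frame.
    print(df.head())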

I think this will solve your problem.

Anas
  1. conda install -c conda-forge nbstripout

  2. nbstripout filename.ipynb. Make sure that there is no whitespace in the filename (or quote the path if there is).

Angela Carraro

It happened to me when I decided to print a matrix 100000 times. The notebook file grew to 150 MB and Jupyter (in Chrome) was not able to open it: it showed all the things you experienced, and then the page died saying it was "OutOfMemory".

I solved the issue by opening it in Visual Studio Code, which has a "Clear All Output" button; after using it I saved the notebook again and it was back to a few hundred KB, which I could open normally.

If you don't have Visual Studio Code installed, you can open the notebook with another editor (gedit on Linux or Notepad++ on Windows) and try to delete the output entries by hand. This is trickier, since you have to pay close attention to what you are deleting; otherwise the notebook will stop working.
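
If you do edit the raw JSON by hand, a quick sanity check afterwards is to read and validate the file with nbformat; a minimal sketch (the path is a placeholder):

    import nbformat
    from nbformat import ValidationError

    path = "notebook.ipynb"  # placeholder: the hand-edited notebook

    # Reading fails loudly on broken JSON; validate() then checks that the
    # result still matches the notebook schema.
    try:
        nb = nbformat.read(path, as_version=4)
        nbformat.validate(nb)
        print("Notebook is still valid.")
    except (ValueError, ValidationError) as err:
        print("Notebook is broken:", err)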

Dr S Meenakshi

I got the same issue. The file size grew to 150 MB and the message displayed was "Out Of Memory". I solved the problem using PyCharm. Please use the following steps:

  1. Download the file from Jupyter Notebook and open it in the PyCharm editor in "Light Edit Mode".
  2. In "Light Edit Mode" you can pick out the code cells more easily than in Notepad++.
  3. Copy and paste the code into a new file and save it. The file size will now be reduced, and you can load and run the program easily in Jupyter Notebook.