Preventing a Python script from using all RAM


I use a Jupyter notebook to execute a Python script. The script calls the association_rules function from the mlxtend framework. When this function runs, RAM usage explodes from about 500 MB to over 32 GB. That alone would not be the problem. The problem is that when I execute the script locally on my Windows 10 PC, RAM maxes out but everything keeps running, whereas when I do the same on a Unix server (Xfce), the server crashes. Is there something I can do to prevent the server from crashing and to guarantee that the script keeps running?
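For context, a minimal sketch of the kind of pipeline described, assuming the classic mlxtend API; the toy DataFrame, thresholds, and the low_memory flag are illustrative, not taken from the original script:

```python
import pandas as pd
from mlxtend.frequent_patterns import apriori, association_rules

# Toy one-hot encoded transaction data; the real DataFrame is assumed
# to be much larger, which is where the memory blow-up happens.
df = pd.DataFrame({
    "bread":  [True, True, False, True],
    "butter": [True, False, False, True],
    "milk":   [False, True, True, True],
})

# low_memory=True makes apriori slower but less RAM-hungry; raising
# min_support also shrinks the set of candidate itemsets dramatically.
frequent_itemsets = apriori(df, min_support=0.5, use_colnames=True,
                            low_memory=True)
rules = association_rules(frequent_itemsets, metric="confidence",
                          min_threshold=0.7)
print(rules)
```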

Update: I basically missed the fact that Windows was swapping RAM the whole time; the only difference is that Windows does not crash. I'm fairly sure this could be addressed on Linux by fixing the swap configuration. So the question is basically obsolete.

Update: I had made some wrong assumptions. The Windows PC was already swapping as well, and its swap space eventually ran out too. So the same problem appeared on every machine and all of them crashed. In the end it was a mistake in the data preprocessing. Sorry for the inconvenience, and please consider this question no longer relevant.
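For anyone hitting the same symptom, one way to keep a runaway script from taking the whole server down is to cap the process's address space, so it fails with a MemoryError instead of exhausting RAM and swap. A minimal sketch using Python's standard resource module (Unix only); the 28 GiB cap is an arbitrary illustrative value:

```python
import resource

# Cap this process's total virtual address space (Unix only).
# Allocations beyond the cap raise MemoryError inside Python
# instead of driving the whole machine into swap.
limit_bytes = 28 * 1024**3  # illustrative 28 GiB cap
soft, hard = resource.getrlimit(resource.RLIMIT_AS)
resource.setrlimit(resource.RLIMIT_AS, (limit_bytes, hard))
```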


There is 1 answer below


Run the script using the nice command to assign it a lower CPU scheduling priority.
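The same effect can be achieved from inside the script; a minimal sketch, assuming a Unix host:

```python
import os

# Raise this process's niceness by 19 (Unix only); higher niceness
# means lower CPU scheduling priority. Equivalent to launching with
# `nice -n 19 python script.py`.
os.nice(19)
```

Note that niceness only affects CPU scheduling; it does not limit memory, so on its own it will not stop a process from exhausting RAM and swap.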