How do I solve an out-of-memory error?


I am doing my project in OCR. I am using an image size of 64x64 because when I tried 32x32 and other smaller sizes, some pixels were lost. I have tried features such as zonal density, Zernike moments, projection histogram, distance profile, and crossings. The main problem is that the feature vector is too big. I have tried combinations of the above features, but whenever I train the neural network I get an "out of memory" error. I tried PCA for dimensionality reduction, but it did not work well; I did not get good accuracy during training. I ran the code on both my PC and my laptop, and I got the same error on both. My RAM is 2GB, so I am thinking about reducing the size of the image. Is there any solution to this problem?
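(For context, the PCA step I am describing looks roughly like the sketch below. The data, sample count, and variance threshold are illustrative, not my actual dataset; a 64x64 image flattens to a 4096-dimensional feature vector, and PCA shrinks that to far fewer components.)

```python
import numpy as np
from sklearn.decomposition import PCA

# Illustrative stand-in data: 1000 samples, each a flattened 64x64 image (4096 features).
X = np.random.rand(1000, 64 * 64).astype(np.float32)

# Keep enough principal components to explain ~95% of the variance.
pca = PCA(n_components=0.95)
X_reduced = pca.fit_transform(X)

print(X.shape, "->", X_reduced.shape)  # far fewer than 4096 columns remain
```

Using float32 instead of the default float64 also halves the memory the feature matrix itself takes, which matters on a 2GB machine.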

I have one more problem: whenever I train the neural network using the same features, the result varies. How can I solve this as well?


1 Answer


It's not about the size of the image. A 64x64 image is certainly not enough on its own to exhaust your RAM. There are most likely bugs in your neural network code or in the other algorithms.

Please post more details about your implementation; we don't even know which language you are using.
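As for the results varying between runs: a neural network's weights are usually initialized randomly, so two training runs on identical data can converge differently unless the random seed is fixed. A minimal sketch of this, assuming scikit-learn's MLPClassifier (the data and layer sizes are illustrative):

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

# Illustrative stand-in data: 200 samples with 64 features and 10 classes.
rng = np.random.RandomState(0)
X = rng.rand(200, 64)
y = rng.randint(0, 10, size=200)

# Same random_state => same initial weights => identical training results.
clf1 = MLPClassifier(hidden_layer_sizes=(32,), max_iter=50, random_state=42)
clf2 = MLPClassifier(hidden_layer_sizes=(32,), max_iter=50, random_state=42)
clf1.fit(X, y)
clf2.fit(X, y)

print(clf1.score(X, y) == clf2.score(X, y))  # identical scores across runs
```

Whatever framework you are actually using will have an equivalent seed parameter; fixing it makes runs repeatable and lets you compare feature sets fairly.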