How do I fix an OutOfMemoryError while using the Stanford NLP MaxentTagger?

I am using Stanford's NLP engine (the MaxentTagger) to find nouns in a collection of 30,000 documents, and I am running into a java.lang.OutOfMemoryError. This happens even though I only detect nouns in specific sections of each document, i.e. I pass only a portion of the text to the MaxentTagger. What should I do to fix this error?
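
For reference, here is a minimal sketch of what the per-chunk tagging described above might look like, assuming the standard Stanford POS tagger classes (MaxentTagger, tokenizeText, tagSentence); the NounExtractor class name and the model path passed to the constructor are hypothetical. Reusing a single tagger instance across all documents keeps only one copy of the model in memory:

    import edu.stanford.nlp.ling.HasWord;
    import edu.stanford.nlp.ling.TaggedWord;
    import edu.stanford.nlp.tagger.maxent.MaxentTagger;

    import java.io.StringReader;
    import java.util.ArrayList;
    import java.util.List;

    public class NounExtractor {
        // One tagger reused for every document; loading the model is the
        // expensive part and each instance keeps the whole model on the heap.
        private final MaxentTagger tagger;

        public NounExtractor(String modelPath) {
            this.tagger = new MaxentTagger(modelPath);
        }

        // Returns the nouns (POS tags starting with "NN") found in one text chunk.
        public List<String> extractNouns(String textChunk) {
            List<String> nouns = new ArrayList<>();
            // Split the chunk into sentences, then tag each sentence.
            for (List<HasWord> sentence : MaxentTagger.tokenizeText(new StringReader(textChunk))) {
                for (TaggedWord tw : tagger.tagSentence(sentence)) {
                    if (tw.tag().startsWith("NN")) {
                        nouns.add(tw.word());
                    }
                }
            }
            return nouns;
        }
    }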

1 Answer

Increase the maximum heap size for your Java program with this JVM switch:

-Xmx

e.g.

java -Xmx512m -jar yourfile.jar
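
To confirm which limit is actually in effect (for instance when the flag is set by a wrapper script or an IDE run configuration), a quick check like the following can help; the HeapCheck class name is just illustrative:

    public class HeapCheck {
        public static void main(String[] args) {
            // Maximum heap the JVM will try to use, in bytes; this reflects
            // the -Xmx value that is actually in effect for this process.
            long maxBytes = Runtime.getRuntime().maxMemory();
            System.out.printf("Max heap: %d MB%n", maxBytes / (1024 * 1024));
        }
    }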

It doesn't necessarily have to fit within your physical RAM either: as long as you have a page file (swap space) configured, the heap can be 10 GB or more, provided you have set the page file size that large.

Ideally, though, you want to restructure whatever is consuming the memory and reduce its footprint as much as possible, for example by tagging one document at a time and writing the results out immediately instead of accumulating them on the heap, as in the sketch below.
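
A minimal sketch of that idea, reusing the hypothetical NounExtractor from the question above; the input directory argument, the nouns.txt output file, and the tagger model path are all assumptions:

    import java.io.BufferedWriter;
    import java.io.IOException;
    import java.nio.file.DirectoryStream;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.nio.file.Paths;

    public class BatchNounTagging {
        public static void main(String[] args) throws IOException {
            // Hypothetical helper from the sketch in the question;
            // the model path is a placeholder for your tagger model file.
            NounExtractor extractor = new NounExtractor("models/english-left3words-distsim.tagger");

            Path inputDir = Paths.get(args[0]);
            try (BufferedWriter out = Files.newBufferedWriter(Paths.get("nouns.txt"));
                 DirectoryStream<Path> docs = Files.newDirectoryStream(inputDir)) {
                // Tag one document at a time and write its nouns out immediately,
                // so only the current document's text and nouns are held on the heap.
                for (Path doc : docs) {
                    String text = new String(Files.readAllBytes(doc));
                    for (String noun : extractor.extractNouns(text)) {
                        out.write(noun);
                        out.newLine();
                    }
                }
            }
        }
    }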