Indexing 7 TB of data with Elasticsearch: FSCrawler stops after some time


I am using FSCrawler to build an index over more than 7 TB of data. Indexing starts fine but stops once the index size reaches 2.6 GB. I believe this is a memory issue; how do I configure the memory?

My machine has 40 GB of memory, and I have assigned 12 GB to Elasticsearch.


1 Answer

You might also have to assign enough memory to FSCrawler itself, using the FS_JAVA_OPTS environment variable. For example:

FS_JAVA_OPTS="-Xmx4g -Xms4g" bin/fscrawler
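A minimal sketch of the same setting in persistent form, assuming a Bash-like shell; the 4 GB heap size is an assumption to tune for your workload, and `job_name` is a placeholder for your actual FSCrawler job:

```shell
# Give FSCrawler its own heap, separate from the 12 GB Elasticsearch heap.
# -Xms sets the initial heap size, -Xmx the maximum; matching them avoids resizing.
export FS_JAVA_OPTS="-Xmx4g -Xms4g"

# Confirm the variable is set before launching the crawler:
echo "$FS_JAVA_OPTS"

# Then start the crawler as usual, e.g.:
# bin/fscrawler job_name
```

Because the variable is exported, it applies to every FSCrawler run in that shell session, so you do not have to prefix each command.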