I am using Elasticsearch and FSCrawler to search about 7 TB of data. The process starts well but stalls after some time. I suspect it is running out of memory, so I am trying to increase the heap following https://fscrawler.readthedocs.io/en/latest/admin/jvm-settings.html, but I keep getting the error "invalid maximum heap size".
Is that the right way of setting up the heap? What am I missing?
I think you are using the 32-bit version of Java, which cannot address a large heap and rejects bigger `-Xmx` values with exactly that error. If that's the case, install the 64-bit JVM and make sure to update your JAVA_HOME to point at the new installation.
More detailed info can be found here.
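As a quick sketch, here is how you might verify the JVM's bitness and then set the heap the way the FSCrawler docs describe, via the `FS_JAVA_OPTS` environment variable (the JVM path and heap size below are examples; adjust them to your system):

```shell
# Check the installed JVM: on a 64-bit build the output of
# "java -version" mentions "64-Bit Server VM".
java -version

# After installing a 64-bit JVM, point JAVA_HOME at it
# (example path; yours will differ):
export JAVA_HOME=/usr/lib/jvm/java-11-openjdk-amd64

# Set FSCrawler's heap via FS_JAVA_OPTS, then start your job.
# A 32-bit JVM typically fails with "invalid maximum heap size"
# for values much above ~1.5 GB; a 64-bit JVM accepts this fine.
export FS_JAVA_OPTS="-Xms4g -Xmx4g"
bin/fscrawler my_job
```

This is a command/config fragment rather than a runnable program, so treat the exact paths and the `4g` figure as placeholders to tune for your 7 TB workload.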