I have a large dataset of about 22 million records in JSON format, and I use apoc.periodic.iterate together with apoc.mongodb to import it from a MongoDB database into Neo4j. After importing about 3 million records and occupying 6 GB of memory, the connection to the server is lost and a heap size exception occurs. I changed the config file to set the heap and page-cache sizes, but that didn't take effect, which is the main problem. Running the query in the browser and through the Python driver gives the same result. However, when I import the data manually in batches of 2.5 million records, using LIMIT and then SKIP in the next query execution to load the next 2.5 million, it works. I actually want to do this with the Python driver, but I couldn't reproduce that manual batching approach.
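For context, these are the kinds of settings I changed in neo4j.conf (the values below are only illustrative, not my real ones, and the names are the pre-5.x ones; Neo4j 5.x renames them to server.memory.*):

```
# Illustrative values only - tune to the RAM actually available on the machine
dbms.memory.heap.initial_size=8g
dbms.memory.heap.max_size=8g
dbms.memory.pagecache.size=4g
```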
There is an error in the log file that says: fatal error occurred during protocol handshaking... An established connection was aborted by the software in your host machine...
For the first part, please post your configuration. For the Python driver, you can simply run Cypher with a list of map parameters:
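Here is a minimal sketch of that pattern with the official neo4j Python driver, assuming you have already read the documents out of MongoDB (for example with pymongo) as plain dicts; the label, property names, batch size, and connection details are placeholders:

```python
from neo4j import GraphDatabase

# Connection details are placeholders
driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))

BATCH_SIZE = 10_000  # small enough that each transaction fits comfortably in the heap

# UNWIND expands the list of maps into one row per record
CYPHER = """
UNWIND $rows AS row
MERGE (r:Record {id: row.id})
SET r += row
"""

def write_batch(tx, rows):
    tx.run(CYPHER, rows=rows)

def import_records(records):
    # records: an iterable of plain dicts from MongoDB; convert ObjectId and
    # other BSON types to strings/numbers first, since Neo4j only stores primitives
    with driver.session() as session:
        batch = []
        for doc in records:
            batch.append(doc)
            if len(batch) == BATCH_SIZE:
                session.execute_write(write_batch, batch)  # write_transaction on drivers < 5.0
                batch = []
        if batch:  # flush the final partial batch
            session.execute_write(write_batch, batch)
```

Because each write call commits its own transaction, only one batch of rows is held on the server at a time, which gives the same effect as your manual SKIP/LIMIT runs but driven entirely from Python.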