Parse and process big XML files


I am working on a standalone Java application (Spring Boot) that parses and processes several big XML files, around 3-4 GB each, to generate one file that combines the data from the three inputs (the first file holds the product specifications, the second the product details, and the third other information about the products). To get the full information for one node I have to read all three files.

My issue is that our clients' machines don't have much RAM. I tried eXist-db (just loading the file and writing it back out); it is fairly fast, but the RAM usage is still too high: a 1.5 GB XML file consumes about 1.6-1.7 GB. Is there any solution that can lower the RAM usage?
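For reference, a pull-based streaming parser such as StAX (javax.xml.stream) keeps only the current parse event in memory, so a multi-gigabyte file can be walked without loading it whole. Below is a minimal sketch, assuming a hypothetical products.xml whose <product> elements carry an id attribute:

```java
import javax.xml.stream.XMLInputFactory;
import javax.xml.stream.XMLStreamConstants;
import javax.xml.stream.XMLStreamReader;
import java.io.FileInputStream;

public class StreamingXmlReader {
    public static void main(String[] args) throws Exception {
        XMLInputFactory factory = XMLInputFactory.newInstance();
        try (FileInputStream in = new FileInputStream("products.xml")) {
            XMLStreamReader reader = factory.createXMLStreamReader(in);
            while (reader.hasNext()) {
                int event = reader.next();
                // React only to the elements we care about; the rest of the
                // document is never materialised in memory.
                if (event == XMLStreamConstants.START_ELEMENT
                        && "product".equals(reader.getLocalName())) {
                    String id = reader.getAttributeValue(null, "id");
                    // Process this product node, then let it be discarded.
                    System.out.println("Found product " + id);
                }
            }
            reader.close();
        }
    }
}
```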

Thanks in advance


There is 1 answer below, accepted as the best solution.

BEST ANSWER

The best solution was to split the nodes: for each node I generate a file whose filename is the node's id, and all of those files go into a zip archive. The next time I want to access a node the lookup is very fast, because the zip is indexed. A sketch of this approach follows the layout below.

output.zip:
--> id_nodes1
--> id_nodes2
--> id_nodes3
--> id_nodes4
--> ....
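A minimal sketch of this pattern using the JDK's java.util.zip; the nodesById map and the node ids used as entry names are placeholders for whatever the real ids and XML fragments are:

```java
import java.io.InputStream;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.Map;
import java.util.zip.ZipEntry;
import java.util.zip.ZipFile;
import java.util.zip.ZipOutputStream;

public class NodeZipStore {

    // Write each node's XML fragment as its own zip entry, named by node id.
    static void writeNodes(Map<String, String> nodesById) throws Exception {
        try (ZipOutputStream zip = new ZipOutputStream(
                Files.newOutputStream(Paths.get("output.zip")))) {
            for (Map.Entry<String, String> e : nodesById.entrySet()) {
                zip.putNextEntry(new ZipEntry(e.getKey()));
                zip.write(e.getValue().getBytes(StandardCharsets.UTF_8));
                zip.closeEntry();
            }
        }
    }

    // Random access: the zip's central directory acts as an index, so a
    // single node can be read back without scanning the whole archive.
    static String readNode(String nodeId) throws Exception {
        try (ZipFile zip = new ZipFile("output.zip")) {
            ZipEntry entry = zip.getEntry(nodeId);
            if (entry == null) return null;
            try (InputStream in = zip.getInputStream(entry)) {
                return new String(in.readAllBytes(), StandardCharsets.UTF_8);
            }
        }
    }
}
```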

Thanks all for your answers.