I run a script that does text manipulation on the file system.
The script runs on text files (.h, .cpp).
As the script runs, I see that PF usage increases until it reaches the amount of VM allocated for the page file.
Is there a way to flush the VM during the run or after it?
I have opened another question about it (I thought it was a different issue): Single sed command for multiple substitutions?
No, but maybe you can change the script to consume less memory.
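One way to cut memory use, sketched below under the assumption that the script applies several substitutions per file (the file mask and the search/replace pairs are placeholders, not your actual ones), is to do all substitutions in a single sed pass per file rather than one pass, and one temporary copy, per substitution:

    # Hypothetical sketch: all substitutions in one sed pass per file,
    # instead of one pass (and one temporary copy) per substitution.
    find . \( -name '*.h' -o -name '*.cpp' \) -print0 |
    while IFS= read -r -d '' f; do
        sed -i -e 's/oldName1/newName1/g' \
               -e 's/oldName2/newName2/g' "$f"
    done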
Update: I have tried to reproduce the problem on Linux, using a script corresponding to the one listed in the other question. In Bash:
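(The exact commands are not reproduced here; the following is a minimal sketch of the kind of loop that was tested, with the file mask and the peptide/replacement pairs as placeholder assumptions.)

    # Hypothetical reconstruction of the test loop; the file mask and the
    # peptide/replacement pairs are placeholders, not the original ones.
    for f in fragment_*.fasta; do
        sed -i -e 's/MKTAYIAKQR/PEPTIDE01/g' \
               -e 's/QISFVKSHFS/PEPTIDE02/g' \
               -e 's/RQLEERLGLI/PEPTIDE03/g' "$f"
    done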
I used fragments of a protein sequence database (a large text file in FASTA format, up to 74 MB) and short peptide sequences as search patterns, chosen so that there were at least 10 replacements per file. While the script is running, no process uses any significant amount of memory (as I would expect), and CPU load is on the order of 50%. Thus I cannot reproduce the problem.