The scraper I wrote runs perfectly on my PC (Windows, Node.js v14.4.0).
But when I try to run it on a DigitalOcean Droplet (an Ubuntu machine), some of the pages fail with the following error, and not much more information: Page crashed!
Here is the code for printing the error:
const handleClose = async (msg) => {
  console.log(msg);
  await page.close();
  await browser.close();
  process.exit(1);
};

// Note: "uncaughtException" handlers receive (err, origin);
// the (reason, promise) pair belongs to "unhandledRejection".
process.on("uncaughtException", (err, origin) => {
  handleClose(`Uncaught exception at: ${origin}, error: ${err}`);
});
process.on("unhandledRejection", (reason, promise) => {
  handleClose(`Possibly unhandled rejection at: ${promise}, reason: ${reason}`);
});
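Besides the process-level handlers, Puppeteer also surfaces crashes directly on the Page object, which usually gives more context than a bare "Page crashed!". A small sketch (the helper name is mine; `error` and `pageerror` are real Puppeteer page events, and the function accepts anything with an `on` method so it can be tried standalone):

```javascript
// Attach crash/error listeners to a puppeteer Page (or any EventEmitter-like
// object) so crashes are logged with their message before the process exits.
function attachCrashLogging(page, log = console.error) {
  // "error" fires when the page itself crashes
  page.on("error", (err) => log(`Page crashed: ${err.message}`));
  // "pageerror" fires for uncaught exceptions thrown inside the page's JS
  page.on("pageerror", (err) => log(`Error inside page: ${err.message}`));
  return page;
}
```

In real use you would call `attachCrashLogging(page)` right after `browser.newPage()`.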
How do I tackle this? And what could cause it, given that it works wonderfully on my Windows PC?
I have added all the memory-related configuration flags I found online:
But that didn't help.
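For reference, a sketch of the kind of launch options commonly suggested for low-memory servers (this is not necessarily my exact list, but all the switch names are real Chromium flags):

```javascript
// Launch options often recommended for memory-constrained environments.
// Note: some of these are aggressive; --single-process in particular is
// known to cause instability of its own.
const lowMemoryLaunchOptions = {
  headless: true,
  args: [
    "--disable-dev-shm-usage", // use /tmp instead of the (often tiny) /dev/shm
    "--no-zygote",
    "--single-process",
    "--disable-gpu",
  ],
};
// const browser = await puppeteer.launch(lowMemoryLaunchOptions);
```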
Thanks to pguardiario's note, I simply upgraded the Droplet from 1 GB of RAM to 2 GB, and that did the trick.
I find it strange that scraping a simple website takes more than 1 GB, so I guess Puppeteer (really the Chromium it drives) takes a lot of resources to run.
UPDATE: I had another page crash, but this time it was related to the server utilizing all of its memory. So I removed all of these args from Puppeteer:
and was left with only the basic ones:
It's now stable. So I guess these flags need to be used carefully and removed if not really needed.
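To illustrate, a minimal sketch of what "the basic ones" might look like (an assumption on my part, not a record of the exact configuration): the two sandbox flags are the ones typically required just to get Chromium running as root on a fresh Droplet.

```javascript
// Minimal, conservative Puppeteer launch configuration.
// --no-sandbox / --disable-setuid-sandbox are commonly needed when the
// process runs as root, as is the default on a new Droplet.
const basicLaunchOptions = {
  headless: true,
  args: ["--no-sandbox", "--disable-setuid-sandbox"],
};
// const browser = await puppeteer.launch(basicLaunchOptions);
```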