Resend Request Until You Get a Response

How can I keep retrying a fetch inside a loop until I get a response?

for (const node of nodeList) {
  const url = node.getAttribute('href')
  const res = await fetch(url) // Code below won't execute if there is no response
  const html = await res.text()
  const scraper = new DOMParser()
  const doc = scraper.parseFromString(html, 'text/html')
  alert('successfully parsed')
}

Because it runs in a loop and sends many requests in a short time, even a small, brief network issue can spoil the whole process. What can I do?
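
One common pattern for this is to wrap each fetch in a small retry helper and await it inside the loop, so a transient failure only retries the request it affected. A minimal sketch, where the helper name fetchWithRetry and the retry/delay values are illustrative assumptions, not from the original question:

// Retry a fetch up to `retries` times, pausing `delay` ms between attempts
async function fetchWithRetry(url, retries = 3, delay = 1000) {
  for (let attempt = 1; attempt <= retries; attempt++) {
    try {
      const res = await fetch(url)
      if (!res.ok) throw new Error(`HTTP error! Status: ${res.status}`)
      return res
    } catch (err) {
      if (attempt === retries) throw err // give up after the last attempt
      await new Promise(resolve => setTimeout(resolve, delay)) // wait before retrying
    }
  }
}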

1 Answer

mplungjan

I never loop AJAX calls; doing so can overwhelm the server, and the browser (if running in a browser) won't have time to update the interface.

I would try something like the following. The asynchronicity is handled by only calling the function again on success or error:

const urls = [...nodeList].map(node => node.getAttribute('href'));
const docs = [];
const max = urls.length;
let cnt = 0;
const getHtml = () => {
  if (cnt >= max) {
    processAllDocs(); // here we have them all
    return;
  }
  fetch(urls[cnt]) // code below won't execute until a response arrives
    .then(response => {
      if (!response.ok) {
        throw new Error(`HTTP error! Status: ${response.status}`);
      }
      return response.text();
    })
    .then(html => {
      const scraper = new DOMParser();
      const doc = scraper.parseFromString(html, 'text/html');
      docs.push(doc);
      cnt++;
      getHtml(); // get next - use setTimeout if you want to throttle
    })
    .catch(err => {
      console.log(err.message);
      if (err.message.includes("something really bad, please stop")) return; // stop all processing (server down or ten 500 errors in a row)
      getHtml(); // retry the same URL - use setTimeout if you want to throttle
    });
};
getHtml();
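
If you want to throttle, as the comments in the code suggest, the recursive call can be deferred with setTimeout; a minimal sketch (the 500 ms delay is an arbitrary assumption):

// Space requests out instead of firing the next one immediately:
// replace the bare getHtml() calls above with a deferred call
setTimeout(getHtml, 500);

Because each request only starts after the previous one settles, the server sees at most one in-flight request at a time, and a transient failure only retries the URL it affected.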