I have a Google Chrome extension that exports entries from a website. I can't get all the entries with a single query, so I use a paging parameter, i.e. '...&p=' + pageNumber++.
Until now I did this by connecting to my own web server, which is not a problem with PHP. But I found that I can generate PDF files directly with jsPDF, so I decided to get rid of the server-side help.
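For the PDF part itself, something like this minimal sketch is what I have in mind (exportToPdf and entries are placeholder names for the collected data; the exact jsPDF text() argument order depends on the version):

function exportToPdf(entries) {
    // entries: placeholder for the array of entry strings collected from the pages
    var doc = new jsPDF();                   // default A4 portrait document
    entries.forEach(function (entry, i) {
        doc.text(entry, 10, 10 + i * 10);    // older jsPDF builds use text(x, y, string) instead
    });
    doc.save('entries.pdf');                 // triggers the download in the browser
}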
Now I want to fetch the entries with jQuery Ajax. The problem is that I don't know in advance how many pages I have to fetch.
My current solution is:
function runRecursionTest(url, pageNumber) {
    var currentUrl = url + '?p=' + pageNumber;
    // Fetch the page through YQL to work around cross-origin restrictions
    $.getJSON("http://query.yahooapis.com/v1/public/yql?" +
        "q=select%20*%20from%20html%20where%20url%3D%22" +
        encodeURIComponent(currentUrl) +
        "%22&format=json'&callback=?"
    ).done(function (data) {
        if (data.results[0]) {
            // The page returned content: append its entries and keep going
            data = filterData(data.results[0]);
            $('#export-target-list').append($(data).find('#entry-list').html());
            runRecursionTest(url, pageNumber++);
        } else {
            // Empty result: we are past the last page
            alert('Stop! Last page: ' + (pageNumber - 1));
        }
    });
}
It seems logical, huh? But surprisingly it fails: too many requests are sent even when there is only one page, and I can't stop it.
Any ideas?