scrapyd only runs 2 jobs at the same time


I was very happy with Scrapy and scraped with up to 150 jobs at the same time, using rotating proxies with no problems. I changed max_proc a few times and eventually scraped with 10 jobs, because the server I scrape is not that strong. After some time, scrapyd only runs 2 jobs at the same time. I didn't change anything in the scrapyd config. This is my scrapyd config, stored in /etc/scrapyd/scrapyd.conf. No other conf files exist.

[scrapyd]
items_dir   = /root/scrapy/results/
max_proc    = 0
max_proc_per_cpu = 10
jobs_to_keep = 4000000
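
For context, my understanding (an assumption based on scrapyd's documented behavior, not something I verified in its source) is that with `max_proc = 0` the effective cap is `max_proc_per_cpu` times the number of CPUs, so this config should allow far more than 2 jobs:

```python
import multiprocessing

# Values from the scrapyd.conf above
max_proc = 0            # 0 = no explicit cap
max_proc_per_cpu = 10

# Assumed scrapyd behavior: with max_proc = 0, the limit
# falls back to max_proc_per_cpu * CPU count
if max_proc == 0:
    effective_limit = max_proc_per_cpu * multiprocessing.cpu_count()
else:
    effective_limit = max_proc

print(effective_limit)  # e.g. 40 on a 4-core machine
```

So even on a single-core box this config should permit 10 concurrent jobs, not 2.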

As said, I didn't change it over time. I scraped with 10 jobs at the same time with this config. Scrapyd's daemonstatus says

{"status": "ok", "running": 2, "finished": 100, "pending": 7, "node_name": "..."}
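
Parsing that status (just a sketch to make the numbers explicit), there are 9 jobs in flight, so 7 are sitting in the queue while only 2 actually run:

```python
import json

# The daemonstatus response quoted above
status = json.loads(
    '{"status": "ok", "running": 2, "finished": 100, '
    '"pending": 7, "node_name": "..."}'
)

# Jobs either running or waiting in the queue
in_flight = status["running"] + status["pending"]
print(in_flight)  # → 9
```

That is well under the 10 jobs this setup used to handle, which is why the behavior looks like a limit of 2 rather than a lack of work.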

I don't understand why scrapyd stops doing what it should. I did everything I could to fix it: I ran scrapyd in other directories and cleared the items and logs of the old jobs, but nothing changed. I double-checked the proxies and the spider code.

Why would scrapyd stop running more jobs at the same time? I'm really desperate :s
