Basically I want to download all the zip files on a given website using wget, and I'm having a hard time. I'm new to this, so please bear with me. The website DOES NOT have a page that lists all the zip files. Is there a way I can have wget go through the entire site like a web crawler and download every zip file it finds? I've tried commands like these:
1) wget -r -np -l 1 -A zip http://site/path/
2) wget -A zip -m -p -E -k -K -np http://site/path/
3) wget --no-clobber --convert-links --random-wait -r -p -E -e robots=off -U mozilla http://site/path/
Supposedly these search through the entire site, but I haven't been getting those results. Any help, or a pointer in the right direction, would be very much appreciated!
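
For reference, my best guess from the wget man page is that the -l 1 in the first command caps the crawl at one level deep, so I was expecting something along these lines to crawl everything under the path (http://site/path/ is just a placeholder for the real site, and I'm not sure whether -e robots=off is actually needed):

wget -r -l inf -np -A zip -e robots=off --random-wait http://site/path/

But I may well be misreading what some of these flags do.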