I'm using siege to locate some problem pages on our new sitemap, and I'm having trouble getting it to stop after it runs through the urls.txt file. I have tried using reps=once on the command line as well as in the .siegerc config file. I need to use the config file because I want the output written verbosely to a log file so that I can see page load times, 302 and 404 errors, etc., and import them into Excel.

However, no matter what I try, I cannot get siege to stop when it completes the urls.txt file; it just runs through it again. I have configured 40 concurrent users, the time and reps variables are commented out in the config, and the urls.txt file is set in the config. The syntax I run at the command line is:

sudo siege --reps=once -v > outputfile.csv
I have also tried setting reps in the config, with no luck. Any ideas where I'm going wrong?
I ran into similar problems, and after trying multiple options I got it to work with:
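Something along these lines (the concurrency count, file name, and log path here are placeholders, adjust them to your setup):

```shell
siege --concurrent=40 --reps=once --verbose --file=urls.txt --log=${HOME}/siege.log
```

The key part is `--reps=once`, which tells siege to run each URL in the file exactly one time and then exit, instead of looping for a duration.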
where urls.txt is a simple list of URLs like
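For example (placeholder hosts, one URL per line):

```
http://www.example.com/
http://www.example.com/about
http://www.example.com/contact
```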
The logs were written to the file specified in the siegerc file, ${HOME}/var/siege.log.
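The relevant siegerc entries look roughly like this (values are assumptions based on the stock config; check your own ~/.siege/siege.conf or .siegerc):

```
verbose = true
logging = true
logfile = ${HOME}/var/siege.log
```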
I also observed that the logfile option is either buggy or very strict: '-l filename.log' does not work, but --log=filename.log does. Presumably the log file name is parsed as an optional argument, so it has to be attached with '=' rather than passed as a separate word.
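For example (the log file name is arbitrary):

```shell
siege --reps=once --log=filename.log -f urls.txt
```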
Hope this helps.