Check Entire Website's Links

I know similar questions have been asked, but I'm not sure about the answers (and I can't easily test all of them), so before I go crazy continuing to search, I want to ask: is there an easy way to crawl all the pages of a website and automatically check them for broken and invalid links? Preferably I'd like a solution that does not require an install or a compile, as I'm severely limited. Thanks.
Asked by Tom A
I'm the author of Xenu's Link Sleuth :-) In the old days it didn't have an installer, because that wasn't a priority for me. More and more people complained, and one of them pointed me to a tool, so it now ships with an installer: first it was InnoSetup, now it's NSIS.

"gablin" is correct: the .EXE file can simply be transferred on its own. Nevertheless, I'd warn you against running "external" software inside a "nasty" company. Rather, try to beg your way through the internal bureaucracy.
You could use Xenu's Link Sleuth. It does require an install, but it is lightweight and no compile is needed.

http://home.snafu.de/tilman/xenulink.html
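If running a tool really isn't an option but a Python interpreter happens to be available, the crawl-and-check idea above can be sketched with the standard library alone (no install, no compile). This is a minimal illustration, not a replacement for a dedicated checker: the start URL is a placeholder, there is no politeness delay or robots.txt handling, and only same-host pages are followed.

```python
# Minimal broken-link crawler sketch using only the Python standard
# library. Breadth-first crawl of one host; any URL that fails to
# fetch is reported with its error.
from collections import deque
from html.parser import HTMLParser
from urllib.error import HTTPError, URLError
from urllib.parse import urldefrag, urljoin, urlparse
from urllib.request import Request, urlopen


class LinkExtractor(HTMLParser):
    """Collect href/src attribute values from an HTML page."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        for name, value in attrs:
            if name in ("href", "src") and value:
                self.links.append(value)


def crawl(start_url, max_pages=200):
    """Crawl pages on start_url's host; return {broken_url: error}."""
    host = urlparse(start_url).netloc
    seen, broken = {start_url}, {}
    queue = deque([start_url])
    while queue and len(seen) <= max_pages:
        url = queue.popleft()
        try:
            request = Request(url, headers={"User-Agent": "link-check"})
            with urlopen(request, timeout=10) as resp:
                # Only parse HTML responses for further links.
                if "html" not in resp.headers.get("Content-Type", ""):
                    continue
                page = resp.read().decode("utf-8", errors="replace")
        except (HTTPError, URLError, OSError) as err:
            broken[url] = str(err)
            continue
        parser = LinkExtractor()
        parser.feed(page)
        for link in parser.links:
            # Resolve relative links and drop #fragments before queuing.
            absolute = urldefrag(urljoin(url, link)).url
            if absolute.startswith("http") and absolute not in seen:
                seen.add(absolute)
                # Follow same-host links only; external links could be
                # HEAD-checked here as well.
                if urlparse(absolute).netloc == host:
                    queue.append(absolute)
    return broken


if __name__ == "__main__":
    # Placeholder start URL -- point this at the site to check.
    for url, error in crawl("http://example.com/").items():
        print(url, "->", error)
```

Whether this is acceptable depends on your restrictions, of course: it avoids installing anything, but it is still "running a script", which a locked-down environment may also forbid.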