I am writing an app to scrape my website for keywords. When I try to read a page that does not exist, I get a 404, which is expected.
However, when my app encounters the 404 WebException it does not continue. I know I need a try/catch, but I can't figure out how to add a catch for the 404 WebException without compiler errors. Here is the code snippet with the issue. I think my problem is that I don't know where to put the try/catch: because the method ends with return source;, when I add a catch I get an error that not all code paths return a value.
// Build the request URL (string.Format was redundant with plain concatenation)
string url = "http://127.0.0.1/website/" + searchterm;
HttpWebRequest req = (HttpWebRequest)WebRequest.Create(url);
req.Method = "GET";
req.UserAgent = "Mozilla/5.0 (Windows; U; MSIE 9.0; Windows NT 9.0; en-US)";
string source;
using (StreamReader reader = new StreamReader(req.GetResponse().GetResponseStream()))
{
    source = reader.ReadToEnd();
}
return source;
Thanks to Alex K:
I forgot to initialize the variable where it is declared, at the top of the method:
source = "";
With that, the variable is assigned on every code path, including when the catch block runs. Thank you, Alex K!
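Putting the pieces together, here is one way the try/catch could be arranged. This is a sketch, not the exact accepted answer: the method name GetSource is hypothetical, and the 404 check via WebException.Response is the standard pattern for distinguishing a not-found page from other network failures.

```csharp
using System;
using System.IO;
using System.Net;

class Scraper
{
    // Hypothetical wrapper method for the snippet above.
    static string GetSource(string searchterm)
    {
        string url = "http://127.0.0.1/website/" + searchterm;
        HttpWebRequest req = (HttpWebRequest)WebRequest.Create(url);
        req.Method = "GET";
        req.UserAgent = "Mozilla/5.0 (Windows; U; MSIE 9.0; Windows NT 9.0; en-US)";

        // Initialize at declaration so every path (including the catch)
        // returns a value -- this was the missing piece.
        string source = "";

        try
        {
            using (WebResponse resp = req.GetResponse())
            using (StreamReader reader = new StreamReader(resp.GetResponseStream()))
            {
                source = reader.ReadToEnd();
            }
        }
        catch (WebException ex)
        {
            // A 404 surfaces as a WebException that carries the error response.
            HttpWebResponse errorResp = ex.Response as HttpWebResponse;
            if (errorResp == null || errorResp.StatusCode != HttpStatusCode.NotFound)
                throw; // not a 404: let anything else propagate

            // 404: fall through and return the empty string so the scraper continues
        }

        return source;
    }
}
```

With this shape, a missing page simply yields an empty string and the scraping loop moves on to the next search term, while genuinely unexpected errors still surface.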