I have over 2,000 URL calls to make, and with the code below it takes almost 2 minutes to complete. Could someone help me speed the process up?
private void button4_Click(object sender, EventArgs e)
{
    WebRequest req;
    WebResponse res;

    string[] lines = File.ReadAllLines(@"c:\data\temp.txt");

    for (int i = 0; i < lines.Count(); i++)
    {
        req = WebRequest.Create(lines[i]);
        res = req.GetResponse();

        StreamReader rd = new StreamReader(res.GetResponseStream(), Encoding.ASCII);
        rd.Close();
        res.Close();

        textBox1.Text += ".";
    }
}
Many thanks
You can't speed things up much because the bottleneck is your Internet connection. However, there are a few things you can do:
1) Do not use LINQ's Count() on the lines: it's an array and its length is already known (a micro-optimization; you won't ever notice this change).
2) Use using to release disposable objects (nothing to do with speed, but better error handling: if something goes wrong in your code, resources are released right away instead of waiting for the GC).
3) Make the requests parallel. This will speed things up a little bit:
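For example, something along these lines (just a sketch: the file path is the one from your code, and MaxDegreeOfParallelism = 10 is an arbitrary starting value to tune):

// Rough sketch: issue the requests concurrently instead of one by one.
// Requires the System.IO, System.Net, System.Text and System.Threading.Tasks namespaces.
Parallel.ForEach(
    File.ReadLines(@"c:\data\temp.txt"),
    new ParallelOptions { MaxDegreeOfParallelism = 10 }, // tune this for your connection
    url =>
    {
        var request = WebRequest.Create(url);

        using (var response = request.GetResponse())
        using (var reader = new StreamReader(response.GetResponseStream(), Encoding.ASCII))
        {
            reader.ReadToEnd(); // consume the response body
        }
    });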
A few more notes:
MaxDegreeOfParallelism sets the maximum number of concurrent requests. Multiple active concurrent connections won't speed things up indefinitely, and they may even slow things down; a few trials will help you find a reasonable value.
There is no error checking, but network operations may fail temporarily and then work as expected after a short delay. I suggest you also read System.Net.WebException: The remote name could not be resolved and this for I/O operations.
To make it a more complete example, your click event handler will be:
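Something like this (a sketch: ProcessUrls is the method shown in the next snippet, and the work is moved to a background thread to keep the UI responsive):

// Requires System.Threading.Tasks in addition to the usual WinForms namespaces.
private void button4_Click(object sender, EventArgs e)
{
    button4.Enabled = false;

    // Run the whole batch on a background thread, then re-enable the button
    // on the UI thread when everything is done.
    Task.Factory.StartNew(() => ProcessUrls(@"c:\data\temp.txt"))
        .ContinueWith(t => button4.Enabled = true, // t.Exception holds any error that occurred
            TaskScheduler.FromCurrentSynchronizationContext());
}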
Actual code to process each URL and to read the URL list:
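Roughly like this (again a sketch: MaxDegreeOfParallelism = 10 is an arbitrary value to tune, and TryExecute is the retry helper shown last):

// Requires the System.IO, System.Net, System.Text and System.Threading.Tasks namespaces.
private void ProcessUrls(string path)
{
    // File.ReadLines streams the file instead of loading it all in memory at once.
    Parallel.ForEach(
        File.ReadLines(path),
        new ParallelOptions { MaxDegreeOfParallelism = 10 },
        ProcessUrl);
}

private void ProcessUrl(string url)
{
    // TryExecute (defined below) retries on transient network errors.
    // The using blocks dispose response and reader even when something fails.
    TryExecute(() =>
    {
        var request = WebRequest.Create(url);

        using (var response = request.GetResponse())
        using (var reader = new StreamReader(response.GetResponseStream(), Encoding.ASCII))
        {
            reader.ReadToEnd();
        }
    });

    // If you still want the progress dots, marshal back to the UI thread:
    // textBox1.BeginInvoke(new Action(() => textBox1.Text += "."));
}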
With this helper method to hide the try/wait pattern for network operations:
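For example (a sketch: the name TryExecute, the number of attempts and the delay are arbitrary choices you'll want to adjust):

// Hypothetical retry helper: run the action and, on a transient network or I/O
// error, wait a bit and try again, up to maxAttempts times.
// Requires the System, System.IO, System.Net and System.Threading namespaces.
private static void TryExecute(Action action, int maxAttempts = 3, int retryDelayMilliseconds = 1000)
{
    for (int attempt = 1; ; attempt++)
    {
        try
        {
            action();
            return;
        }
        catch (WebException)
        {
            if (attempt >= maxAttempts)
                throw;
        }
        catch (IOException)
        {
            if (attempt >= maxAttempts)
                throw;
        }

        // Give the network a moment before retrying.
        Thread.Sleep(retryDelayMilliseconds);
    }
}

If a URL still fails after all the attempts, the exception bubbles up and Parallel.ForEach rethrows it wrapped in an AggregateException, which you can inspect in the task continuation.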