Measuring time spent sending data to a client


I have tried measuring time for socket send time:

stopWatch.Start();
socket.Send(buffer);
socket.Close();
stopWatch.Stop();

But about 95% of the time the stopwatch reports less than one second, even with a 5 MB or larger buffer. socket.Send() returns within 0.5–1 ms and my code reports the file as sent, even though the user is still downloading it. How can I change my code so that the correct upload time is returned? Thank you.
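One way to measure the real transfer time is an application-level acknowledgment: `Socket.Send` only guarantees the data reached the OS send buffer, so the server has to wait until the client confirms it has read everything. A minimal sketch, assuming you also control the client and can make it send a one-byte ACK after reading the full payload (the ACK is a hypothetical convention of this sketch, not part of the Socket API):

```csharp
using System;
using System.Diagnostics;
using System.Net.Sockets;

// Assumes `socket` is a connected Socket and `buffer` holds the payload.
var stopWatch = Stopwatch.StartNew();

socket.Send(buffer);

// Block until the client sends a 1-byte application-level ACK
// after it has read the whole payload.
var ack = new byte[1];
socket.Receive(ack);

stopWatch.Stop();
Console.WriteLine($"Transfer took {stopWatch.Elapsed.TotalSeconds:F2} s");
socket.Close();
```

The stopwatch now spans a full network round trip, so the measured time includes the download itself plus one extra latency hop for the ACK.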

3 Answers


I just realized that Marc may indeed be correct: `Send` returns as soon as the data has been handed to the operating system's send buffer, not when the client has received it. Of course, the code you posted is not enough to provide any guidance one way or the other.
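If Marc's point is the issue, one partial workaround is to shrink the socket's send buffer so that `Send` blocks while data actually drains onto the network, instead of returning as soon as the whole payload fits in the kernel buffer. A sketch (still an approximation: the final `SendBufferSize` bytes may be in flight when `Send` returns):

```csharp
using System.Diagnostics;
using System.Net.Sockets;

// Assumes `socket` is a connected Socket and `buffer` holds the payload.
// Shrink the OS send buffer from its default so a 5 MB Send() cannot
// complete until most of the data has left the machine.
socket.SendBufferSize = 8 * 1024; // 8 KB; value is an illustrative choice

var stopWatch = Stopwatch.StartNew();
socket.Send(buffer);
stopWatch.Stop();
```

This gets the measurement closer to the wire time, but only a client-side acknowledgment measures the full download.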


You could try:

// Wall-clock timing with DateTime. Note this has the same problem as
// the Stopwatch version (Send returns once the data is buffered), and
// DateTime.Now has much coarser resolution (~15 ms) than Stopwatch.
DateTime start = DateTime.Now;

socket.Send(buffer);
socket.Close();

TimeSpan span = DateTime.Now.Subtract(start);
double msec = span.TotalMilliseconds;

MarcB seems to be correct. On top of that, make sure you get full precision from your stopwatch, e.g.:

// Stopwatch ticks are not TimeSpan ticks: divide by Stopwatch.Frequency
// (ticks per second) to convert the raw count to seconds.
double seconds = (double)stopWatch.ElapsedTicks / Stopwatch.Frequency;
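For what it's worth, `Stopwatch.Elapsed.TotalSeconds` already performs this same conversion internally, so the manual division and the built-in property should agree to within rounding. A small self-contained check:

```csharp
using System;
using System.Diagnostics;
using System.Threading;

var sw = Stopwatch.StartNew();
Thread.Sleep(50); // simulate some work
sw.Stop();

// Manual conversion from raw Stopwatch ticks to seconds.
double manual = (double)sw.ElapsedTicks / Stopwatch.Frequency;
// Built-in property; computed from the same tick count.
double builtIn = sw.Elapsed.TotalSeconds;

// Both values are in seconds and differ only by TimeSpan's 100 ns rounding.
Console.WriteLine($"{manual:F6} vs {builtIn:F6}");
```

Either form gives sub-millisecond resolution, unlike `DateTime.Now`.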