I've written a simple test client (VB.Net) that uses the TcpClient class to send a small "command" byte array (6 or 7 bytes) to a hardware device across the company network. The device responds with a byte array of approximately 48 KB, which my client app reads. I know the exact length of the returned array, so I repeatedly read from the socket in a loop until I have it all:
Dim read_data() As Byte
Dim stream As NetworkStream = tcpClient.GetStream()

' Send the "command"
stream.Write(commandByteArray, 0, commandByteArray.Length)

' Read the response
Using ms As New MemoryStream()
    Dim sw As Stopwatch = Stopwatch.StartNew()
    Dim temp_read(16383) As Byte ' 16 KB buffer (VB array declarations give the upper bound, not the length)
    Try
        Do
            Dim bytes_read = stream.Read(temp_read, 0, temp_read.Length)
            If bytes_read = 0 Then Exit Do ' Remote end closed the connection
            ms.Write(temp_read, 0, bytes_read)
        Loop While ms.Length < expectedResponseSize
    Catch ex As IOException
        ' No more data
    End Try
    read_data = ms.ToArray()
    Debug.WriteLine(sw.ElapsedMilliseconds)
End Using
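Incidentally, since expectedResponseSize is known up front, the loop could skip the MemoryStream entirely and read straight into a pre-sized array. A minimal sketch of that variant, using the same stream and expectedResponseSize as above (the response buffer name is just for illustration):

' Variant: read directly into a buffer sized to the expected response
Dim response(expectedResponseSize - 1) As Byte
Dim total_read As Integer = 0
Do While total_read < expectedResponseSize
    Dim bytes_read = stream.Read(response, total_read, expectedResponseSize - total_read)
    If bytes_read = 0 Then Exit Do ' Remote end closed the connection early
    total_read += bytes_read
Loop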
All of this is wired up to a button click event. Each time I run it, the stopwatch shows an average of around 5ms to read the 48 KB response. However, if I click the button several times per second, after a few seconds the time reported by the stopwatch starts to increase, eventually settling at around 50ms. If I disconnect, reconnect and try again, the time returns to 5ms (until I start rapidly clicking the button again).
Any idea what's causing this? I'm assuming it's something in the network protocol or network hardware, e.g. something adapting the connection to the increased data throughput?
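One diagnostic that might narrow it down: timing each individual Read call, to see whether the extra ~45ms appears on the first read (pointing at round-trip or device latency) or is spread across every read (pointing at something throttling throughput mid-transfer). A sketch of the instrumented loop, dropped into the code above:

' Diagnostic sketch: log how long each Read call waits
Dim sw As Stopwatch = Stopwatch.StartNew()
Dim last_ms As Long = 0
Do
    Dim bytes_read = stream.Read(temp_read, 0, temp_read.Length)
    If bytes_read = 0 Then Exit Do
    ms.Write(temp_read, 0, bytes_read)
    Debug.WriteLine(String.Format("{0} bytes after {1}ms", bytes_read, sw.ElapsedMilliseconds - last_ms))
    last_ms = sw.ElapsedMilliseconds
Loop While ms.Length < expectedResponseSize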
Edit: as mentioned in the comments below, I've tried disabling Nagle's algorithm (socket.NoDelay = True) but it had no effect, although my understanding is that this would be more likely to improve throughput for very small messages, not large ones like this.
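For reference, the option can be set either on the TcpClient wrapper or on the underlying Socket; both set the same TCP_NODELAY option:

' Disable Nagle's algorithm on the connection
tcpClient.NoDelay = True
' ...or equivalently via the underlying socket:
tcpClient.Client.NoDelay = True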