CopyToAsync vs ReadAsStreamAsync for huge request payload


I have to compute a hash for a huge payload, so I am using streams to avoid loading the entire request content into memory. The question is: what are the differences between this code:

using (var md5 = MD5.Create())
using (var stream = await authenticatableRequest.request.Content.ReadAsStreamAsync())
{
    return md5.ComputeHash(stream);
}

And that one:

using (var md5 = MD5.Create())  
using (var stream = new MemoryStream())
{
    await authenticatableRequest.request.Content.CopyToAsync(stream);
    stream.Position = 0;

    return md5.ComputeHash(stream);
}

I expect the same behavior internally, but maybe I am missing something.


3 Answers

BEST ANSWER

The first version looks OK: let the hasher handle the stream reading; it was designed for that.

ComputeHash(stream) will read blocks in a while loop and call TransformBlock() repeatedly.
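Internally it behaves roughly like the following sketch (a simplified illustration of that streaming loop, not the actual BCL source; the 4096-byte chunk size is an arbitrary choice):

```csharp
using System;
using System.IO;
using System.Security.Cryptography;

static class StreamHashing
{
    // Simplified sketch of what HashAlgorithm.ComputeHash(Stream) does:
    // read fixed-size chunks and feed them to the hash incrementally,
    // so memory use stays constant regardless of payload size.
    public static byte[] ComputeHashChunked(HashAlgorithm hash, Stream input)
    {
        var buffer = new byte[4096]; // illustrative chunk size
        int read;
        while ((read = input.Read(buffer, 0, buffer.Length)) > 0)
        {
            hash.TransformBlock(buffer, 0, read, null, 0);
        }
        hash.TransformFinalBlock(Array.Empty<byte>(), 0, 0);
        return hash.Hash;
    }
}
```

Whatever the chunk size, only one small buffer is alive at a time, which is exactly why the first snippet scales to huge payloads.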

But the second piece of code buffers the entire payload into memory first, so don't do that:

using (var stream = new MemoryStream())
{
    await authenticatableRequest.request.Content.CopyToAsync(stream);
    // ...
}
ANSWER

The second snippet will not only load everything into memory, it will use even more memory than HttpContent.ReadAsByteArrayAsync().

A MemoryStream is a Stream API over a byte[] buffer whose initial size is zero. Each time the data written exceeds the current capacity, the buffer is reallocated at twice its previous size and the old contents are copied over. This can create a lot of temporary buffers, and the final buffer's size can far exceed the actual content length.
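The growth is observable through MemoryStream.Capacity. The exact sequence is an implementation detail, but a small demo like this shows the reallocation points (the 1 KB chunk size and 64 iterations are arbitrary choices):

```csharp
using System;
using System.IO;

class CapacityGrowthDemo
{
    static void Main()
    {
        using var ms = new MemoryStream();
        int lastCapacity = 0;
        var chunk = new byte[1024];
        for (int i = 0; i < 64; i++)
        {
            ms.Write(chunk, 0, chunk.Length);
            if (ms.Capacity != lastCapacity)
            {
                // Each reallocation abandons the previous buffer to the GC.
                Console.WriteLine($"wrote {ms.Length,6} bytes, capacity now {ms.Capacity}");
                lastCapacity = ms.Capacity;
            }
        }
    }
}
```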

This can be avoided by allocating the maximum expected buffer size from the beginning by providing the capacity parameter to the MemoryStream() constructor.
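For instance, when the request carries a Content-Length header, that value can seed the capacity. A hedged sketch (BufferAsync is a hypothetical helper name, not part of any framework):

```csharp
using System;
using System.IO;
using System.Net.Http;
using System.Threading.Tasks;

static class ContentBuffering
{
    // Pre-sizing the MemoryStream from Content-Length (when the sender
    // supplies one) avoids the repeated doubling reallocations.
    public static async Task<MemoryStream> BufferAsync(HttpContent content)
    {
        long? length = content.Headers.ContentLength;
        var stream = (length.HasValue && length.Value <= int.MaxValue)
            ? new MemoryStream(capacity: (int)length.Value)
            : new MemoryStream(); // chunked transfer: no length known up front
        await content.CopyToAsync(stream);
        stream.Position = 0;
        return stream;
    }
}
```

With chunked transfer encoding there is no Content-Length, so the fallback path still pays the reallocation cost.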

At best, this will be similar to calling:

var bytes = await authenticatableRequest.request.Content.ReadAsByteArrayAsync();
return md5.ComputeHash(bytes);
ANSWER

"I expect the same behavior internally"

Why? In one case you must load everything into memory (you explicitly create a MemoryStream); in the other case, not necessarily.