I have a Web API that reads a file from Azure Blob Storage and downloads it into a byte array. The client receives this byte array and downloads it as a PDF. This does not work well with large files, and I am not able to figure out how to send the bytes in chunks from the Web API to the client.
Below is the Web API code, which just returns the byte array to the client:
CloudBlockBlob blockBlob = container.GetBlockBlobReference(fileName);
blockBlob.FetchAttributes();

// The entire blob is buffered into memory before being returned
byte[] data = new byte[blockBlob.Properties.Length];
blockBlob.DownloadToByteArray(data, 0);
return data;
The client-side code receives the data when the AJAX request completes, creates a hyperlink, and sets its download attribute, which downloads the file:
var a = document.createElement("a");
a.href = 'data:application/pdf;base64,' + data.$value;
a.setAttribute("download", filename);
The error occurred with a file of 1.86 MB. The browser displays the message: "Something went wrong while displaying the web page. To continue, reload the webpage."
The issue is most likely your server running out of memory on these large files. Don't load the entire file into a variable only to then send it out as the response. This causes a double download: your server has to download the file from Azure Storage and hold it all in memory, and then your client has to download it from the server. You can do a stream-to-stream copy instead, so memory is not chewed up. Here is an example from your Web API controller.
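The original answer's example code is not shown, so the following is a sketch of what such a streaming action could look like, assuming ASP.NET Web API 2 and the classic WindowsAzure.Storage SDK; the action name, route, and the `container` variable are placeholders for your own setup:

```csharp
using System.IO;
using System.Net;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Threading.Tasks;
using System.Web.Http;
using Microsoft.WindowsAzure.Storage.Blob;

public class FilesController : ApiController
{
    // "container" is assumed to be an initialized CloudBlobContainer
    private readonly CloudBlobContainer container;

    [HttpGet]
    public async Task<HttpResponseMessage> GetFile(string fileName)
    {
        CloudBlockBlob blockBlob = container.GetBlockBlobReference(fileName);

        // OpenReadAsync returns a stream over the blob. Wrapping it in
        // StreamContent lets the framework copy it to the response as it
        // is read, so the file is never fully buffered in server memory.
        Stream blobStream = await blockBlob.OpenReadAsync();

        var response = new HttpResponseMessage(HttpStatusCode.OK)
        {
            Content = new StreamContent(blobStream)
        };
        response.Content.Headers.ContentType =
            new MediaTypeHeaderValue("application/pdf");
        response.Content.Headers.ContentDisposition =
            new ContentDispositionHeaderValue("attachment")
            {
                FileName = fileName
            };
        return response;
    }
}
```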
I have used (almost exactly) this code and have been able to download files well over 1GB in size.
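On the client side, building a multi-megabyte base64 `data:` URL can itself fail in some browsers, which enforce length limits on data URLs. A sketch of an alternative that fetches the response as a Blob and downloads it via a short object URL (the endpoint path is a placeholder):

```javascript
// Download the PDF as a Blob instead of a base64 data: URL.
// "/api/files/report.pdf" stands in for your own endpoint.
fetch("/api/files/report.pdf")
    .then(function (response) { return response.blob(); })
    .then(function (blob) {
        var url = URL.createObjectURL(blob); // short object URL, no base64
        var a = document.createElement("a");
        a.href = url;
        a.setAttribute("download", "report.pdf");
        document.body.appendChild(a);
        a.click();                 // trigger the save dialog
        URL.revokeObjectURL(url);  // release the blob reference
    });
```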