I've been working with FastAPI and I've run into an issue distributing large files. While I can upload them to a FastAPI server without any problem:

    async with aiofiles.open(temp_file_path, 'wb') as f:
        while chunk := await file.read(CHUNK_SIZE):
            await f.write(chunk)

I cannot manage to distribute the files.
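
For context, that upload snippet sits inside an endpoint along these lines (a rough sketch; the route, CHUNK_SIZE value, and temp path are placeholders):

    import aiofiles
    from fastapi import FastAPI, UploadFile

    app = FastAPI()
    CHUNK_SIZE = 1024 * 1024  # 1 MiB; placeholder value

    @app.post("/upload")
    async def upload(file: UploadFile):
        temp_file_path = f"/tmp/{file.filename}"
        # Copy the upload to disk in chunks, so the whole file is never held in memory
        async with aiofiles.open(temp_file_path, "wb") as f:
            while chunk := await file.read(CHUNK_SIZE):
                await f.write(chunk)
        return {"saved_to": temp_file_path}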

For distributing (serving the file back out), this is what I have:

    import io
    from starlette.responses import StreamingResponse

    file = open(filename, "rb", buffering=10485760)  # 10 MiB buffer hint
    headers = {'Content-Disposition': 'attachment; filename="' + filename + '"'}
    # file.read() pulls the entire file into memory before the response starts
    return StreamingResponse(content=io.BytesIO(file.read()), media_type="application/octet-stream", headers=headers)

I tried using the buffering argument, but it doesn't seem to have any effect here: file.read() still loads the entire file into memory before StreamingResponse sends anything.

How can I return a file incrementally, in chunks, so that all of it isn't loaded into memory at once?
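
What I'm imagining is a generator-based response along these lines (just a sketch; iter_file and the chunk size are placeholder names/values I came up with), but I'm not sure it's the right approach:

    import aiofiles
    from fastapi import FastAPI
    from starlette.responses import StreamingResponse

    app = FastAPI()
    CHUNK_SIZE = 1024 * 1024  # placeholder chunk size

    async def iter_file(path: str):
        # Yield the file in fixed-size chunks instead of reading it all at once
        async with aiofiles.open(path, "rb") as f:
            while chunk := await f.read(CHUNK_SIZE):
                yield chunk

    @app.get("/download/{filename}")
    async def download(filename: str):
        headers = {'Content-Disposition': 'attachment; filename="' + filename + '"'}
        return StreamingResponse(iter_file(filename), media_type="application/octet-stream", headers=headers)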


There is 1 answer below.


Most definitely not my best and brightest moment! FileResponse takes a path parameter and serves the file from disk, which is far faster. Since the uploaded file is already saved to disk, specifying its path and simply letting FastAPI serve it works.
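
A minimal sketch of what ended up working (the route and parameter names here are illustrative):

    from fastapi import FastAPI
    from fastapi.responses import FileResponse

    app = FastAPI()

    @app.get("/download/{filename}")
    async def download(filename: str):
        # FileResponse sends the file from disk in chunks, so the whole
        # file never has to be read into memory up front.
        return FileResponse(path=filename, media_type="application/octet-stream", filename=filename)

Passing filename= also sets the Content-Disposition header, so the manual header from the question isn't needed.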