I am using a multipart parser to send an HTTP POST request and I need to send a big file (4 GB). However, when I test my code it throws a bad_alloc exception and crashes. I tried it with smaller files and it worked for files under 500 MB, but the bigger the files get, the more often it crashes. Can you help me?
Here is the code where the crash occurs, when it builds the body of the request:
std::ifstream ifile(file.second, std::ios::binary | std::ios::ate); //the file I am trying to send
std::streamsize size = ifile.tellg();
ifile.seekg(0, std::ios::beg);
char *buff = new char[size];
ifile.read(buff, size);
ifile.close();
std::string ret(buff, size); // this line throws the bad_alloc exception
delete[] buff;
return ret; // return the contents of the file as a string
Thanks
std::string ret(buff, size);
creates a copy of buff. So you essentially double the memory consumption: https://www.cplusplus.com/reference/string/string/string/
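If you really do need the whole file in a single std::string, you can at least avoid the intermediate buffer by allocating the string up front and reading straight into it. A minimal sketch, assuming C++11 or later and reusing the file.second from your code (this removes the doubled allocation, but you still need enough RAM for the file itself):

#include <fstream>
#include <string>

std::ifstream ifile(file.second, std::ios::binary | std::ios::ate);
std::string ret(static_cast<std::size_t>(ifile.tellg()), '\0'); // one allocation for the whole file
ifile.seekg(0, std::ios::beg);
ifile.read(&ret[0], static_cast<std::streamsize>(ret.size())); // read directly into the string
return ret;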
Then the question becomes how much memory you actually have, and how much of it your OS allows you to allocate (e.g. ulimit on Linux).
As the comments say, you should chunk your read and send the individual chunks with multiple POST requests.
You can loop over the ifstream and check whether ifile.eof() has been reached as the exit criterion for your loop.
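A minimal sketch of such a loop, assuming a hypothetical send_chunk() that issues one POST request per chunk:

#include <fstream>
#include <vector>

std::ifstream ifile(file.second, std::ios::binary);
std::vector<char> chunk(8 * 1024 * 1024); // 8 MB per request, tune as needed

while (!ifile.eof()) {
    ifile.read(chunk.data(), static_cast<std::streamsize>(chunk.size()));
    std::streamsize got = ifile.gcount(); // the last chunk is usually shorter
    if (got <= 0)
        break; // open/read error: bail out instead of looping forever
    send_chunk(chunk.data(), got); // hypothetical: one POST per chunk
}

Note that gcount() is checked after every read, because the final read normally returns fewer bytes than requested.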
You need to consider error handling and such to not leak buff or leave ifile open.
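One way to get both guarantees automatically is RAII: replace the raw new[] with a std::vector and let destructors do the cleanup. A sketch, with read_all() as a hypothetical helper name:

#include <fstream>
#include <stdexcept>
#include <string>
#include <vector>

std::vector<char> read_all(const std::string& path) { // hypothetical helper
    std::ifstream ifile(path, std::ios::binary | std::ios::ate);
    if (!ifile)
        throw std::runtime_error("cannot open " + path);
    std::vector<char> buff(static_cast<std::size_t>(ifile.tellg()));
    ifile.seekg(0, std::ios::beg);
    if (!ifile.read(buff.data(), static_cast<std::streamsize>(buff.size())))
        throw std::runtime_error("read failed: " + path);
    return buff; // the vector frees its memory and the ifstream closes the file automatically
}

Even if an exception is thrown mid-read, nothing leaks and the file is closed.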