I am reading an object from S3 using boto3, which gives me a `bytes` object. The file is around 520 MB, and I need to upload it to a new FTP server.
The upload to the FTP server uses a multipart upload session and can only send at most 50 MB of bytes in one go.
How can I split this one big `bytes` object into multiple smaller ones, or a list of 50 MB chunks, so that I can call the multipart upload session once per chunk?
import boto3

s3 = boto3.client('s3')
s3_response_object = s3.get_object(Bucket='test', Key='prefix/file.test')
object_content = s3_response_object['Body'].read()  # whole object in memory as bytes (~520 MB)
print(len(object_content))
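To frame what I am after, here is a minimal sketch of the kind of chunking I have in mind; `upload_part` is just a placeholder for whatever call my FTP multipart session actually exposes, and 50 MB is assumed to mean 50 * 1024 * 1024 bytes:

CHUNK_SIZE = 50 * 1024 * 1024  # 50 MB per part

# Slice the bytes object into consecutive 50 MB chunks
chunks = [object_content[i:i + CHUNK_SIZE]
          for i in range(0, len(object_content), CHUNK_SIZE)]

for part_number, chunk in enumerate(chunks, start=1):
    # upload_part is a placeholder for the real multipart FTP upload call
    upload_part(part_number, chunk)

Is this slicing approach the right way to do it, or is there a better way that avoids holding all the copies in memory at once (e.g. slicing a memoryview instead of the bytes object)?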