I'm trying to write a DataFrame to a CSV file directly in an S3 bucket. I've tried the StringIO method, but the problem is that I run into a "KeyTooLongError".
import boto3
client = boto3.client('s3')
client.create_bucket(Bucket='poolpo-rent-a-car-bucket')
# checking if the bucket was created
response = client.list_buckets()
response['Buckets']
bucket_name = 'poolpo-rent-a-car-bucket'
car_costs.to_csv(f"s3://{bucket_name}/{car_costs}.csv")
This is the StringIO approach:
from io import StringIO
bucket_name = 'poolpo-rent-a-car-bucket'
csv_buffer = StringIO()
branch_locations.to_csv(csv_buffer)
s3_resource = boto3.resource('s3')
s3_resource.Object(bucket_name, f'{branch_locations}.csv').put(Body=csv_buffer.getvalue())
And the error:
ClientError: An error occurred (KeyTooLongError) when calling the PutObject operation: Your key is too long
These are medium-sized DataFrames, around 5000 rows and 3-5 columns.
For an unrelated reason, I had to reinstall Anaconda, and the problems went away. I ended up using a much simpler approach.
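Something like this minimal sketch, assuming the s3fs package is installed so pandas can write to s3:// paths, and using car_costs.csv as an illustrative key name:

import pandas as pd

# small example frame standing in for the original car_costs DataFrame
car_costs = pd.DataFrame({'car': ['a', 'b'], 'cost': [100, 200]})

bucket_name = 'poolpo-rent-a-car-bucket'
# the key is a short literal string, not the DataFrame's repr
car_costs.to_csv(f"s3://{bucket_name}/car_costs.csv", index=False)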
One other thing I noticed in S3 was that when I used the f-string to build the key, I was interpolating the DataFrame itself into the name: f"{branch_locations}.csv" expands to the frame's entire string representation, which exceeds S3's 1024-byte limit on object keys, hence the KeyTooLongError.
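The same fix applies to the StringIO version; a sketch, with branch_locations.csv chosen as an illustrative key:

from io import StringIO

import boto3
import pandas as pd

# stand-in for the original branch_locations DataFrame
branch_locations = pd.DataFrame({'branch': ['x', 'y'], 'lat': [1.0, 2.0]})

bucket_name = 'poolpo-rent-a-car-bucket'
csv_buffer = StringIO()
branch_locations.to_csv(csv_buffer)

s3_resource = boto3.resource('s3')
# the key is now a short literal string instead of str(branch_locations)
s3_resource.Object(bucket_name, 'branch_locations.csv').put(Body=csv_buffer.getvalue())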