I am using the Serverless framework with the serverless-s3-local plugin to test my code during development. However, despite being in offline mode, the real S3 bucket is being written to. How can I alter my configuration to use a local fake S3 bucket when in offline mode?
Relevant serverless.yml sections:
plugins:
- serverless-stack-output
- serverless-plugin-include-dependencies
- serverless-layers
- serverless-deployment-bucket
- serverless-s3-local
- serverless-offline
custom:
  #...
  s3:
    bucketName: test-s3-buck
    host: localhost
  serverless-offline:
    ignoreJWTSignature: true
    httpPort: 4000
    noAuth: true
    directory: /tmp
resources:
  Resources:
    #...
    Bucket:
      Type: AWS::S3::Bucket
      Properties:
        BucketName: ${self:custom.s3.bucketName}
The endpoint calling S3:
import boto3

def post(event, context):
    s3_path = "/test.txt"
    body = "test"
    encoded_string = body.encode("utf-8")

    s3 = boto3.resource("s3")
    bucket_name = "test-s3-buck"
    s3.Bucket(bucket_name).put_object(Key=s3_path, Body=encoded_string)

    response = {
        "statusCode": 200,
        "body": "Created."
    }
    return response
Launching Serverless Offline:
serverless offline start
The README of serverless-s3-local notes that you can achieve the same with boto. This means that when you run serverless offline start, you need to set the AWS access key ID to S3RVER and the AWS secret access key to S3RVER; otherwise, the real bucket will be used.

The README also has instructions for setting up an s3local AWS profile: https://github.com/ar90n/serverless-s3-local#triggering-aws-events-offline

Another way to achieve it is to run your command with environment variables:
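A minimal sketch of that invocation, assuming the standard AWS SDK environment variable names and the S3RVER credentials the plugin expects:

```shell
# Fake credentials recognized by serverless-s3-local; with these set,
# boto3 signs requests for the local fake S3 instead of real AWS.
AWS_ACCESS_KEY_ID=S3RVER AWS_SECRET_ACCESS_KEY=S3RVER serverless offline start
```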
That way, the AWS SDK inside your code will read the correct values for offline mode.