Reading S3 file content fails using the AWS Chalice framework


I am trying to read the contents of an S3 file that I wrote earlier. When I read it from a local boto3 script I can see the content of the file, but when I run the code on Lambda using Chalice I get a "NotFoundError".

Here is the code for the Chalice app:

from chalice import Chalice
import boto3
import json
from botocore.exceptions import ClientError
from chalice import NotFoundError

app = Chalice(app_name='scraper-one')
app.debug = True

s3 = boto3.resource('s3')
bucket = 'scraper-one'
key = 'csv'


@app.route("/")
def index():
    return {"hello": "world"}


@app.route('/{content}', methods=['GET', 'PUT'])
def to_s3(content):
    request = app.current_request
    if request.method == 'PUT':
        s3.Object(bucket, key).put(Body=content)
        return {"content": content}

    elif request.method == 'GET':
        try:
            obj = s3.Object(bucket, key)
            return json.loads(obj.get()['Body'].read().decode('utf-8'))
        except ClientError as e:
            raise NotFoundError(key)

Here is the local script, which works fine:

import boto3

s3 = boto3.resource('s3')
bucket = 'scraper-one'
key = 'csv'
obj = s3.Object(bucket, key)
print(obj.get()['Body'].read().decode('utf-8'))

Any ideas on what I am missing?

1 Answer

Verify the access levels for the Lambda function. I faced the same issue, and giving the Lambda function S3 access solved it. Also verify that the Lambda function and the S3 bucket are in the same region.
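
As a sketch of what granting that access can look like with Chalice: you can disable the auto-generated policy for a stage in .chalice/config.json and point it at your own IAM policy file (the "dev" stage name and policy file name below follow Chalice's defaults; adjust them to your project):

{
  "version": "2.0",
  "app_name": "scraper-one",
  "stages": {
    "dev": {
      "api_gateway_stage": "api",
      "autogen_policy": false,
      "iam_policy_file": "policy-dev.json"
    }
  }
}

Then a .chalice/policy-dev.json that grants access to the bucket (the CloudWatch Logs statement is included because a custom policy completely replaces the auto-generated one):

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:GetObject", "s3:PutObject"],
      "Resource": "arn:aws:s3:::scraper-one/*"
    },
    {
      "Effect": "Allow",
      "Action": ["logs:CreateLogGroup", "logs:CreateLogStream", "logs:PutLogEvents"],
      "Resource": "arn:aws:logs:*:*:*"
    }
  ]
}

After redeploying with chalice deploy, it can also help to log e.response['Error']['Code'] in the except block before raising NotFoundError, so you can see whether the underlying error is AccessDenied rather than NoSuchKey.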