Amazon Bedrock class can't load my credentials when called via Lambda function


So I created a Lambda function for a script that essentially allows a user to pass a query to the Amazon Titan LLM on Amazon Bedrock. Here is the content of my main.py file in my deployment package.

from langchain.llms.bedrock import Bedrock
import boto3
from langchain.retrievers import AmazonKendraRetriever
from langchain.chains import RetrievalQA
from langchain.prompts import PromptTemplate
import json
from botocore.exceptions import ClientError

def get_secret():
    secret_name = "kendraRagApp"

    # Create a Secrets Manager client
    session = boto3.session.Session()
    client = session.client(
        service_name='secretsmanager',
    )

    try:
        get_secret_value_response = client.get_secret_value(
            SecretId=secret_name
        )
    except ClientError as e:
        raise e

    # Decrypts secret using the associated KMS key.
    secret = get_secret_value_response['SecretString']
    return secret   
def qa(query):
    secrets = json.loads(get_secret())
    kendra_index_id = secrets['kendra_index_id']

    llm = Bedrock(model_id="amazon.titan-tg1-large", region_name='us-east-1', credentials_profile_name='default')
    llm.model_kwargs = {"maxTokenCount": 4096}
    

    retriever = AmazonKendraRetriever(index_id=kendra_index_id)
    
    prompt_template = """
    {context}
    {question} If you are unable to find the relevant article, respond 'I can't generate the needed content based on the context provided.'
    """
    
    PROMPT = PromptTemplate(
        template=prompt_template, input_variables=["context", "question"]
    )

    chain = RetrievalQA.from_chain_type(
        llm=llm,
        retriever=retriever,
        verbose=True,
        chain_type_kwargs={"prompt": PROMPT},
    )
    
    return chain(query)

def handler(event, context):
    query = event['query']
    response = qa(query)
    if response.get("result"):
        return {
            'statusCode': 200,
            'body': response["result"]
        }
    else:
        return {
            'statusCode': 400,
            'body': "Could not answer the query based on the context available"
        }
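
For reference, the handler expects an event with a "query" key. A minimal way to invoke it from Python looks roughly like this (the function name is just a placeholder):

import json
import boto3

# Placeholder function name -- substitute the name of the actual Lambda function.
lambda_client = boto3.client("lambda", region_name="us-east-1")

response = lambda_client.invoke(
    FunctionName="kendra-rag-app",
    Payload=json.dumps({"query": "How do I reset my password?"}),
)
print(json.loads(response["Payload"].read()))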

The Lambda function was created successfully, but when I try to invoke it, I get the following validation error; apparently, Bedrock could not load my credentials for authentication.

{
  "errorMessage": "1 validation error for Bedrock\n__root__\n  Could not load credentials to authenticate with AWS client. Please check that credentials in the specified profile name are valid. (type=value_error)",
  "errorType": "ValidationError",
  "requestId": "b772f236-f582-4308-8af5-b5a418d4327f",
  "stackTrace": [
    "  File \"/var/task/main.py\", line 62, in handler\n    response = qa(query)\n",
    "  File \"/var/task/main.py\", line 32, in qa\n    llm = Bedrock(model_id=\"amazon.titan-tg1-large\", region_name='us-east-1',) #client=BEDROCK_CLIENT)\n",
    "  File \"/var/task/langchain/load/serializable.py\", line 74, in __init__\n    super().__init__(**kwargs)\n",
    "  File \"pydantic/main.py\", line 341, in pydantic.main.BaseModel.__init__\n    raise validation_error\n"
  ]
}

I have looked at the Bedrock class definition but couldn't find enough information on how to pass my credentials to it. Mind you, my code runs without issues from my SageMaker notebook (I guess because authentication is handled automatically there). I would appreciate any useful help. Thanks.

Edit: omitting the credentials_profile_name parameter when calling the Bedrock class does not fix it. Calling the Lambda function from a local environment with authentication set up does not resolve the issue either.


There are 2 best solutions below


The likely issue is that AWS credentials are not configured on the machine the code runs on. Because you pass credentials_profile_name='default' into the Bedrock constructor, it tries to load credentials from the local default profile.

SageMaker notebooks do this automatically, but on most other machines you have to do this yourself.

In order to do this, you would normally configure credentials on that machine yourself (for example with aws configure or via environment variables).

Having said that, you don't have to provide any specific credentials to Bedrock; it uses boto3.Session() internally, which picks up whatever credentials are available.

This means that if you have configured a boto3 session with the proper credentials, you don't need to pass credentials_profile_name='default' into the constructor.
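
As a quick sanity check, you can first confirm that boto3 can resolve any credentials at all in the environment where the code runs; inside Lambda these come from the function's execution role rather than from a named profile:

import boto3

# Returns None if the default credential chain (execution role, env vars, etc.)
# cannot find credentials -- in that case the Bedrock class fails the same way.
credentials = boto3.Session().get_credentials()
print("credentials resolved:", credentials is not None)
print("region:", boto3.Session().region_name)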

If the boto3 Session has the required permissions, it should be sufficient to replace:

llm = Bedrock(model_id="amazon.titan-tg1-large", region_name='us-east-1', credentials_profile_name='default')

with:

llm = Bedrock(model_id="amazon.titan-tg1-large", region_name='us-east-1')

Try passing the bedrock client like this:

    llm1 = Bedrock(
        model_id="anthropic.claude-v1",
        model_kwargs={
            "temperature": 1,
        },
        region_name="us-east-1",
        client=bedrock_client,
    )

This link provides some more information on this issue.
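
If it's not clear where bedrock_client comes from, here is a sketch of creating it with boto3 and applying the same pattern to the Titan model from the question (this assumes a recent boto3 in which the model-invocation client is named "bedrock-runtime"):

    import boto3
    from langchain.llms.bedrock import Bedrock

    # Inside Lambda this client picks up the execution role's credentials
    # automatically, so no profile name is needed.
    bedrock_client = boto3.client("bedrock-runtime", region_name="us-east-1")

    llm = Bedrock(
        model_id="amazon.titan-tg1-large",
        model_kwargs={"maxTokenCount": 4096},
        client=bedrock_client,
    )

When a client is passed in explicitly, the Bedrock class skips its own credential lookup, which should sidestep the validation error shown in the question.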