LangChain is giving me a NumPy import error when creating a Lambda layer via AWS CDK


I am using a Mac M1 and creating a container image via AWS CDK (Python).

Layer:

langchain_lambda_layer = _alambda.PythonLayerVersion(
    self,
    "langchain-lambda-layer",
    entry="./aws_bedrock_langchain_python_cdk/lambda/layer/langchain_latest/",
    compatible_runtimes=[_lambda.Runtime.PYTHON_3_11],
)

requirements.txt

langchain==0.0.315
boto3
botocore

Lambda:

langchain_bedrock_example_lambda = _lambda.Function(
    self,
    "langchain-bedrock-example-lambda",
    handler="index.lambda_handler",
    code=_lambda.Code.from_asset(
        "./aws_bedrock_langchain_python_cdk/lambda/code/langchain_example/"
    ),
    runtime=_lambda.Runtime.PYTHON_3_11,
    architecture=_lambda.Architecture.ARM_64,
    role=lambda_role,
    layers=[langchain_lambda_layer],
    timeout=Duration.seconds(300),
    memory_size=1024,
)

Lambda Code:

from langchain.prompts import PromptTemplate
from langchain.llms import Bedrock
from langchain.chains import LLMChain

def lambda_handler(event, context):
    case_study = "Machine Learning engineer"  ## Software Developer, Web developer, Husband hahaha

    claude = Bedrock(
        model_id="anthropic.claude-v1",
    )
    claude.model_kwargs = {'temperature': 0.3, 'max_tokens_to_sample': 4096}
    
    template = """
    Human: How to be a good {case_study}?  \n Assistant:
    """
    
    prompt_template = PromptTemplate(
        input_variables=["case_study"],
        template=template
    )
    
    llm_chain = LLMChain(
        llm=claude, verbose=True, prompt=prompt_template
    )
    
    results = llm_chain(case_study)
    print(results["text"])

    return {
        'statusCode': 200,
        'case_results': results["text"]
    }

I am getting the error below:

{
  "errorMessage": "Unable to import module 'index': Error importing numpy: you should not try to import numpy from\n        its source directory; please exit the numpy source tree, and relaunch\n        your python interpreter from there.",
  "errorType": "Runtime.ImportModuleError",
  "requestId": "977004a7-f205-481a-8b53-7a4b5bf48d2b",
  "stackTrace": []
}

This code executes perfectly fine locally. Any help would be appreciated.

I tried different versions of boto3, botocore, and langchain, but that didn't work. I also destroyed the stack and recreated it.


1 Answer


While configuring the Lambda layer, I noticed that the compatible_architectures parameter was missing. The layer's container image was being built for my local machine's architecture, which did not match the Lambda runtime's architecture. After adding a compatible architecture (ARM_64 or X86_64), both configurations worked. GitHub link

langchain_lambda_layer = _alambda.PythonLayerVersion(
    self,
    "langchain-lambda-layer",
    entry="./aws_bedrock_langchain_python_cdk/lambda/layer/langchain_latest/",
    compatible_architectures=[_lambda.Architecture.ARM_64],
    compatible_runtimes=[_lambda.Runtime.PYTHON_3_11],
)
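
For reference, here is a minimal, untested sketch of the layer and function paired on the same architecture, reusing the paths and names from the question (the IAM role and other options are omitted for brevity). The key point is that compatible_architectures on the layer matches architecture on the function, so the wheels bundled into the layer (numpy included) are built for the runtime that actually loads them.

from aws_cdk import Duration, aws_lambda as _lambda
from aws_cdk import aws_lambda_python_alpha as _alambda

# Layer bundled for ARM_64 so native wheels (e.g. numpy) match the Lambda runtime
langchain_lambda_layer = _alambda.PythonLayerVersion(
    self,
    "langchain-lambda-layer",
    entry="./aws_bedrock_langchain_python_cdk/lambda/layer/langchain_latest/",
    compatible_runtimes=[_lambda.Runtime.PYTHON_3_11],
    compatible_architectures=[_lambda.Architecture.ARM_64],
)

# Function declares the same architecture as the layer it consumes
langchain_bedrock_example_lambda = _lambda.Function(
    self,
    "langchain-bedrock-example-lambda",
    handler="index.lambda_handler",
    code=_lambda.Code.from_asset(
        "./aws_bedrock_langchain_python_cdk/lambda/code/langchain_example/"
    ),
    runtime=_lambda.Runtime.PYTHON_3_11,
    architecture=_lambda.Architecture.ARM_64,  # must match the layer
    layers=[langchain_lambda_layer],
    timeout=Duration.seconds(300),
    memory_size=1024,
)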