I have a setup where I use CodeCommit as the repository for my Lambda functions and CodePipeline with AWS SAM to build and deploy them.
I would like to deploy the Lambda functions into different environments such as QA, staging, and prod. I have used AWS Systems Manager Parameter Store to reference my variables.
Below is my template.yaml file, which creates a Lambda function and uses Parameter Store to reference the variables:
AWSTemplateFormatVersion: '2010-09-09'
Transform: AWS::Serverless-2016-10-31
Description: Test
Parameters:
  BucketName:
    Description: 'Required. Bucket Name'
    Type: 'AWS::SSM::Parameter::Value<String>'
    Default: 'MyBucketname'
  CSVPath:
    Description: 'Required. Configkey Name'
    Type: 'AWS::SSM::Parameter::Value<String>'
    Default: 'MyCSVPath'
Resources:
  GetOrdersFunction:
    Type: AWS::Serverless::Function
    Properties:
      CodeUri: ./LambdaCode
      Handler: app.lambda_handler
      FunctionName: app
      Runtime: python3.6
      Description: 'staging'
      Environment:
        Variables:
          BucketName: !Ref BucketName
          CSVPath: !Ref CSVPath
      Events:
        HelloWorld:
          Type: Api
          Properties:
            Path: /orders
            Method: get
I am able to define variables in my template.yaml for deployment, but I am not sure how to define them for different environments (prod or QA).
When the pipeline triggers, it should deploy to the QA environment using QA variables and to prod using prod variables, both defined in Parameter Store.
What changes should I make in my template.yaml file to enable deploying to different environments?
As Meir has mentioned, you can use the Parameters and Mappings (and, if needed, Conditions) functionality in CloudFormation to do that. For example, you would add a parameter section as follows:
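One way to sketch such a parameter section (the parameter name `Stage` and its allowed values are illustrative; adjust them to your environments):

```yaml
Parameters:
  Stage:
    Type: String
    Description: The deployment stage (environment) for this stack
    AllowedValues:
      - qa
      - staging
      - prod
    Default: qa
```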
Then add a Mappings section with a map that holds the environment variables for all of your stages:
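A minimal sketch of such a Mappings section; the map name `StagesMap` and all of the bucket/path values are placeholders (you could equally store the names of your SSM parameters here instead of literal values):

```yaml
Mappings:
  StagesMap:
    qa:
      BucketName: my-qa-bucket      # hypothetical values
      CSVPath: qa/orders.csv
    staging:
      BucketName: my-staging-bucket
      CSVPath: staging/orders.csv
    prod:
      BucketName: my-prod-bucket
      CSVPath: prod/orders.csv
```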
Then your function can pick up the variables for whichever environment you are deploying to, using `Fn::FindInMap` keyed on the stage parameter:
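For example (assuming a `Stage` parameter and a `StagesMap` mapping as described above; the `!Sub` on `FunctionName` avoids name clashes when the same template is deployed to several stages in one account):

```yaml
Resources:
  GetOrdersFunction:
    Type: AWS::Serverless::Function
    Properties:
      CodeUri: ./LambdaCode
      Handler: app.lambda_handler
      FunctionName: !Sub 'app-${Stage}'
      Runtime: python3.6
      Environment:
        Variables:
          BucketName: !FindInMap [StagesMap, !Ref Stage, BucketName]
          CSVPath: !FindInMap [StagesMap, !Ref Stage, CSVPath]
```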
Now, when you call the sam deploy command, you need to specify which stage you are deploying to, e.g.:
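For example (stack name and template file name are illustrative; the key point is passing the stage via `--parameter-overrides`):

```shell
sam deploy \
  --template-file packaged.yaml \
  --stack-name orders-qa \
  --capabilities CAPABILITY_IAM \
  --parameter-overrides Stage=qa
```

Your pipeline's deploy actions would run the same command once per environment, overriding `Stage` with `qa`, `staging`, or `prod` as appropriate.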
Your complete CloudFormation template should look like this:
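Putting the pieces together, a complete sketch might look like this (mapping values and the function name suffix are placeholders to adapt to your setup):

```yaml
AWSTemplateFormatVersion: '2010-09-09'
Transform: AWS::Serverless-2016-10-31
Description: Test
Parameters:
  Stage:
    Type: String
    Description: The deployment stage (environment) for this stack
    AllowedValues: [qa, staging, prod]
    Default: qa
Mappings:
  StagesMap:
    qa:
      BucketName: my-qa-bucket      # hypothetical values
      CSVPath: qa/orders.csv
    staging:
      BucketName: my-staging-bucket
      CSVPath: staging/orders.csv
    prod:
      BucketName: my-prod-bucket
      CSVPath: prod/orders.csv
Resources:
  GetOrdersFunction:
    Type: AWS::Serverless::Function
    Properties:
      CodeUri: ./LambdaCode
      Handler: app.lambda_handler
      FunctionName: !Sub 'app-${Stage}'
      Runtime: python3.6
      Environment:
        Variables:
          BucketName: !FindInMap [StagesMap, !Ref Stage, BucketName]
          CSVPath: !FindInMap [StagesMap, !Ref Stage, CSVPath]
      Events:
        HelloWorld:
          Type: Api
          Properties:
            Path: /orders
            Method: get
```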