Automating AWS CloudFormation stack create/update using Bitbucket Pipelines
Continuous Integration and Continuous Delivery (CI/CD) is one of the best practices in the DevOps space for delivering code changes frequently and reliably. This post explains how to set up a Bitbucket CI/CD pipeline to deliver infrastructure code (AWS CloudFormation) changes to DEV, TEST and PROD AWS environments consistently and repeatably.
Cloud-native applications deployed in AWS are configured to use cloud resources such as SSM Parameter Store, SNS, SQS, S3, EFS, RDS and DynamoDB for configuration management, integration and storage. Hence, application source code changes often result in infrastructure code changes that need to be managed properly.
Bitbucket (https://bitbucket.org/) is a widely used Git-based code hosting and collaboration tool. Organisations host both application source code and infrastructure code in Bitbucket repositories. Bitbucket provides CI/CD pipelines that allow teams to promote both infrastructure code and source code changes to different environments with ease. This post focuses on how to promote infrastructure code changes to the AWS cloud. AWS provides CloudFormation, a powerful Infrastructure as Code tool to provision AWS resources.
Let’s consider a scenario where a development team creates and maintains a component, test-adapter-service, which needs to be deployed on the serverless compute engine AWS ECS Fargate, with its service configuration stored in SSM Parameter Store. The DevOps team creates and maintains a CloudFormation template (test-adapter-service.yml) to deploy the service in AWS Fargate and to create/update the configuration in SSM Parameter Store. This test-adapter-service changes frequently, and the changes need to be delivered automatically to the DEV, TEST and PROD AWS accounts. All code is hosted in Bitbucket.
The best solution for the above case is to set up a Bitbucket pipeline to deliver CloudFormation template changes (test-adapter-service.yml) consistently across all the environments. Bitbucket provides a CloudFormation deployment pipe (https://bitbucket.org/atlassian/aws-cloudformation-deploy) for this.
A CloudFormation template can be customised using parameters so that the same template can be reused across all the environments, and as a best practice it is recommended to create a separate parameter file for each environment. In this case, we need to create parameter files as below for the CloudFormation template, test-adapter-service.yml.
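For example, one file per environment (parameters_dev.json is the name used later in this post; the TEST and PROD file names follow the same convention and are assumptions):

```
parameters_dev.json    # parameters for the DEV account
parameters_test.json   # parameters for the TEST account
parameters_prod.json   # parameters for the PROD account
```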
A parameter file contains key-value data in JSON format. Sensitive values can be stored in Bitbucket repository variables and referenced in the parameter file.
Here is a sample parameter file for the DEV environment. ClientIdValue_dev and ClientSecretValue_dev are defined as Bitbucket repository variables.
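A minimal sketch of parameters_dev.json, using the ParameterKey/ParameterValue JSON format that CloudFormation stack parameters expect. Only the repository variable names ClientIdValue_dev and ClientSecretValue_dev come from this post; the Environment, ClientId and ClientSecret parameter names are illustrative assumptions:

```json
[
  { "ParameterKey": "Environment",  "ParameterValue": "dev" },
  { "ParameterKey": "ClientId",     "ParameterValue": "${ClientIdValue_dev}" },
  { "ParameterKey": "ClientSecret", "ParameterValue": "${ClientSecretValue_dev}" }
]
```

Note that a plain `cat` of this file leaves the `${...}` placeholders literal; resolving the repository variables would need an extra substitution step in the pipeline (for example `export PARAMETERS=$(envsubst < ./parameters_dev.json)`), which is an assumption here.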
Here is the Bitbucket pipeline configuration to deliver CloudFormation template changes (test-adapter-service.yml) to the DEV, TEST and PROD environments. This snippet needs to be added to bitbucket-pipelines.yml.
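A sketch of the three deployment steps using the atlassian/aws-cloudformation-deploy pipe; the pipe version, region, branch pattern and per-environment repository variable names (such as AWS_ACCESS_KEY_ID_DEV) are assumptions:

```yaml
pipelines:
  branches:
    release/*:
      - step:
          name: Deploy to DEV
          script:
            - export PARAMETERS=$(cat ./parameters_dev.json)
            - pipe: atlassian/aws-cloudformation-deploy:0.10.0
              variables:
                AWS_ACCESS_KEY_ID: $AWS_ACCESS_KEY_ID_DEV
                AWS_SECRET_ACCESS_KEY: $AWS_SECRET_ACCESS_KEY_DEV
                AWS_DEFAULT_REGION: 'ap-southeast-2'
                STACK_NAME: 'test-adapter-service'
                TEMPLATE: 'test-adapter-service.yml'
                CAPABILITIES: ['CAPABILITY_IAM']
                WAIT: 'true'
                STACK_PARAMETERS: $PARAMETERS
      - step:
          name: Deploy to TEST
          trigger: manual
          script:
            - export PARAMETERS=$(cat ./parameters_test.json)
            - pipe: atlassian/aws-cloudformation-deploy:0.10.0
              variables:
                AWS_ACCESS_KEY_ID: $AWS_ACCESS_KEY_ID_TEST
                AWS_SECRET_ACCESS_KEY: $AWS_SECRET_ACCESS_KEY_TEST
                AWS_DEFAULT_REGION: 'ap-southeast-2'
                STACK_NAME: 'test-adapter-service'
                TEMPLATE: 'test-adapter-service.yml'
                CAPABILITIES: ['CAPABILITY_IAM']
                WAIT: 'true'
                STACK_PARAMETERS: $PARAMETERS
      - step:
          name: Deploy to PROD
          trigger: manual
          script:
            - export PARAMETERS=$(cat ./parameters_prod.json)
            - pipe: atlassian/aws-cloudformation-deploy:0.10.0
              variables:
                AWS_ACCESS_KEY_ID: $AWS_ACCESS_KEY_ID_PROD
                AWS_SECRET_ACCESS_KEY: $AWS_SECRET_ACCESS_KEY_PROD
                AWS_DEFAULT_REGION: 'ap-southeast-2'
                STACK_NAME: 'test-adapter-service'
                TEMPLATE: 'test-adapter-service.yml'
                CAPABILITIES: ['CAPABILITY_IAM']
                WAIT: 'true'
                STACK_PARAMETERS: $PARAMETERS
```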
- It contains three steps. The first deploys to the DEV environment automatically, the second to the TEST environment using a manual trigger, and the third to the PROD environment using a manual trigger.
- You can add the above to the Bitbucket pipeline and configure it to execute on the release branch automatically or manually.
- First, the parameter file is read into an environment variable [export PARAMETERS=$(cat ./parameters_dev.json)] to set the STACK_PARAMETERS variable, which supports only the JSON string type.
- The AWS access key and secret key values are populated from Bitbucket repository variables.
- WAIT: 'true' is set to wait until the AWS resources defined in the CloudFormation template are successfully created/updated.
- If the CloudFormation template includes resources that can affect permissions in your AWS account, the right value for CAPABILITIES needs to be set (for example, CAPABILITY_IAM or CAPABILITY_NAMED_IAM).
Here is a sample CloudFormation template to deploy test-adapter-service to ECS Fargate.
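An illustrative skeleton of test-adapter-service.yml, assuming the container image lives in ECR and that the ECS cluster, subnets and security group already exist and are passed in as parameters; all resource names, sizes and values below are assumptions, not the post's exact template:

```yaml
AWSTemplateFormatVersion: '2010-09-09'
Description: test-adapter-service on ECS Fargate with SSM configuration
Parameters:
  Environment:
    Type: String
    AllowedValues: [dev, test, prod]
  ClientId:
    Type: String
    NoEcho: true
  ClientSecret:
    Type: String
    NoEcho: true
  ClusterName:
    Type: String
  SubnetIds:
    Type: List<AWS::EC2::Subnet::Id>
  SecurityGroupId:
    Type: AWS::EC2::SecurityGroup::Id
Resources:
  # CloudFormation's AWS::SSM::Parameter resource does not support
  # SecureString, so these are created as plain String parameters.
  ClientIdParameter:
    Type: AWS::SSM::Parameter
    Properties:
      Name: !Sub '/test-adapter-service/${Environment}/client-id'
      Type: String
      Value: !Ref ClientId
  ClientSecretParameter:
    Type: AWS::SSM::Parameter
    Properties:
      Name: !Sub '/test-adapter-service/${Environment}/client-secret'
      Type: String
      Value: !Ref ClientSecret
  # Execution role so Fargate can pull the image and write logs.
  ExecutionRole:
    Type: AWS::IAM::Role
    Properties:
      AssumeRolePolicyDocument:
        Version: '2012-10-17'
        Statement:
          - Effect: Allow
            Principal: { Service: ecs-tasks.amazonaws.com }
            Action: sts:AssumeRole
      ManagedPolicyArns:
        - arn:aws:iam::aws:policy/service-role/AmazonECSTaskExecutionRolePolicy
  TaskDefinition:
    Type: AWS::ECS::TaskDefinition
    Properties:
      Family: !Sub 'test-adapter-service-${Environment}'
      Cpu: '256'
      Memory: '512'
      NetworkMode: awsvpc
      RequiresCompatibilities: [FARGATE]
      ExecutionRoleArn: !GetAtt ExecutionRole.Arn
      ContainerDefinitions:
        - Name: test-adapter-service
          Image: !Sub '${AWS::AccountId}.dkr.ecr.${AWS::Region}.amazonaws.com/test-adapter-service:latest'
          PortMappings:
            - ContainerPort: 8080
  Service:
    Type: AWS::ECS::Service
    Properties:
      ServiceName: !Sub 'test-adapter-service-${Environment}'
      Cluster: !Ref ClusterName
      LaunchType: FARGATE
      DesiredCount: 1
      TaskDefinition: !Ref TaskDefinition
      NetworkConfiguration:
        AwsvpcConfiguration:
          Subnets: !Ref SubnetIds
          SecurityGroups: [!Ref SecurityGroupId]
```

Because a template like this creates an IAM role, the pipeline step that deploys it must set CAPABILITIES accordingly (for example CAPABILITY_IAM), as noted above.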
This post provided best practices, configuration steps and samples to deploy AWS CloudFormation changes across different environments using Bitbucket Pipelines.