Serverless Batch Computing with AWS Batch and AWS Fargate

Nanthan Rasiah
May 23, 2021

AWS Batch lets you run batch computing workloads on the AWS cloud across Amazon EC2, AWS Fargate, and Spot Instances. It is a fully managed service that eases the burden of provisioning and managing a complex batch environment. AWS Fargate is a serverless compute environment for containers, so AWS Batch on AWS Fargate offers the luxury of running batch workloads on a fully managed container environment. The purpose of this post is to explain how to automate the provisioning of an AWS Batch environment on AWS Fargate using AWS CloudFormation, and how to continuously deploy a service to that environment using a Bitbucket pipeline.

Let’s assume your company is building a machine learning model to detect cryptocurrency fraud. Each night, the company receives cryptocurrency fraud data in a number of formats, which must be cleaned and transformed into an appropriate format for model training and testing. You are tasked with creating a Python microservice that runs nightly to clean and transform this data. The service needs to be deployed on AWS Batch and scheduled to run the transformation for each data type at a different time.

Since the problem requires transforming different data types, it is best practice to create a Python script for each transformation. First, we need to create a microservice containing the transformation logic and scripts. The details of the transformation logic and scripts are out of scope for this post; here we focus mainly on building the Docker image and deploying the service.

Let’s build the Docker image using the following Dockerfile. Here, ENTRYPOINT is defined as python3 and the default command is --version, so starting the container with no arguments simply prints the Python version.

# Python runtime: the Python base image provided by AWS for Lambda
FROM amazon/aws-lambda-python
# Dependent libraries are declared in requirements.txt
COPY requirements.txt /tmp/
# Install all the dependencies listed in the file
RUN pip3 install --no-cache-dir -r /tmp/requirements.txt
# Change the working directory to /app
WORKDIR /app
# Copy all project files into the /app directory
COPY . /app
# Define the entry point as python3
ENTRYPOINT ["python3"]
# Default command: print the Python version
CMD ["--version"]

Now the Docker image is ready. Let’s see how to automate the provisioning of the AWS Batch resources that run a container from this image on AWS Fargate, using AWS CloudFormation.

First, we need to create a compute environment using the following AWS CloudFormation snippet.
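
A minimal sketch of a managed Fargate compute environment might look like this (the resource name, MaxvCpus value, and the SubnetIds and SecurityGroupIds parameters are assumptions you should adapt to your VPC):

ComputeEnvironment:
  Type: AWS::Batch::ComputeEnvironment
  Properties:
    ComputeEnvironmentName: transform-compute-env
    Type: MANAGED
    State: ENABLED
    ComputeResources:
      Type: FARGATE
      MaxvCpus: 4
      Subnets: !Ref SubnetIds                  # List<AWS::EC2::Subnet::Id> parameter
      SecurityGroupIds: !Ref SecurityGroupIds  # List<AWS::EC2::SecurityGroup::Id> parameter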

Second, we need to create a job queue, as below.
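
A minimal job queue sketch, wired to the compute environment above (the name and priority are assumptions):

JobQueue:
  Type: AWS::Batch::JobQueue
  Properties:
    JobQueueName: transform-job-queue
    State: ENABLED
    Priority: 1
    ComputeEnvironmentOrder:
      - Order: 1
        ComputeEnvironment: !Ref ComputeEnvironment  # Ref returns the ARN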

Finally, we need to create the job definitions, as below. Let’s assume the service has two transformation scripts, transformA.py and transformB.py. In that case, we need two job definitions: one for transformA.py and the other for transformB.py.
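
Here is a sketch of the job definition for transformA.py (the ImageUri parameter, resource sizes, and role names are assumptions; the transformB.py definition is identical apart from its name and Command):

TransformAJobDefinition:
  Type: AWS::Batch::JobDefinition
  Properties:
    JobDefinitionName: transform-a
    Type: container
    PlatformCapabilities:
      - FARGATE
    ContainerProperties:
      Image: !Ref ImageUri           # template parameter holding the ECR image URI
      Command:
        - transformA.py              # handed to the python3 ENTRYPOINT
      JobRoleArn: !GetAtt JobRole.Arn
      ExecutionRoleArn: !GetAtt ExecutionRole.Arn
      NetworkConfiguration:
        AssignPublicIp: ENABLED      # lets the task pull from ECR without a NAT gateway
      ResourceRequirements:
        - Type: VCPU
          Value: "1"
        - Type: MEMORY
          Value: "2048"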

Here, the job role and execution role should be created with appropriate permissions.
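
A minimal sketch of both roles: the execution role lets the Fargate task pull the image and write logs, while the job role carries whatever permissions the transformation scripts themselves need (the S3 policy below is purely an illustrative assumption):

ExecutionRole:
  Type: AWS::IAM::Role
  Properties:
    AssumeRolePolicyDocument:
      Version: "2012-10-17"
      Statement:
        - Effect: Allow
          Principal:
            Service: ecs-tasks.amazonaws.com
          Action: sts:AssumeRole
    ManagedPolicyArns:
      - arn:aws:iam::aws:policy/service-role/AmazonECSTaskExecutionRolePolicy

JobRole:
  Type: AWS::IAM::Role
  Properties:
    AssumeRolePolicyDocument:
      Version: "2012-10-17"
      Statement:
        - Effect: Allow
          Principal:
            Service: ecs-tasks.amazonaws.com
          Action: sts:AssumeRole
    Policies:
      - PolicyName: read-fraud-data
        PolicyDocument:
          Version: "2012-10-17"
          Statement:
            - Effect: Allow
              Action:
                - s3:GetObject
                - s3:ListBucket
              Resource: "*"          # scope to your data bucket in practice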

The next step is to schedule the jobs to run at particular times. For this, we can use an Amazon CloudWatch Events rule (now Amazon EventBridge), as shown below.
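
A sketch of one schedule rule (the cron expression and names are assumptions; a second rule for transformB.py would use a different time and job definition):

TransformAScheduleRule:
  Type: AWS::Events::Rule
  Properties:
    Name: transform-a-nightly
    ScheduleExpression: cron(0 2 * * ? *)  # 02:00 UTC every night
    State: ENABLED
    Targets:
      - Id: transform-a-target
        Arn: !Ref JobQueue                 # Ref returns the job queue ARN
        RoleArn: !GetAtt EventRuleRole.Arn
        BatchParameters:
          JobDefinition: !Ref TransformAJobDefinition
          JobName: transform-a-nightly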

Here, you also need to create an event rule role with permission to submit the job.
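
A minimal sketch of that role, trusting EventBridge and allowing job submission (scope the Resource to your queue and job definition ARNs in practice):

EventRuleRole:
  Type: AWS::IAM::Role
  Properties:
    AssumeRolePolicyDocument:
      Version: "2012-10-17"
      Statement:
        - Effect: Allow
          Principal:
            Service: events.amazonaws.com
          Action: sts:AssumeRole
    Policies:
      - PolicyName: submit-batch-job
        PolicyDocument:
          Version: "2012-10-17"
          Statement:
            - Effect: Allow
              Action: batch:SubmitJob
              Resource: "*"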

Taken together, the snippets above form a complete AWS CloudFormation template defining all the resources needed to create the AWS Batch environment for container deployment on AWS Fargate. You can customise it as per your needs.

So far, we have looked at how to build the container image and create the AWS resources using AWS CloudFormation.

Now let’s see how to automate the service deployment using a Bitbucket pipeline. As the service will change often, the best approach is to automate its deployment.

Here is a sample Bitbucket pipeline template and script to deploy service changes continuously.
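
A minimal bitbucket-pipelines.yml sketch (the atlassian/pipelines-awscli image, the master branch name, and the AWS_ACCOUNT_ID, AWS_DEFAULT_REGION, and ECR_REPO repository variables are assumptions):

image: atlassian/pipelines-awscli

pipelines:
  branches:
    master:
      - step:
          name: Build Docker image and publish to ECR
          services:
            - docker
          script:
            - REGISTRY="${AWS_ACCOUNT_ID}.dkr.ecr.${AWS_DEFAULT_REGION}.amazonaws.com"
            - IMAGE="${REGISTRY}/${ECR_REPO}:${BITBUCKET_COMMIT}"
            - aws ecr get-login-password | docker login --username AWS --password-stdin "${REGISTRY}"
            - docker build -t "${IMAGE}" .
            - docker push "${IMAGE}"
      - step:
          name: Deploy to AWS Batch
          script:
            - ./deploy.sh "${BITBUCKET_COMMIT}"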

The first step in this pipeline builds the Docker image and publishes it to Amazon ECR. The second step deploys the image into the AWS Batch environment using the script deploy.sh.

All the variables used in the pipeline and script should be defined as Bitbucket repository variables.

The deploy script (deploy.sh) reads the existing stack and updates it with the new Docker image name. The IAM user assumes the role defined in ROLE_ARN.

Here is a sample deploy script, deploy.sh. This script should be included as part of the project.
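
A sketch under those assumptions (STACK_NAME and ROLE_ARN come from repository variables, and batch-environment.yml is a placeholder name for the CloudFormation template in the repo):

#!/usr/bin/env bash
set -euo pipefail

IMAGE_TAG="${1:?usage: deploy.sh <image-tag>}"
IMAGE_URI="${AWS_ACCOUNT_ID}.dkr.ecr.${AWS_DEFAULT_REGION}.amazonaws.com/${ECR_REPO}:${IMAGE_TAG}"

# Assume the deployment role defined in ROLE_ARN
read -r AWS_ACCESS_KEY_ID AWS_SECRET_ACCESS_KEY AWS_SESSION_TOKEN <<< "$(
  aws sts assume-role --role-arn "${ROLE_ARN}" --role-session-name batch-deploy \
    --query 'Credentials.[AccessKeyId,SecretAccessKey,SessionToken]' --output text)"
export AWS_ACCESS_KEY_ID AWS_SECRET_ACCESS_KEY AWS_SESSION_TOKEN

# Update the existing stack in place with the new image URI;
# --no-fail-on-empty-changeset keeps re-runs idempotent
aws cloudformation deploy \
  --stack-name "${STACK_NAME}" \
  --template-file batch-environment.yml \
  --parameter-overrides ImageUri="${IMAGE_URI}" \
  --capabilities CAPABILITY_IAM \
  --no-fail-on-empty-changeset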

This post covered how to automate AWS Fargate container deployment in an AWS Batch environment using AWS CloudFormation, and how to deploy service changes continuously with a Bitbucket pipeline. I have provided sample AWS CloudFormation templates and scripts, which you can modify as per your needs. I hope this post is useful and assists in setting up an AWS Batch on AWS Fargate environment.

