Migrating Amazon DynamoDB tables from one AWS account to another using a simple Python script

Nanthan Rasiah
3 min read · Jan 20, 2023


DynamoDB is a fully managed NoSQL database service that provides fast and predictable performance with seamless scalability. DynamoDB offers built-in security, continuous backups, automated multi-Region replication, in-memory caching, and data import and export tools. Hundreds of thousands of Amazon Web Services customers have chosen DynamoDB as their key-value and document database for mobile, web, gaming, ad tech, IoT, and other applications that need low-latency data access at any scale. AWS provides many tools to migrate DynamoDB tables from one AWS account to another.

AWS Backup, a fully managed data protection service that makes it easy to centralise and automate backups across AWS services, in the cloud, and on premises, lets you create cross-account DynamoDB backups and restore them from the DynamoDB console.
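For orientation, here is a rough Boto3 sketch of that AWS Backup flow. The vault names, table ARN, role ARN and account IDs below are placeholders, and the destination vault's access policy must allow copies from the source account:

import time

import boto3

backup = boto3.client('backup')

# Back up the source table into a vault in the source account.
# The vault name, table ARN and role ARN are placeholders.
job = backup.start_backup_job(
    BackupVaultName='source-vault',
    ResourceArn='arn:aws:dynamodb:ap-southeast-2:111111111111:table/my-table',
    IamRoleArn='arn:aws:iam::111111111111:role/backup-role'
)

# Wait for the backup job to finish before copying the recovery point.
while backup.describe_backup_job(BackupJobId=job['BackupJobId'])['State'] != 'COMPLETED':
    time.sleep(30)

# Copy the recovery point to a vault in the target account.
backup.start_copy_job(
    RecoveryPointArn=job['RecoveryPointArn'],
    SourceBackupVaultName='source-vault',
    DestinationBackupVaultArn='arn:aws:backup:ap-southeast-2:222222222222:backup-vault:target-vault',
    IamRoleArn='arn:aws:iam::111111111111:role/backup-role'
)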

You can also migrate the table data by exporting the table to an Amazon S3 bucket (in any AWS Region, owned by any account to which you have write permission) and then importing the data from the S3 bucket into a new table in the other account.
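As a minimal sketch of that path with Boto3 (the table ARN, bucket name, account ID and key schema below are assumptions; the export API also requires point-in-time recovery to be enabled on the source table):

import boto3

# In the source account: export the table to an S3 bucket the target account owns.
source_dynamodb = boto3.Session(profile_name='source').client('dynamodb')
source_dynamodb.export_table_to_point_in_time(
    TableArn='arn:aws:dynamodb:ap-southeast-2:111111111111:table/my-table',  # placeholder
    S3Bucket='my-migration-bucket',  # placeholder
    S3BucketOwner='222222222222',    # placeholder target account ID
    ExportFormat='DYNAMODB_JSON'
)

# In the target account, once the export has completed: create a new table from the files.
target_dynamodb = boto3.Session(profile_name='target').client('dynamodb')
target_dynamodb.import_table(
    S3BucketSource={'S3Bucket': 'my-migration-bucket', 'S3KeyPrefix': 'AWSDynamoDB/'},
    InputFormat='DYNAMODB_JSON',
    InputCompressionType='GZIP',  # native exports are gzipped DynamoDB JSON
    TableCreationParameters={
        'TableName': 'my-table',
        'AttributeDefinitions': [{'AttributeName': 'id', 'AttributeType': 'S'}],  # assumed key
        'KeySchema': [{'AttributeName': 'id', 'KeyType': 'HASH'}],
        'BillingMode': 'PAY_PER_REQUEST'
    }
)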

After exporting the DynamoDB table data to an S3 bucket, you can also use an AWS Glue job to read the files from the bucket and write them to the target DynamoDB table.
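The heart of such a Glue job could look like this PySpark sketch; the S3 path and target table name are placeholders, and it assumes the exported files are JSON records:

from pyspark.context import SparkContext
from awsglue.context import GlueContext

glue_context = GlueContext(SparkContext.getOrCreate())

# Read the exported files from the S3 bucket (path is a placeholder).
frame = glue_context.create_dynamic_frame.from_options(
    connection_type='s3',
    connection_options={'paths': ['s3://my-migration-bucket/exports/']},
    format='json'
)

# Write the records to the target table, throttled to half its write capacity.
glue_context.write_dynamic_frame.from_options(
    frame=frame,
    connection_type='dynamodb',
    connection_options={
        'dynamodb.output.tableName': 'my-target-table',
        'dynamodb.throughput.write.percent': '0.5'
    }
)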

AWS Data Pipeline, a web service that helps you reliably process and move data between different AWS compute and storage services as well as on-premises data sources, can also be used to export DynamoDB table data to a file in an Amazon S3 bucket and to import the exported data from the S3 bucket into the new table in the other account.

You can also use Amazon EMR to export data stored in DynamoDB to Amazon S3 and to import data stored in Amazon S3 into DynamoDB.

But a simpler way to migrate DynamoDB table data is a script built on the AWS Boto3 APIs, which is quite handy.

Recently, I was building an AWS Batch service to extract data from ServiceNow tables into an Amazon S3 bucket. The service stores the data extraction configuration for each ServiceNow table in a DynamoDB table.

We have a separate AWS account for each environment (development, test and prod), an automated CI/CD pipeline for service deployment, and CloudFormation for DynamoDB table creation. But migrating DynamoDB table data is a bit challenging. Although we had the migration tools mentioned above, we were after a simple tool that could easily be hooked into the CI/CD pipeline to migrate data automatically from test to prod.

I came up with the following Python script to copy DynamoDB table data across AWS accounts.

You just have to pass the source AWS account profile, target AWS account profile, source table name and target table name as arguments. This assumes you have set up the AWS account profiles with the right access keys.

import argparse

import boto3


def copy_item(source_account_profile, target_account_profile, source_table_name, target_table_name):
    # Create a DynamoDB client for each account from its named profile.
    source_account_session = boto3.Session(profile_name=source_account_profile)
    source_account_dynamodb = source_account_session.client('dynamodb')

    target_account_session = boto3.Session(profile_name=target_account_profile)
    target_account_dynamodb = target_account_session.client('dynamodb')

    # Paginate through a full scan of the source table so tables larger
    # than the 1 MB scan response limit are copied completely.
    dynamo_paginator = source_account_dynamodb.get_paginator('scan')
    dynamo_response = dynamo_paginator.paginate(
        TableName=source_table_name,
        Select='ALL_ATTRIBUTES',
        ReturnConsumedCapacity='NONE',
        ConsistentRead=True
    )

    # Write every scanned item to the target table.
    for page in dynamo_response:
        for item in page['Items']:
            target_account_dynamodb.put_item(
                TableName=target_table_name,
                Item=item
            )


parser = argparse.ArgumentParser()

parser.add_argument("source_account_profile", help="Source Account Profile")
parser.add_argument("target_account_profile", help="Target Account Profile")
parser.add_argument("source_table_name", help="Source Table Name")
parser.add_argument("target_table_name", help="Target Table Name")

args = vars(parser.parse_args())

copy_item(args['source_account_profile'], args['target_account_profile'], args['source_table_name'],
          args['target_table_name'])

You can execute the above script on demand or via a CI/CD pipeline as follows:

python copy_dynamodb_items.py <source_account_profile> <target_account_profile> <source_table_name> <target_table_name>
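For example, with hypothetical profile and table names:

python copy_dynamodb_items.py test-account prod-account extract-config extract-config

One design note: the script writes items one at a time with put_item, which is fine for small configuration tables like ours; for large tables you may want to batch the writes and keep an eye on the target table's write capacity.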

I hope you have similar use cases in your business and that this script will be useful.

Written by Nanthan Rasiah

Ex. AWS APN Ambassador | Architect | AWS Certified Pro | GCP Certified Pro | Azure Certified Expert | AWS Certified Security & Machine Learning Specialty
