@ritik-malik
Created April 3, 2024 09:03
A simple bash script to copy the contents of one DynamoDB table into another DynamoDB table
#!/bin/bash
######################################
# This script copies the entire contents of the INPUT DynamoDB table and
# pastes them into the OUTPUT DynamoDB table.
# (Make sure the output table already exists with the same partition key.)
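# For example (an illustration added here, not part of the original gist), an
# output table with a matching string partition key named "id" could be
# created with:
#   aws dynamodb create-table --table-name <OUTPUT TABLE NAME> \
#     --attribute-definitions AttributeName=id,AttributeType=S \
#     --key-schema AttributeName=id,KeyType=HASH \
#     --billing-mode PAY_PER_REQUEST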
#
# It also works across different AWS accounts; just make sure that you have
# the correct access permissions and paste the credentials one by one into
# the fields below.
#
# DO NOT change the variable names, just paste the access keys & tokens
# individually from the AWS console.
# If you want to copy a DDB table within the same account, simply reuse the
# same keys in the fields for the 2nd account.
#
# Credits: https://stackoverflow.com/a/76175013/14076245
######################################
###### SET THE GLOBAL VARS HERE ######
# AWS ACCOUNT 1 ACCESS KEYS & Table Name from which you want to copy:
AWS_ACCESS_KEY_ID_1="<ADD KEY HERE>"
AWS_SECRET_ACCESS_KEY_1="<ADD KEY HERE>"
AWS_SESSION_TOKEN_1="<ADD KEY HERE>"
INPUT_TABLE_NAME="<ADD TABLE NAME HERE>"
# AWS ACCOUNT 2 ACCESS KEYS & Table Name to which you want to paste:
AWS_ACCESS_KEY_ID_2="<ADD KEY HERE>"
AWS_SECRET_ACCESS_KEY_2="<ADD KEY HERE>"
AWS_SESSION_TOKEN_2="<ADD KEY HERE>"
OUTPUT_TABLE_NAME="<ADD TABLE NAME HERE>"
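# (Note: the AWS_SESSION_TOKEN values are only needed for temporary
# credentials such as STS or SSO sessions; for long-lived IAM user keys you
# can delete the corresponding "export AWS_SESSION_TOKEN=..." lines below.)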
######################################
# Global vars over, automation begins
# exit on error, unset variable, or pipeline failure
set -euo pipefail
# Export AWS Keys for Account 1
export AWS_ACCESS_KEY_ID="${AWS_ACCESS_KEY_ID_1}"
export AWS_SECRET_ACCESS_KEY="${AWS_SECRET_ACCESS_KEY_1}"
export AWS_SESSION_TOKEN="${AWS_SESSION_TOKEN_1}"
# read from input table, modify the json and store locally
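# (the AWS CLI paginates scan automatically, so tables larger than the 1 MB
# per-request scan limit are still dumped in full)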
aws dynamodb scan --table-name "${INPUT_TABLE_NAME}" --output json | jq '[ .Items[] | { PutRequest: { Item: . } } ]' > "${INPUT_TABLE_NAME}-dump.json"
echo "Downloaded input table values successfully..."
# get table size (used for batching)
table_size="$(jq 'length' "${INPUT_TABLE_NAME}-dump.json")"
echo "Table size = ${table_size} items"
echo -e "\nUploading the dump to the output table..."
# Export AWS Keys for Account 2
export AWS_ACCESS_KEY_ID="${AWS_ACCESS_KEY_ID_2}"
export AWS_SECRET_ACCESS_KEY="${AWS_SECRET_ACCESS_KEY_2}"
export AWS_SESSION_TOKEN="${AWS_SESSION_TOKEN_2}"
# write to output table in batches of 25
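# (25 is a hard DynamoDB limit: batch-write-item accepts at most 25 put
# requests per call)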
for i in $(seq 0 25 $(( table_size - 1 ))); do
    j=$(( i + 25 ))
    # build one batch payload; jq clamps the .[i:j] slice to the array length
    jq -c --arg table "${OUTPUT_TABLE_NAME}" --argjson i "$i" --argjson j "$j" \
        '{ ($table): .[$i:$j] }' "${INPUT_TABLE_NAME}-dump.json" > "${OUTPUT_TABLE_NAME}-batch-payload.json"
    echo "Loading records ${i} through $(( j - 1 )) (of ${table_size} total) into ${OUTPUT_TABLE_NAME}"
    # capture the response so items DynamoDB could not write (reported under
    # "UnprocessedItems" rather than as an error) are not silently lost
    unprocessed="$(aws dynamodb batch-write-item \
        --request-items file://"${OUTPUT_TABLE_NAME}-batch-payload.json" \
        --query 'UnprocessedItems' --output json)"
    if [ "${unprocessed}" != "{}" ]; then
        echo "WARNING: some items were not written, retry them: ${unprocessed}" >&2
    fi
    rm "${OUTPUT_TABLE_NAME}-batch-payload.json"
done
echo -e "\nLoaded all records from ${INPUT_TABLE_NAME} to ${OUTPUT_TABLE_NAME} successfully..."
# clean up
rm "${INPUT_TABLE_NAME}-dump.json"