@sjwaight
Last active December 27, 2018 01:22
Bash script to migrate static web site from AWS S3 to Azure Storage Static Websites
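The script takes six positional arguments: three for Azure, two AWS credentials, and the source bucket. Assuming it is saved as `migrate.sh` (filename and all values below are hypothetical), an invocation from Azure Cloud Shell might look like:

```shell
# Args: <resource-group> <azure-region> <storage-account> \
#       <aws-access-key-id> <aws-secret-access-key> <s3-bucket>
bash migrate.sh my-migration-rg westus2 mystaticsite1234 \
    AKIAEXAMPLEKEYID wJalrEXAMPLESECRETKEY my-s3-website-bucket
```

The storage account name must be globally unique and 3-24 lowercase letters/digits.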
#!/bin/bash
resourcegroup=$1     # Azure resource group to create
demolocation=$2      # Azure region (e.g. westus2)
storageacctname=$3   # globally unique storage account name
#####
# Setup AWS
#####
# Set up AWS credentials in the environment
export AWS_ACCESS_KEY_ID=$4
export AWS_SECRET_ACCESS_KEY=$5
s3bucket=$6          # source S3 bucket name
# Install the AWS CLI for the current user only (a global install fails in Cloud Shell)
pip install awscli --user
# Reload the profile so the PATH change that exposes the 'aws' command takes effect
source ~/.profile
#####
# Setup Azure
#####
# Required as of December 2018: the static website feature has only just reached GA
# and still lives in the storage-preview extension
az extension add --name storage-preview
# create resource group
az group create --name "$resourcegroup" --location "$demolocation"
# create storage account
az storage account create --resource-group "$resourcegroup" --name "$storageacctname" --location "$demolocation" --sku Standard_LRS --https-only true
# Convert the account to General Purpose V2, which supports the static website feature
acctmetadata=$(az storage account update --resource-group "$resourcegroup" --name "$storageacctname" --set kind=StorageV2)
# Enable the static website feature - point the index and 404 documents at the
# same file (useful for Angular / SPA-style apps)
# Retry up to 3 times, as the call can fail while the account is still upgrading to V2
n=0
until [ "$n" -ge 3 ]
do
    az storage blob service-properties update --account-name "$storageacctname" --static-website --404-document index.html --index-document index.html && break
    n=$((n+1))
    sleep 10
done
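The retry-until-success pattern above can be exercised in isolation. A minimal sketch, using a hypothetical `flaky` function that fails twice before succeeding in place of the `az` call:

```shell
#!/bin/bash
# Simulate a command that fails on its first two calls and succeeds on the third,
# to exercise the retry loop used in the script.
attempts=0
flaky() {
    attempts=$((attempts+1))
    [ "$attempts" -ge 3 ]   # exit status 0 (success) only from the 3rd call onward
}

n=0
until [ "$n" -ge 5 ]
do
    flaky && break          # stop retrying as soon as the command succeeds
    n=$((n+1))
    sleep 0                 # the real script sleeps 10s between attempts
done
echo "succeeded after $attempts attempts"   # prints: succeeded after 3 attempts
```

Note that if every attempt fails, the loop simply exhausts its retries and the script carries on; a production version might `exit 1` after the loop if the feature is still disabled.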
#####
# Copy content from AWS
#####
aws s3 cp "s3://$s3bucket" clouddrive/website/ --recursive
#####
# Copy content to Azure Storage
#####
az storage blob upload-batch -s 'clouddrive/website' --destination '$web' --account-name "$storageacctname"
# Print the static website URL for the user
echo "Visit your website at the following URL (you can click it!):"
echo "$acctmetadata" | python3 -c "import sys, json; print(json.load(sys.stdin)['primaryEndpoints']['web'])"
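The last line uses a one-line Python program to pull the static-website endpoint out of the JSON that `az storage account update` returned earlier. It can be tried standalone against a sample payload (the URL below is hypothetical):

```shell
# Sample JSON shaped like the relevant part of `az storage account update` output
acctmetadata='{"primaryEndpoints": {"web": "https://example.z22.web.core.windows.net/"}}'
# Extract the 'web' endpoint from the nested JSON via a Python one-liner
weburl=$(echo "$acctmetadata" | python3 -c "import sys, json; print(json.load(sys.stdin)['primaryEndpoints']['web'])")
echo "$weburl"   # prints: https://example.z22.web.core.windows.net/
```

An alternative in Cloud Shell is `jq -r .primaryEndpoints.web`, but the Python approach avoids any dependency beyond python3, which Cloud Shell ships with.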