@RobinBoers
Created August 2, 2023 19:25
Simple script to migrate the contents of an AWS S3 bucket to a bunny.net Storage Zone.
#!/bin/sh
# Script to copy files from an AWS S3 bucket to Bunny(.net) Storage.
# This script assumes the `aws` CLI is installed and configured with
# credentials for the bucket, and that all files are at the top level
# of the bucket and/or zone.
# Usage:
# ./copy.sh <BUCKET> <ZONE> <PASSWORD> [FLAGS]
# Arguments:
# $1: S3 bucket ID
# The part after `s3://`
# Example: qdentity-demo
#
# $2: Bunny Storage Zone ID
# Example: qdentity-demo
#
# $3: Bunny Storage Zone password
# Can be found in Storage -> [Your zone] -> FTP & API Access -> Password -> Password
# Flags:
# -d: delete temporary directory
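#
# Example invocation (a sketch; the zone and bucket names reuse the example
# IDs above, and <storage-zone-password> is a placeholder for your own key):
#   ./copy.sh qdentity-demo qdentity-demo <storage-zone-password> -d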
# Download the bucket contents into a temporary working directory.
mkdir -p temp
cd temp || exit 1
aws s3 sync "s3://$1" ./
# Upload every downloaded file to the Bunny Storage Zone, skipping this
# script itself if it happens to be in the directory.
for file in *
do
  if [ "$file" != "copy.sh" ]; then
    echo "Uploading $file to https://storage.bunnycdn.com/$2/$file"
    curl --request PUT \
      --url "https://storage.bunnycdn.com/$2/$file" \
      --header "AccessKey: $3" \
      --header 'content-type: application/octet-stream' \
      --data-binary "@$file"
    echo ""
  fi
done
cd ..

# Parse flags, which come after the three positional arguments.
shift 3
while getopts d flag
do
  case "${flag}" in
    d) rm -rf temp ;;
  esac
done