Command line utility to sync the current directory with an S3 bucket, compressing files and setting the appropriate metadata as it goes...
#!/bin/bash
# Usage: gzsync.sh <s3-bucket-url>
BUCKET=$1
TMP_DIR=$(mktemp -d -t gzsync)

# Make sure the bucket URL ends with a trailing slash
[[ $BUCKET != */ ]] && BUCKET="$BUCKET"/

printf "Copying files to temporary directory... "
cp -R . "$TMP_DIR" && cd "$TMP_DIR"
printf "Done\n"

if [ "$(pwd)" = "$TMP_DIR" ]
then
    # Walk every file, skipping git internals and .DS_Store
    for filename in $(find . -type f | sed "s/\.\///g" | grep -v git | grep -v .DS_Store)
    do
        printf "%s... " "${filename}"
        # If the file isn't already gzipped, compress it in place
        gzip -t "${filename}" > /dev/null 2>&1 || { gzip -9 "${filename}"; mv "${filename}.gz" "${filename}"; }
        # Upload with a Content-Encoding header so clients decompress transparently
        s3cmd put "${filename}" "${BUCKET}${filename}" --add-header "Content-Encoding: gzip" --acl-public > /dev/null 2>&1
        printf "Done\n"
    done
    printf "Cleaning up... "
    cd && rm -rf "${TMP_DIR}"
    printf "Done\n"
else
    echo "ERROR: Failed to change to temporary directory, exiting" 1>&2
fi
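
Invocation is roughly as follows; the script filename and bucket name below are placeholders, since the gist doesn't specify them. The trailing slash is optional because the script appends one if it's missing, and s3cmd needs to be configured beforehand (s3cmd --configure) with credentials that can write to the bucket.

    ./gzsync.sh s3://my-site-bucket/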
@braxtone

This is really slick! It was a huge help in getting compression going on my Jekyll site. Thanks!

I posted a forked version where I added the ability to exclude files from compression based on a file extension: https://gist.github.com/braxtone/43405ecdfc86f4ed4e6c
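
For readers who don't want to dig through the fork, the exclusion braxtone describes presumably amounts to a check inside the for loop along these lines. This is only a sketch; the EXCLUDE_EXTENSIONS name and the extension list are illustrative, not taken from the forked gist.

    # Illustrative only: skip compression for formats that are already compressed,
    # but still upload them without the Content-Encoding header.
    EXCLUDE_EXTENSIONS="jpg|jpeg|png|gif"
    if [[ ${filename##*.} =~ ^(${EXCLUDE_EXTENSIONS})$ ]]
    then
        s3cmd put "${filename}" "${BUCKET}${filename}" --acl-public > /dev/null 2>&1
    else
        gzip -9 "${filename}" && mv "${filename}.gz" "${filename}"
        s3cmd put "${filename}" "${BUCKET}${filename}" --add-header "Content-Encoding: gzip" --acl-public > /dev/null 2>&1
    fi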
