Upload lots of files to S3 bucket without hitting cli limits

Shell script for uploading files to S3 bucket

Got a lot of files to upload to an S3 bucket? This little script uploads them one file at a time, so you never hit the shell's command-line argument limits.

Usage

./upload-files-to-s3.sh <bucket-name> <directory-with-files-to-upload> <aws-profile>

The <aws-profile> argument is optional and defaults to the default profile.
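
For example, to upload everything under ./website-assets to a bucket named example-bucket using the staging profile (the bucket, directory, and profile names here are just placeholders):

./upload-files-to-s3.sh example-bucket ./website-assets staging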

Requirements

  • AWS profile configured on the system running the script
  • aws command line tools
  • openssl, to calculate the MD5 digest used for upload verification
#!/usr/bin/env bash
BUCKET_NAME=$1
DIRECTORY_TO_UPLOAD=$2
AWS_PROFILE=${3:-default}

# Upload a single file, attaching its MD5 digest as object metadata.
function upload_to_s3() {
    FILE=$1
    # Strip the leading "./" that find prefixes to every path.
    S3_FILE=$(echo "$FILE" | cut -c 3-)
    CHECKSUM=$(openssl md5 -binary "$FILE" | base64)
    aws --profile "$AWS_PROFILE" s3 cp "$FILE" "s3://$BUCKET_NAME/$S3_FILE" --metadata md5="$CHECKSUM"
}

cd "$DIRECTORY_TO_UPLOAD" || exit 1

# Export the function and the variables it reads so they are visible in
# the bash subshell that find spawns for each file.
export -f upload_to_s3
export BUCKET_NAME AWS_PROFILE

# Pass each path as a positional argument instead of splicing {} into the
# command string, so filenames containing quotes or spaces are handled safely.
find . -type f -exec bash -c 'upload_to_s3 "$1"' _ {} \;
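
Once the script finishes, you can spot-check an object against the checksum it stored. This is a sketch with placeholder bucket and key names; head-object returns the object's user metadata, including the md5 key the script sets:

# Recompute the base64-encoded binary MD5 locally
openssl md5 -binary path/to/file.txt | base64

# Read back the object's metadata and compare its "md5" value
aws s3api head-object --bucket example-bucket --key path/to/file.txt --profile default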
@developerdino (Author)

Or you can just use the recursive option of the aws cli cp command:

aws s3 cp . s3://$BUCKET_NAME/ --recursive

It's considerably faster than my script.
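
For completeness, a sketch of that one-liner taking the same bucket, directory, and profile arguments as the script above (variable names assumed from the script):

aws --profile "$AWS_PROFILE" s3 cp "$DIRECTORY_TO_UPLOAD" "s3://$BUCKET_NAME/" --recursive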
