@chrismdp
Last active March 5, 2024 12:57
Uploading to S3 in 18 lines of Shell (used to upload builds for http://soltrader.net)
#!/bin/bash
# You don't need Fog in Ruby or some other library to upload to S3 -- shell works perfectly fine
# This is how I upload my new Sol Trader builds (http://soltrader.net)
# Based on a modified script from here: http://tmont.com/blargh/2014/1/uploading-to-s3-in-bash
S3KEY="my aws key"
S3SECRET="my aws secret" # pass these in

function putS3
{
  path=$1       # local directory holding the file
  file=$2       # file name only
  aws_path=$3   # key prefix on S3, with leading and trailing slashes
  bucket='my-aws-bucket'
  date=$(date +"%a, %d %b %Y %T %z")
  acl="x-amz-acl:public-read"
  content_type='application/x-compressed-tar'
  # Signature Version 2: HMAC-SHA1 over the string to sign, base64-encoded
  string="PUT\n\n$content_type\n$date\n$acl\n/$bucket$aws_path$file"
  signature=$(echo -en "${string}" | openssl sha1 -hmac "${S3SECRET}" -binary | base64)
  curl -X PUT -T "$path/$file" \
    -H "Host: $bucket.s3.amazonaws.com" \
    -H "Date: $date" \
    -H "Content-Type: $content_type" \
    -H "$acl" \
    -H "Authorization: AWS ${S3KEY}:$signature" \
    "https://$bucket.s3.amazonaws.com$aws_path$file"
}

path=$1 # directory whose files should be uploaded
for file in "$path"/*; do
  putS3 "$path" "${file##*/}" "/path/on/s3/to/files/"
done
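The Authorization header above is AWS Signature Version 2: an HMAC-SHA1 over a newline-joined "string to sign", base64-encoded. That signing step can be exercised offline; a minimal sketch, with a made-up secret, date, and resource:

```shell
#!/bin/bash
# Stand-alone demo of the Signature Version 2 signing step used by putS3.
# All of these values are illustrative, not real credentials.
S3SECRET="example-secret"
content_type='application/x-compressed-tar'
date="Thu, 05 Mar 2024 12:57:00 +0000"
acl="x-amz-acl:public-read"
resource="/my-aws-bucket/path/on/s3/to/files/build.tar.gz"
string="PUT\n\n$content_type\n$date\n$acl\n$resource"
# HMAC-SHA1 over the expanded string, then base64: the 20 digest bytes
# always encode to 28 base64 characters.
signature=$(echo -en "$string" | openssl sha1 -hmac "$S3SECRET" -binary | base64)
echo "$signature"
```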
@Boomser13

Hi chrismdp,

I ran the same script in my environment and it returns an error:
SignatureDoesNotMatch: The request signature we calculated does not match the signature you provided. Check your key and signing method. AKI*****************LQ PUT

Can you please help me with that?
I'm also trying to write a download shell script; if you have any information about that, please let me know.

Thanks in advance.
:) 👍

@oystersauce8

bro @Boomser13, I was in the same boat: I was missing "#!/bin/bash" at the top of the script, and it started working when I added it.
(sh and bash are different, and when no shebang is specified some systems default to sh.)
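That matters here because the script relies on bash's builtin `echo -en` to turn the literal `\n` sequences into real newlines before signing; under plain sh (dash on many distros), `echo` may print `-en` literally, so a different string gets signed and S3 reports SignatureDoesNotMatch. A small demonstration using the portable `printf '%b'` instead:

```shell
# printf '%b' expands backslash escapes identically in every POSIX shell,
# so it is a safer choice than `echo -en` for building the string to sign.
string='PUT\n\ntext/plain'
bytes=$(printf '%b' "$string" | wc -c | tr -d ' ')
echo "$bytes"  # 15 bytes: "PUT" (3) + two newlines + "text/plain" (10)
```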

@nmcgann

nmcgann commented Apr 2, 2017

Updated to handle region codes and storage classes (acl and content type could also be parameterised):

#S3 parameters
S3KEY="my-key"
S3SECRET="my-secret"
S3BUCKET="my-bucket"
S3STORAGETYPE="STANDARD" #REDUCED_REDUNDANCY or STANDARD etc.
AWSREGION="s3-xxxxxx"

function putS3
{
  path=$1
  file=$2
  aws_path=$3
  bucket="${S3BUCKET}"
  date=$(date +"%a, %d %b %Y %T %z")
  acl="x-amz-acl:private"
  content_type="application/octet-stream"
  storage_type="x-amz-storage-class:${S3STORAGETYPE}"
  string="PUT\n\n$content_type\n$date\n$acl\n$storage_type\n/$bucket$aws_path$file"
  signature=$(echo -en "${string}" | openssl sha1 -hmac "${S3SECRET}" -binary | base64)
  curl -s -X PUT -T "$path/$file" \
    -H "Host: $bucket.${AWSREGION}.amazonaws.com" \
    -H "Date: $date" \
    -H "Content-Type: $content_type" \
    -H "$storage_type" \
    -H "$acl" \
    -H "Authorization: AWS ${S3KEY}:$signature" \
    "https://$bucket.${AWSREGION}.amazonaws.com$aws_path$file"
}
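For reference, the region code slots straight into the endpoint hostname; a quick sketch of how the host is built (bucket and region are made up, using the older `s3-<region>` host style this script expects):

```shell
# Endpoint construction used by the script above; values are illustrative.
bucket="my-bucket"
AWSREGION="s3-eu-west-1"
host="$bucket.${AWSREGION}.amazonaws.com"
echo "$host"  # my-bucket.s3-eu-west-1.amazonaws.com
```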

@gonzalo-trenco

Extending @ORANGE-XFM's response and following these docs:
if you are executing this script from macOS bash, you need to set the date like this
date=$(TZ=utc date +"%Y%m%dT%H%M%SZ")
in order to get rid of the bad-date-format errors.
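That command produces the ISO 8601 "basic" timestamp (YYYYMMDDTHHMMSSZ) that Signature Version 4 expects in the x-amz-date header; a quick self-check:

```shell
# The SigV4 x-amz-date timestamp: 8 date digits, a literal T, 6 time
# digits, a literal Z -- 16 characters in all, always in UTC.
stamp=$(TZ=utc date +"%Y%m%dT%H%M%SZ")
echo "$stamp"
case "$stamp" in
  [0-9][0-9][0-9][0-9][0-9][0-9][0-9][0-9]T[0-9][0-9][0-9][0-9][0-9][0-9]Z)
    echo "format ok" ;;
  *)
    echo "unexpected format" ;;
esac
```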

@miznokruge

miznokruge commented Jun 22, 2017

Hi.
I used your script, but got an error message:

The authorization mechanism you have provided is not supported. Please use AWS4-HMAC-SHA256.

Any of you guys got a message like this?

cc: @chrismdp

@rosencreuz

There's a problem with the locale: it works fine under en_US.utf8 but fails for other languages, because the date string is localised. It would be better to set LC_TIME=en_US.utf8 somewhere in the script.
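One way to do that without touching the caller's environment is to pin the locale only for the date call; `LC_ALL=C` is guaranteed to exist on any POSIX system and also yields English day and month names (a sketch, not the author's script):

```shell
# Force English day/month names for the Date header regardless of the
# user's locale; nothing else in the environment is changed.
date_en=$(LC_ALL=C date +"%a, %d %b %Y %T %z")
echo "$date_en"
```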

@aguero700

Does anyone know where I can get shells?

@i1skn

i1skn commented Jul 26, 2017

@miznokruge I guess you are using one of the new regions after January 30, 2014. From http://docs.aws.amazon.com/AmazonS3/latest/dev/RESTAuthentication.html

This latest signature version is supported in all regions and any new regions after January 30, 2014 will support only Signature Version 4. For more information, go to Authenticating Requests (AWS Signature Version 4) in the Amazon Simple Storage Service API Reference.

This script uses the V2 signing API, so to use regions launched after January 30, 2014 you will need Signature Version 4 support.
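For anyone attempting that upgrade, the biggest mechanical difference is the signing key: V2 uses a single HMAC-SHA1 with the raw secret, while Signature Version 4 derives a key through a chain of four HMAC-SHA256 operations. A minimal sketch of just that derivation (secret, date, and region below are made up):

```shell
#!/bin/bash
# SigV4 signing-key derivation: HMAC-SHA256 chained over the date stamp,
# region, service name, and the literal string "aws4_request".
# All values below are illustrative.
secret="exampleSecretKey"
datestamp="20170726"
region="ap-southeast-1"
service="s3"

hmac_hex() {  # hmac_hex <hex key> <data> -> hex digest on stdout
  printf '%s' "$2" | openssl dgst -sha256 -mac HMAC -macopt "hexkey:$1" | sed 's/^.* //'
}

# The first round keys off the literal string "AWS4<secret>".
kdate=$(printf '%s' "$datestamp" | openssl dgst -sha256 -mac HMAC -macopt "key:AWS4$secret" | sed 's/^.* //')
kregion=$(hmac_hex "$kdate" "$region")
kservice=$(hmac_hex "$kregion" "$service")
ksigning=$(hmac_hex "$kservice" "aws4_request")
echo "$ksigning"  # 64 hex chars; this keys the final request signature
```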

@liberodark

How do I configure this with Wasabi?

@ChestersGarage

Stoked to come across this. We are using it to upload files from AIX servers, which don't support (read: admins won't install) the AWS CLI and related tooling. Thanks!

@pkhetan

pkhetan commented Sep 24, 2018

My company uses Rook for S3 storage, so my bucket has a different base URL, like "http://abc.rook.com".
If I create a new bucket called "piyush" and put an object under "test", the URL becomes "http://abc.rook.com/piyush/test/abc.txt". I am able to put the object using s3api, but not with the curl script above.
Please suggest what changes I need to make.
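A likely culprit is the addressing style: the gist builds virtual-hosted-style URLs (`$bucket.host`), while S3-compatible stores like Rook/Ceph typically serve path-style URLs with the bucket in the path. A hedged sketch of the difference (the endpoint and names are made up):

```shell
# Virtual-hosted style vs path style; the string to sign keeps the
# "/$bucket$aws_path$file" resource in both cases.
endpoint="abc.rook.com"
bucket="piyush"
aws_path="/test/"
file="abc.txt"

vhost_url="https://$bucket.$endpoint$aws_path$file"  # what the gist builds
path_url="https://$endpoint/$bucket$aws_path$file"   # what Rook-style endpoints expect

echo "$vhost_url"
echo "$path_url"
```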

@atonamy

atonamy commented Aug 21, 2020

@Boomser13 wrote:

> I ran the same script in my environment and it returns an error:
> SignatureDoesNotMatch: The request signature we calculated does not match the signature you provided. Check your key and signing method. AKI*****************LQ PUT

I have the same problem:

The request signature we calculated does not match the signature you provided

How do I fix that?

@micaelomota

micaelomota commented Sep 14, 2020

For those with root access, just install awscli and you can do this in a single line: https://aws.amazon.com/getting-started/hands-on/backup-to-s3-cli/
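For example, the upload from the gist collapses to one command (the bucket name and paths here are illustrative, and credentials come from `aws configure` or environment variables):

```shell
# Build the one-line AWS CLI upload; `aws s3 cp` handles SigV4 signing,
# retries, and content types itself. Printed here rather than executed,
# since it needs real credentials and a real bucket.
bucket="my-bucket"
src="./builds/game-1.2.tar.gz"
cmd="aws s3 cp $src s3://$bucket/releases/"
echo "$cmd"
```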

@sanlodhi

How can I test my connection to the S3 bucket?

@sanlodhi

sanlodhi commented Oct 29, 2020

Hi bro, can you please help with the following:

How do I check whether my connection to S3 is successful or not?
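A low-tech smoke test, assuming nothing beyond curl: any HTTP status back from the endpoint (even 403 or 404) proves the network path to S3 works; only a `000` means no connection. The bucket name is made up:

```shell
# Reachability check against the bucket endpoint. Authentication is not
# tested here -- a 403/404 response still proves connectivity.
bucket="my-bucket"
status=$(curl -s -o /dev/null -w '%{http_code}' --max-time 10 "https://$bucket.s3.amazonaws.com/")
if [ "$status" != "000" ]; then
  echo "reachable (HTTP $status)"
else
  echo "no connection to S3 endpoint"
fi
```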
