Copy between S3 buckets w/ different accounts

This is a mix between two sources.

The first resource is great but didn't work for me as written: I had to remove the trailing "/*" in the resource string to make it work. I also noticed that setting the policy on the source bucket was sufficient. In the end, these are the exact steps I followed to copy data between two buckets on two different accounts.

Basically the idea is:

  • we allow the destination account to read the source bucket (done in the console for the source account)
  • we log in as the destination account and start the copy

Step 1 grab the account number for the destination account

log into AWS with the destination account and go to "My Account" (https://portal.aws.amazon.com/gp/aws/manageYourAccount). The account number is on the top right below the search bar (under "Welcome XXX") and looks like 1234-1234-1234 (12 digits)
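If you already happen to have the AWS CLI configured for the destination account (it only comes up later in this gist), a quicker way to get the 12-digit account number, without the dashes, is:

aws sts get-caller-identity --query Account --output text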

For the rest I also assume you have an API key/password; if not:

  • go to the console https://console.aws.amazon.com

  • click on your name on the top right > Security Credentials

  • Expand "Access Keys" and click on "Create New Access Key". You then obtain a file that looks like this:

    AWSAccessKeyId=AAAAAAAAAA
    AWSSecretKey=abababababababababababaabbabab

The first value (AAAAAAAAAA) is the API key, the second (abababababababababababaabbabab) is the password.

Step 2 create the policy for the source bucket

log into AWS with the source account and go to the AWS console for S3

select your bucket > Properties (on the right) > Permissions > Edit bucket policy. You then see a dialog named "Bucket Policy Editor"

on the bottom left of the dialog select "AWS policy generator". It will open a new page with a form; set the following values:

  • Select Type of Policy: S3 Bucket Policy
  • Effect: Allow
  • Principal: arn:aws:iam::123412341234:root (123412341234 is the destination account number without the dashes)
  • AWS Service: Amazon S3
  • Actions: click "All Actions"
  • Amazon Resource Name: arn:aws:s3:::source-bucket (replace "source-bucket" with your source bucket name)

Then click "Add Statement" and then "Generate Policy". You then see a dialog with contents similar to:

{
  "Id": "Policy1383062241257",
  "Statement": [
    {
      "Sid": "Stmt1383062239775",
      "Action": "s3:*",
      "Effect": "Allow",
      "Resource": "arn:aws:s3:::source-bucket",
      "Principal": {
        "AWS": [
          "arn:aws:iam::123412341234:root"
        ]
      }
    }
  ]
}

cut and paste the policy in the dialog of the previous page (the "Bucket Policy Editor") and click "Save"
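If you prefer the command line over the console, the same result can be achieved with the AWS CLI, assuming the source account's credentials are configured and the generated policy is saved locally as policy.json:

aws s3api put-bucket-policy --bucket source-bucket --policy file://policy.json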

Step 3 copy using s3cmd

Install s3cmd, on the Mac:

brew install s3cmd

then configure your credentials for the destination account:

s3cmd --configure

It will ask for your API key and corresponding password, then a password to encode your credentials. Answer yes (y) to test the connection and save the configuration.
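For reference, the answers end up in ~/.s3cfg; the relevant lines look roughly like this (the values are the placeholders from Step 1):

[default]
access_key = AAAAAAAAAA
secret_key = abababababababababababaabbabab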

now you can copy:

s3cmd sync --skip-existing --recursive s3://source-bucket s3://destination-bucket

Thank you very much for this. Saved me a ton of time today.

Kalagan commented May 15, 2015

To make it work I had to also give access to source-bucket/*. Like this:

"Resource": [
  "arn:aws:s3:::source-bucket",
  "arn:aws:s3:::source-bucket/*"
]
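Putting that together with the policy from Step 2, the full statement would look roughly like this (same placeholder IDs, account number, and bucket name as above):

{
  "Id": "Policy1383062241257",
  "Statement": [
    {
      "Sid": "Stmt1383062239775",
      "Action": "s3:*",
      "Effect": "Allow",
      "Resource": [
        "arn:aws:s3:::source-bucket",
        "arn:aws:s3:::source-bucket/*"
      ],
      "Principal": {
        "AWS": [
          "arn:aws:iam::123412341234:root"
        ]
      }
    }
  ]
}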

tboyko commented Sep 16, 2015

+1 the comment by @Kalagan

Really quick & helpful. Thanks.

aws cli works well for me, I had 5TB of data to be copied

bhgames commented Apr 4, 2016

Still getting 403s with this.

johnt337 commented May 5, 2016

+1 @Kalagan's comment. Bucket logs still tend to throw errors and I had to grant Action: "*".
Ultimately had better luck with aws s3 sync. Great gist, thanks!

lao commented Jun 17, 2016

+1 the comment by @Kalagan

kheengz commented Mar 9, 2017

works for me...thanks guys

for Ubuntu users:
apt-get install s3cmd
s3cmd --configure
s3cmd cp s3://sourcebucket/ s3://destinationbucket/

Using aws-cli:
apt-get update
apt install awscli
aws configure
aws s3 sync s3://sourcebucket/ s3://destinationbucket/

I was having issues with access denied even with Kalagan's solution, so I ended up just using aws s3 sync s3://sourcebucket/ s3://destinationbucket/ --profile destinationAccountProfile
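If you go that route, destinationAccountProfile is just a named profile in ~/.aws/credentials, something like this (the keys are placeholders):

[destinationAccountProfile]
aws_access_key_id = AAAAAAAAAA
aws_secret_access_key = abababababababababababaabbabab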

aioue commented Sep 6, 2017

what's an example dest or source bucket? Do I need the full ARN?

For smaller buckets, it may be easier to do two aws s3 sync commands.

aws configure (log into source bucket)
aws s3 sync s3://sourcebucket/ ~/local/path
aws configure (log into destination bucket)
aws s3 sync ~/local/path s3://destinationbucket/

I haven't tested this, but in theory something like this should work. This is not meant for extremely large bucket transfers - the direct method above is preferred since the data will transfer inside Amazon's cloud and will be crazy fast that way.
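If you'd rather not re-run aws configure between the two steps, named profiles do the same thing (the profile names here are just examples):

aws s3 sync s3://sourcebucket/ ~/local/path --profile source
aws s3 sync ~/local/path s3://destinationbucket/ --profile destination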

nikolaiderzhak commented Nov 23, 2017

Nowadays it seems better to stick with aws cli.

aws s3 sync s3://sourcebucket s3://destinationbucket

Also there is an official doc on the whole procedure: https://aws.amazon.com/premiumsupport/knowledge-center/account-transfer-s3/
