@ushu
Created October 29, 2013 16:12
Copy between S3 buckets w/ different accounts

This is a mix between two sources:

Basically the first resource is great but didn't work for me: I had to remove the trailing "/*" in the resource string to make it work. I also noticed that setting the policy on the source bucket was sufficient. In the end, these are the exact steps I followed to copy data between two buckets on two accounts.

Basically the idea there is:

  • we allow the destination account to read the source bucket (in the console for the source account)
  • we log in as the destination account and start the copy

Step 1 grab the account number for the destination account

Log into AWS with the destination account and go to "My Account" (https://portal.aws.amazon.com/gp/aws/manageYourAccount). The account number is on the top right below the search bar (under "Welcome XXX") and looks like 1234-1234-1234 (12 digits).

For the rest I also assume you have an API key/password; if not:

  • go to the console https://console.aws.amazon.com

  • click on your name on the top right > Security Credentials

  • Expand "Access Keys" and click on "Create New Access Key" You then obtain a file that looks like that:

    AWSAccessKeyId=AAAAAAAAAA AWSSecretKey=abababababababababababaabbabab

The first value (AAAAAAAAAA) is the API key, the second (abababababababababababaabbabab) is the password.
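
If you already have the AWS command line tools installed, another way to grab the 12-digit account number (and to double-check which credentials you are currently using) is the sts call below. This is just a convenience sketch and assumes awscli is installed and configured:

aws sts get-caller-identity --query Account --output text
# prints the 12-digit account number for the currently configured credentials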

Step 2 create the policy for the source bucket

Log into AWS with the source account and go to the AWS console for S3.

Select your bucket > Properties (on the right) > Permissions > Edit bucket policy. You then see a dialog named "Bucket Policy Editor".

On the bottom left of the dialog, select "AWS Policy Generator". It will open a new page with a form; set the following values:

  • Select Type of Policy: S3 Bucket Policy
  • Effect: Allow
  • Principal: arn:aws:iam::123412341234:root (123412341234 is the destination account number without the dashes)
  • AWS Service: Amazon S3
  • Actions: click "All Actions"
  • Amazon Resource Name: arn:aws:s3:::source-bucket (replace "source-bucket" with your source bucket name)

Then click "Add Statement" and then "Generate Policy". You then see a dialog with contents similar to:

{
  "Id": "Policy1383062241257",
  "Statement": [
    {
      "Sid": "Stmt1383062239775",
      "Action": "s3:*",
      "Effect": "Allow",
      "Resource": "arn:aws:s3:::source-bucket",
      "Principal": {
        "AWS": [
          "arn:aws:iam::123412341234:root"
        ]
      }
    }
  ]
}

Cut and paste the policy into the dialog on the previous page (the "Bucket Policy Editor") and click "Save".
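
If you prefer the command line over the console for this step, the same policy can be applied with the AWS CLI while authenticated as the source account. This is only a sketch and assumes you saved the generated JSON locally as policy.json:

aws s3api put-bucket-policy --bucket source-bucket --policy file://policy.json
# read it back to verify it was saved
aws s3api get-bucket-policy --bucket source-bucket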

Step 3 copy using s3cmd

Install s3cmd, on the Mac:

brew install s3cmd

then configure your credentials for the destination account:

s3cmd --configure

It will ask for your API key and corresponding password, then a password to encode your credentials. Answer yes (y) to test the connection and save the configuration.

Now you can copy:

s3cmd sync --skip-existing --recursive s3://source-bucket s3://destination-bucket
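
Several commenters below had better luck doing the actual copy with the AWS CLI instead of s3cmd. A rough equivalent, assuming you set up a named profile for the destination account (the profile name "destination" is just an example):

aws configure --profile destination
aws s3 sync s3://source-bucket s3://destination-bucket --profile destination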
@erickrause

Thank you very much for this. Saved me a ton of time today.

@JulienSansot

To make it work I had to also give access to source-bucket/*
Like this:
"Resource": [
"arn:aws:s3:::source-bucket",
"arn:aws:s3:::source-bucket/*"
]

@tboyko

tboyko commented Sep 16, 2015

+1 the comment by @Kalagan

@iamalpani

Really quick & helpful. Thanks.

@ssharmaolx

aws cli works well for me, I had 5 TB of data to copy

@bhgames

bhgames commented Apr 4, 2016

Still getting 403s with this.


ghost commented May 5, 2016

+1 @Kalagan regarding their comment. Bucket logs still tend to throw errors and I had to grant Action: "*".
Ultimately had better luck with aws s3 sync. Great gist, thanks!

@lao

lao commented Jun 17, 2016

+1 the comment by @Kalagan

@Laxman-SM

+1 the comment by @Kalagan

@kheengz

kheengz commented Mar 9, 2017

works for me...thanks guys

@zahidulislam012

For Ubuntu users:
apt-get install s3cmd
s3cmd --configure
s3cmd cp s3://sourcebucket/ s3://destinationbucket/

@kostyaev

Using aws-cli:
apt-get update
apt install awscli
aws configure
aws s3 sync s3://sourcebucket/ s3://destinationbucket/

@AlexThomas90210

I was having issues with access denied even with Kalagan's solution; I ended up just using aws s3 sync s3://sourcebucket/ s3://destinationbucket/ --profile destinationAccountProfile

@aioue

aioue commented Sep 6, 2017

what's an example dest or source bucket? Do I need the full ARN?

@danielricecodes

For smaller buckets, it may be easier to do two aws s3 sync commands.

aws configure (log into source bucket)
aws s3 sync s3://sourcebucket/ ~/local/path
aws configure (log into destination bucket)
aws s3 sync ~/local/path s3://destinationbucket/

I haven't tested this, but in theory something like this should work. This is not meant for extremely large bucket transfers - the direct method above is preferred since the data will transfer inside Amazon's cloud and will be crazy fast that way.
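
A variant of the same two-step idea that avoids re-running aws configure between commands is to keep a separate named profile for each account; the profile names below are just placeholders and this is equally untested:

aws configure --profile source         # one-time: credentials for the source account
aws configure --profile destination    # one-time: credentials for the destination account
aws s3 sync s3://sourcebucket/ ~/local/path --profile source
aws s3 sync ~/local/path s3://destinationbucket/ --profile destination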

@nikolaiderzhak

nikolaiderzhak commented Nov 23, 2017

Nowadays it seems better to stick with aws cli.

aws s3 sync s3://sourcebucket s3://destinationbucket

Also, there is an official doc on the whole procedure: https://aws.amazon.com/premiumsupport/knowledge-center/account-transfer-s3/

@webjay

webjay commented Mar 7, 2018

These are the actions needed for sync, like: aws --profile PROFILENEW s3 sync s3://BUCKETOLD s3://BUCKETNEW

{
    "Version": "2012-10-17",
    "Id": "Policy1520420757463",
    "Statement": [
        {
            "Sid": "Stmt1520420753833",
            "Effect": "Allow",
            "Principal": {
                "AWS": "arn:aws:iam::XXX:root"
            },
            "Action": [
                "s3:ListBucket",
                "s3:GetObject",
                "s3:GetObjectAcl"
            ],
            "Resource": [
                "arn:aws:s3:::BUCKETOLD",
                "arn:aws:s3:::BUCKETOLD/*"
            ]
        }
    ]
}

@pidugusundeep

I am still getting a 403 access denied error, how do I sort this out?

@harish6123

I am getting the error "Invalid principal in policy", can you please advise?

@webjay

webjay commented Aug 20, 2018

Make sure the Principal is the user running the operation.

In my case the user is from the remote account:

            "Principal": {
                "AWS": "arn:aws:iam::123412341234:user/myusername"
            },
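
If you are not sure which principal your copy actually runs as, one quick way to check (assuming the AWS CLI is configured with the same credentials you use for the copy) is:

aws sts get-caller-identity --query Arn --output text
# prints something like arn:aws:iam::123412341234:user/myusername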

@cecchisandrone

Better to use aws cli

@javiermurillo

Using aws-cli:
apt-get update
apt install awscli
aws configure
aws s3 sync s3://sourcebucket/ s3://destinationbucket/

Thank you @kostyaev, you saved me a lot of time!

@pratapatl

Can anyone share the expected throughput copying from S3 to S3 between two accounts in the same region?

@Sober-bug

Hi, if I want to run the sync command in the source account, what should I do?

@ismailyenigul

Hi @Sober-bug

  1. Create a bucket policy at the source bucket's Permissions tab:

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "Stmt1383062239775",
            "Effect": "Allow",
            "Principal": {
                "AWS": "arn:aws:iam::AWS-ID-of-destination-aws-account:root"
            },
            "Action": "s3:*",
            "Resource": [
                "arn:aws:s3:::my-source-bucket",
                "arn:aws:s3:::my-source-bucket/*"
            ]
        }
    ]
}

AWS-ID-of-destination-aws-account is the 12-digit account number, which you can see at the bottom of the left menu in the AWS IAM console.

Create an AWS user with a policy that has S3 write permission in your destination AWS account. Get its AWS key ID and secret,
then configure it with aws configure,
then start the sync:
aws s3 sync s3://source-bucket/ s3://destination-bucket
