Copy between S3 buckets w/ different accounts

This is a mix of two sources.

Basically, the first resource is great but didn't work for me as-is: I had to remove the trailing "/*" in the resource string to make it work. I also noticed that setting the policy on the source bucket alone was sufficient. In the end, these are the exact steps I followed to copy data between two buckets on two different accounts.

Basically, the idea is:

  • we allow the destination account to read the source bucket (set in the console of the source account)
  • we log in as the destination account and start the copy

Step 1: grab the account number of the destination account

Log into AWS with the destination account and go to "My Account" (https://portal.aws.amazon.com/gp/aws/manageYourAccount). The account number is on the top right, below the search bar (under "Welcome XXX"), and looks like 1234-1234-1234 (12 digits).
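
If you already have the AWS CLI configured with the destination account's credentials, you can also read the account number from the command line (a small aside, not needed for the steps below):

aws sts get-caller-identity --query Account --output text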

For the rest, I also assume you have an API key and password; if not:

  • go to the console: https://console.aws.amazon.com

  • click on your name on the top right > Security Credentials

  • Expand "Access Keys" and click on "Create New Access Key" You then obtain a file that looks like that:

    AWSAccessKeyId=AAAAAAAAAA AWSSecretKey=abababababababababababaabbabab

The first value (AAAAAAAAAA) is the API key, the second (abababababababababababaabbabab) is the password.
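
If you later use the AWS CLI instead of s3cmd (as several comments below do), the same two values can be exported as environment variables; a minimal sketch, reusing the placeholder values above:

export AWS_ACCESS_KEY_ID=AAAAAAAAAA
export AWS_SECRET_ACCESS_KEY=abababababababababababaabbabab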

Step 2: create the policy for the source bucket

Log into AWS with the source account and go to the AWS console for S3.

Select your bucket > Properties (on the right) > Permissions > Edit bucket policy. You then see a dialog named "Bucket Policy Editor".

On the bottom left of the dialog, select "AWS policy generator". It will open a new page with a form; set the following values:

  • Select Type of Policy: S3 Bucket Policy
  • Effect: Allow
  • Principal: arn:aws:iam::123412341234:root (123412341234 is the destination account number without the dashes)
  • AWS Service: Amazon S3
  • Actions: click "All Actions"
  • Amazon Resource Name: arn:aws:s3:::source-bucket (replace "source-bucket" with your source bucket name)

The click "Add Statement" and then "Generate Policy" You then see a dialog with contents similar to:

{
  "Id": "Policy1383062241257",
  "Statement": [
    {
      "Sid": "Stmt1383062239775",
      "Action": "s3:*",
      "Effect": "Allow",
      "Resource": "arn:aws:s3:::source-bucket",
      "Principal": {
        "AWS": [
          "arn:aws:iam::123412341234:root"
        ]
      }
    }
  ]
}

Cut and paste the policy into the dialog on the previous page (the "Bucket Policy Editor") and click "Save".
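
Note: several comments below report 403 / access denied errors unless the object-level resource ("arn:aws:s3:::source-bucket/*") is also listed. If you hit that, a variant of the policy above with both resources looks roughly like this:

{
  "Id": "Policy1383062241257",
  "Statement": [
    {
      "Sid": "Stmt1383062239775",
      "Action": "s3:*",
      "Effect": "Allow",
      "Resource": [
        "arn:aws:s3:::source-bucket",
        "arn:aws:s3:::source-bucket/*"
      ],
      "Principal": {
        "AWS": [
          "arn:aws:iam::123412341234:root"
        ]
      }
    }
  ]
}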

Step 3: copy using s3cmd

Install s3cmd. On the Mac:

brew install s3cmd

then configure your credentials for the destination account:

s3cmd --configure

It will ask for your API key and corresponding password (secret key), then for an encryption password. Answer yes (y) to test the connection and save the configuration.
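
For reference, the answers are saved in a config file in your home directory (~/.s3cfg); the relevant entries look roughly like this (a sketch, reusing the placeholder values from Step 1):

[default]
access_key = AAAAAAAAAA
secret_key = abababababababababababaabbabab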

now you can copy:

s3cmd sync --skip-existing --recursive s3://source-bucket s3://destination-bucket
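
If you want to preview what would be copied before transferring anything, s3cmd also has a dry-run flag (check s3cmd sync --help for your version):

s3cmd sync --dry-run --skip-existing --recursive s3://source-bucket s3://destination-bucket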

erickrause commented May 12, 2015

Thank you very much for this. Saved me a ton of time today.

Kalagan commented May 15, 2015

To make it work I had to also give access to source-bucket/*. Like this:

  "Resource": [
    "arn:aws:s3:::source-bucket",
    "arn:aws:s3:::source-bucket/*"
  ]

tboyko commented Sep 16, 2015

+1 the comment by @Kalagan

iamalpani commented Feb 20, 2016

Really quick & helpful. Thanks.

ssharmaolx commented Mar 11, 2016

aws cli works well for me, I had 5TB of data to be copied.

bhgames commented Apr 4, 2016

Still getting 403s with this.

johnt337 commented May 5, 2016

+1 @Kalagan regarding their comment. Bucket logs still tend to throw errors and I had to grant Action: "*".
Ultimately had better luck with aws s3 sync. Great gist, thanks!

lao commented Jun 17, 2016

+1 the comment by @Kalagan

Laxman-SM commented Aug 26, 2016

+1 the comment by @Kalagan

kheengz commented Mar 9, 2017

works for me...thanks guys

zahidulislam012 commented May 29, 2017

For Ubuntu users:

apt-get install s3cmd
s3cmd --configure
s3cmd cp s3://sourcebucket/ s3://destinationbucket/
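
Note: depending on the s3cmd version, copying a whole bucket with cp may also need the recursive flag; a hedged variant of the last command:

s3cmd cp --recursive s3://sourcebucket/ s3://destinationbucket/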

kostyaev commented May 29, 2017

Using aws-cli:

apt-get update
apt install awscli
aws configure
aws s3 sync s3://sourcebucket/ s3://destinationbucket/

AlexThomas90210 commented Aug 28, 2017

I was having issues with access denied even with Kalagan's solution, so I ended up just using aws s3 sync s3://sourcebucket/ s3://destinationbucket/ --profile destinationAccountProfile
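
For context, a named profile like destinationAccountProfile above is just an entry in ~/.aws/credentials (the profile name is whatever you choose); a minimal sketch, reusing the placeholder keys from Step 1:

[destinationAccountProfile]
aws_access_key_id = AAAAAAAAAA
aws_secret_access_key = abababababababababababaabbabab

You can also create it with aws configure --profile destinationAccountProfile.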

aioue commented Sep 6, 2017

what's an example dest or source bucket? Do I need the full ARN?

danielricecodes commented Nov 15, 2017

For smaller buckets, it may be easier to do two aws s3 sync commands.

aws configure (log into source bucket)
aws s3 sync s3://sourcebucket/ ~/local/path
aws configure (log into destination bucket)
aws s3 sync ~/local/path s3://destinationbucket/

I haven't tested this, but in theory something like this should work. This is not meant for extremely large bucket transfers - the direct method above is preferred since the data will transfer inside Amazon's cloud and will be crazy fast that way.
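
A variant of the same idea using named profiles (the profile names "source" and "destination" are just placeholders), so you don't have to re-run aws configure between the two steps:

aws s3 sync s3://sourcebucket/ ~/local/path --profile source
aws s3 sync ~/local/path s3://destinationbucket/ --profile destination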

nikolaiderzhak commented Nov 23, 2017

Nowadays it seems better to stick with the aws cli:

aws s3 sync s3://sourcebucket s3://destinationbucket

There is also an official doc on the whole procedure: https://aws.amazon.com/premiumsupport/knowledge-center/account-transfer-s3/

webjay commented Mar 7, 2018

These are the actions needed for sync, like: aws --profile PROFILENEW s3 sync s3://BUCKETOLD s3://BUCKETNEW

{
    "Version": "2012-10-17",
    "Id": "Policy1520420757463",
    "Statement": [
        {
            "Sid": "Stmt1520420753833",
            "Effect": "Allow",
            "Principal": {
                "AWS": "arn:aws:iam::XXX:root"
            },
            "Action": [
                "s3:ListBucket",
                "s3:GetObject",
                "s3:GetObjectAcl"
            ],
            "Resource": [
                "arn:aws:s3:::BUCKETOLD",
                "arn:aws:s3:::BUCKETOLD/*"
            ]
        }
    ]
}