Copy between S3 buckets w/ different accounts

This is a mix of two sources:

basically, the first resource is great, but it didn't work for me: I had to remove the trailing "/*" in the resource string to make it work. I also noticed that setting the policy on the source bucket was sufficient. In the end, these are the exact steps I followed to copy data between two buckets on two accounts.

Basically, the idea is:

  • we allow the destination account to read the source bucket (via a bucket policy set in the source account's console)
  • we log in as the destination account and start the copy

Step 1: grab the account number for the destination account

log into AWS with the destination account and go to "My Account" https://portal.aws.amazon.com/gp/aws/manageYourAccount The account number is on the top right below the search bar (under "Welcome XXX") and is like 1234-1234-1234 (12 digits)

For the rest I also assume you have an API key and password; if not:

  • go to the console https://console.aws.amazon.com

  • click on your name on the top right > Security Credentials

  • Expand "Access Keys" and click on "Create New Access Key". You then obtain a file that looks like this:

    AWSAccessKeyId=AAAAAAAAAA
    AWSSecretKey=abababababababababababaabbabab

The first value (AAAAAAAAAA) is the API key, the second (abababababababababababaabbabab) is the password.
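For command-line work, these two values can also be exported as environment variables, which the aws CLI and the AWS SDKs read automatically. A sketch using the placeholder values from the file above:

```shell
# Placeholder credentials from the downloaded file; substitute your own.
# The aws CLI and the AWS SDKs read these standard environment variables.
export AWS_ACCESS_KEY_ID=AAAAAAAAAA
export AWS_SECRET_ACCESS_KEY=abababababababababababaabbabab
```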

Step 2: create the policy for the source bucket

log into AWS with the source account and go to the AWS console for S3

select your bucket > Properties (on the right) > Permissions > Edit bucket policy You then see a dialog named "Bucket Policy Editor"

on the bottom left of the dialog select "AWS policy generator". It will open a new page with a form, set the following values:

  • Select Type of Policy: S3 Bucket Policy
  • Effect: Allow
  • Principal: arn:aws:iam::123412341234:root (123412341234 is the destination account number without the dashes)
  • AWS Service: Amazon S3
  • Actions: click "All Actions"
  • Amazon Resource Name: arn:aws:s3:::source-bucket (replace "source-bucket" with your source bucket name)

Then click "Add Statement" and then "Generate Policy". You then see a dialog with contents similar to:

{
  "Id": "Policy1383062241257",
  "Statement": [
    {
      "Sid": "Stmt1383062239775",
      "Action": "s3:*",
      "Effect": "Allow",
      "Resource": "arn:aws:s3:::source-bucket",
      "Principal": {
        "AWS": [
          "arn:aws:iam::123412341234:root"
        ]
      }
    }
  ]
}

cut and paste the policy in the dialog of the previous page (the "Bucket Policy Editor") and click "Save"
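Alternatively, the same policy can be applied from the command line with the aws CLI. A sketch, assuming the CLI is configured with the source account's credentials; the put-bucket-policy command is left commented out since it needs live access:

```shell
# Write the generated policy to a file. The account number (123412341234)
# and bucket name (source-bucket) are the placeholders used above.
cat > policy.json <<'EOF'
{
  "Id": "Policy1383062241257",
  "Statement": [
    {
      "Sid": "Stmt1383062239775",
      "Action": "s3:*",
      "Effect": "Allow",
      "Resource": "arn:aws:s3:::source-bucket",
      "Principal": { "AWS": ["arn:aws:iam::123412341234:root"] }
    }
  ]
}
EOF

# Sanity-check that the file is valid JSON before applying it.
python3 -m json.tool policy.json > /dev/null && echo "policy.json OK"

# Apply it to the source bucket (requires source-account credentials):
# aws s3api put-bucket-policy --bucket source-bucket --policy file://policy.json
```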

Step 3: copy using s3cmd

Install s3cmd, on the Mac:

brew install s3cmd

then configure your credentials for the destination account:

s3cmd --configure

It will ask for your API key and corresponding password, then a password to encrypt your credentials. Answer yes (y) to test the connection and save the configuration.

now you can copy:

s3cmd sync --skip-existing --recursive s3://source-bucket s3://destination-bucket

erickrause commented May 12, 2015

Thank you very much for this. Saved me a ton of time today.

JulienSansot commented May 15, 2015

To make it work I had to also give access to source-bucket/*, like this:

    "Resource": [
      "arn:aws:s3:::source-bucket",
      "arn:aws:s3:::source-bucket/*"
    ]
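For reference, merging this fix into the policy generated above gives something like the following (the account number and bucket name are placeholders):

```json
{
  "Id": "Policy1383062241257",
  "Statement": [
    {
      "Sid": "Stmt1383062239775",
      "Action": "s3:*",
      "Effect": "Allow",
      "Resource": [
        "arn:aws:s3:::source-bucket",
        "arn:aws:s3:::source-bucket/*"
      ],
      "Principal": {
        "AWS": ["arn:aws:iam::123412341234:root"]
      }
    }
  ]
}
```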

tboyko commented Sep 16, 2015

+1 the comment by @Kalagan

iamalpani commented Feb 20, 2016

Really quick & helpful. Thanks.

ssharmaolx commented Mar 11, 2016

aws cli works well for me; I had 5 TB of data to copy.

bhgames commented Apr 4, 2016

Still getting 403s with this.

johnt337 commented May 5, 2016

+1 @Kalagan regarding their comment. Bucket logs still tend to throw errors and I had to grant Action: "*".
Ultimately had better luck with aws s3 sync. Great gist thanks!

lao commented Jun 17, 2016

+1 the comment by @Kalagan

Laxman-SM commented Aug 26, 2016

+1 the comment by @Kalagan

kheengz commented Mar 9, 2017

works for me...thanks guys

zahidulislam012 commented May 29, 2017

for Ubuntu users:

    apt-get install s3cmd
    s3cmd --configure
    s3cmd cp s3://sourcebucket/ s3://destinationbucket/

kostyaev commented May 29, 2017

Using aws-cli:

    apt-get update
    apt install awscli
    aws configure
    aws s3 sync s3://sourcebucket/ s3://destinationbucket/

AlexThomas90210 commented Aug 28, 2017

I was having issues with access denied even with Kalagan's solution, ended up just using aws s3 sync s3://sourcebucket/ s3://destinationbucket/ --profile destinationAccountProfile
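For anyone unfamiliar with --profile: it selects a named section of ~/.aws/credentials, along these lines (hypothetical profile name and keys):

```ini
[destinationAccountProfile]
aws_access_key_id = AAAAAAAAAA
aws_secret_access_key = abababababababababababaabbabab
```

A profile can also be created interactively with aws configure --profile destinationAccountProfile.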

aioue commented Sep 6, 2017

what's an example dest or source bucket? Do I need the full ARN?

danielricecodes commented Nov 15, 2017

For smaller buckets, it may be easier to do two aws s3 sync commands.

    aws configure              # log into source bucket
    aws s3 sync s3://sourcebucket/ ~/local/path
    aws configure              # log into destination bucket
    aws s3 sync ~/local/path s3://destinationbucket/

I haven't tested this, but in theory something like this should work. This is not meant for extremely large bucket transfers - the direct method above is preferred since the data will transfer inside Amazon's cloud and will be crazy fast that way.
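A sketch of that two-step copy using named profiles, so aws configure doesn't have to be re-run between steps. The profile names "source" and "destination" are hypothetical, and the sync commands are shown commented out because they need live credentials:

```shell
# Create the two profiles once beforehand:
#   aws configure --profile source
#   aws configure --profile destination

# Local staging directory for the intermediate copy.
STAGING="$HOME/s3-staging"
mkdir -p "$STAGING"

# Pull from the source bucket, then push to the destination bucket:
#   aws s3 sync s3://sourcebucket/ "$STAGING" --profile source
#   aws s3 sync "$STAGING" s3://destinationbucket/ --profile destination
echo "staging directory ready: $STAGING"
```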

nikolaiderzhak commented Nov 23, 2017

Nowadays it seems better to stick with aws cli.

aws s3 sync s3://sourcebucket s3://destinationbucket

Also there is official doc on whole procedure: https://aws.amazon.com/premiumsupport/knowledge-center/account-transfer-s3/

webjay commented Mar 7, 2018

These are the actions needed for sync, like: aws --profile PROFILENEW s3 sync s3://BUCKETOLD s3://BUCKETNEW

{
    "Version": "2012-10-17",
    "Id": "Policy1520420757463",
    "Statement": [
        {
            "Sid": "Stmt1520420753833",
            "Effect": "Allow",
            "Principal": {
                "AWS": "arn:aws:iam::XXX:root"
            },
            "Action": [
                "s3:ListBucket",
                "s3:GetObject",
                "s3:GetObjectAcl"
            ],
            "Resource": [
                "arn:aws:s3:::BUCKETOLD",
                "arn:aws:s3:::BUCKETOLD/*"
            ]
        }
    ]
}

pidugusundeep commented May 30, 2018

I am still getting a 403 Access Denied error, how do I sort this out?

harish6123 commented Jul 20, 2018

I am getting the error "Invalid principal in policy", can you please suggest what to do?

webjay commented Aug 20, 2018

Make sure the Principal is the user running the operation.

In my case the user is from the remote account:

            "Principal": {
                "AWS": "arn:aws:iam::123412341234:user/myusername"
            },

cecchisandrone commented Aug 31, 2018

Better to use aws cli

javiermurillo commented Jan 10, 2019

Using aws-cli:

    apt-get update
    apt install awscli
    aws configure
    aws s3 sync s3://sourcebucket/ s3://destinationbucket/

Thank you @kostyaev, you saved me a lot of time!

pratapatl commented Feb 14, 2020

Can anyone share the expected throughput copying from S3 to S3 between two accounts in the same region?

Sober-bug commented Mar 31, 2020

Hi, if I want to run the sync command from the source account, what should I do?
