Downloading and uploading from/to an S3 Bucket using AWS CLI

Installing & Configuring

$ sudo pip install awscli (or: sudo apt-get install awscli)
$ aws configure

You'll need to fill in the following settings:

AWS Access Key ID [None]: 
AWS Secret Access Key [None]: 
Default region name [None]: 
Default output format [None]: 
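
If you prefer not to use the interactive prompt, the same values can be written directly to the AWS CLI's configuration files. A minimal sketch, where the keys and region below are placeholders:

~/.aws/credentials

[default]
aws_access_key_id = YOUR_ACCESS_KEY_ID
aws_secret_access_key = YOUR_SECRET_ACCESS_KEY

~/.aws/config

[default]
region = us-east-1
output = json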

Downloading from S3

$ aws s3 sync s3://bucket_name .

A more advanced example:

$ aws s3 sync s3://bucket_name/some/path/ local/path/ --exclude "*" --include "*string*" --dryrun
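
The --dryrun flag only prints the operations that would be performed. Once the output looks right, the same command can be run without it to actually transfer the files:

$ aws s3 sync s3://bucket_name/some/path/ local/path/ --exclude "*" --include "*string*"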

List of files from S3

$ aws s3api list-objects --bucket "bucket_name" --prefix "some/prefix/path/" --query "Contents[?LastModified>='yyyy-mm-dd'].{Key: Key}"
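
The date placeholder goes inside the JMESPath filter. For example, to list only the keys of objects modified since a given date (2018-01-01 is just an illustrative value) and print them as plain text:

$ aws s3api list-objects --bucket "bucket_name" --prefix "some/prefix/path/" --query "Contents[?LastModified>='2018-01-01'].{Key: Key}" --output text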

List all files with summarized information

$ aws s3 ls s3://bucket_name/some/path/to/files/ --recursive --human-readable --summarize > output_file.txt

Uploading to S3

$ aws s3 sync files/to/upload/path s3://bucket_name
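
The same --exclude/--include filters and --dryrun work when uploading. A sketch that would upload only CSV files, with the paths as placeholders:

$ aws s3 sync files/to/upload/path s3://bucket_name --exclude "*" --include "*.csv" --dryrun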

Copying from Bucket to Bucket

You'll need to set the following policies for both Buckets:

Source Bucket policy statements:

{
	"Sid": "Stmt1357935647218",
	"Effect": "Allow",
	"Principal": {
		"AWS": "arn:aws:iam::XXXXXXXXXXXX:user"
	},
	"Action": "s3:ListBucket",
	"Resource": "arn:aws:s3:::SourceBucket"
},
{
	"Sid": "Stmt1357935676138",
	"Effect": "Allow",
	"Principal": {
		"AWS": "arn:aws:iam::XXXXXXXXXXXX:user"
	},
	"Action": "s3:GetObject",
	"Resource": "arn:aws:s3:::SourceBucket/*"
},

Destination Bucket policy statements:

{
	"Sid": "Stmt1357935647218",
	"Effect": "Allow",
	"Principal": {
		"AWS": "arn:aws:iam::XXXXXXXXXXXX:user"
	},
	"Action": "s3:ListBucket",
	"Resource": "arn:aws:s3:::DestinationBucket"
},
{
	"Sid": "Stmt1357935676138",
	"Effect": "Allow",
	"Principal": {
		"AWS": "arn:aws:iam::XXXXXXXXXXXX:user"
	},
	"Action": "s3:PutObject",
	"Resource": "arn:aws:s3:::DestinationBucket/*"
}
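
These statements are fragments; each pair belongs inside the Statement array of a complete bucket policy attached to the respective bucket. A minimal sketch of how the destination bucket policy might look as a full document (the source bucket policy follows the same pattern), where XXXXXXXXXXXX and USERNAME are placeholders for the copying user's account ID and IAM user name:

{
	"Version": "2012-10-17",
	"Statement": [
		{
			"Sid": "Stmt1357935647218",
			"Effect": "Allow",
			"Principal": {
				"AWS": "arn:aws:iam::XXXXXXXXXXXX:user/USERNAME"
			},
			"Action": "s3:ListBucket",
			"Resource": "arn:aws:s3:::DestinationBucket"
		},
		{
			"Sid": "Stmt1357935676138",
			"Effect": "Allow",
			"Principal": {
				"AWS": "arn:aws:iam::XXXXXXXXXXXX:user/USERNAME"
			},
			"Action": "s3:PutObject",
			"Resource": "arn:aws:s3:::DestinationBucket/*"
		}
	]
}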

Once you set these policies, just run:

$ aws s3 cp s3://SourceBucket/ s3://DestinationBucket/ --recursive
  • The --recursive argument copies every object under the specified prefix. Without it, you'd have to copy each object one by one.
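
As with sync, the copy can be previewed first; aws s3 cp also accepts --dryrun:

$ aws s3 cp s3://SourceBucket/ s3://DestinationBucket/ --recursive --dryrun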