@overdrive3000
Last active June 10, 2022 15:50
AWS CLI tips and tricks

Get the default VPC ID

aws ec2 describe-vpcs --filters "Name=isDefault,Values=true" --output text --query 'Vpcs[*].{Vpc:VpcId}'
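A common follow-up (a sketch, assuming a single default VPC) is to capture the ID in a shell variable for reuse:

```shell
# Capture the default VPC ID for reuse in later commands.
# Vpcs[0].VpcId assumes exactly one default VPC, the usual case.
VPC_ID=$(aws ec2 describe-vpcs \
  --filters "Name=isDefault,Values=true" \
  --output text --query 'Vpcs[0].VpcId')
echo "default VPC: $VPC_ID"
```

The captured `$VPC_ID` can then feed other commands, e.g. a `Name=vpc-id` filter on `aws ec2 describe-subnets`.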

Get all the CloudFormation stack names

aws cloudformation describe-stacks --region us-east-1 --output text --query 'Stacks[*].{Stack:StackName}'
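Building on this, one way to see each stack's status at a glance (a sketch using the same region and query style; `StackStatus` is a standard field of the describe-stacks response):

```shell
# Print "name: status" for every stack in the region.
for stack in $(aws cloudformation describe-stacks --region us-east-1 \
    --output text --query 'Stacks[*].StackName'); do
  status=$(aws cloudformation describe-stacks --region us-east-1 \
    --stack-name "$stack" --output text --query 'Stacks[0].StackStatus')
  echo "$stack: $status"
done
```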

Delete a CloudFormation stack while retaining resources that fail to delete

aws cloudformation delete-stack --stack-name MyStack --retain-resources "LogicalId1" "LogicalId2"
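Note that `--retain-resources` is only accepted for stacks already in the `DELETE_FAILED` state, so a guarded retry might look like this (a sketch; `MyStack` and the logical IDs are the placeholders from above):

```shell
# --retain-resources only works on a stack whose first delete attempt
# already failed, so check the status before retrying the delete.
STATUS=$(aws cloudformation describe-stacks --stack-name MyStack \
  --output text --query 'Stacks[0].StackStatus')
if [ "$STATUS" = "DELETE_FAILED" ]; then
  aws cloudformation delete-stack --stack-name MyStack \
    --retain-resources "LogicalId1" "LogicalId2"
fi
```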

Upload a large file to S3 using multipart upload

  1. Get the MD5 sum of the file in Base64 format
openssl md5 -binary bigfile | base64
  2. Create a multipart upload job in S3
aws s3api create-multipart-upload --bucket mybucket --key xxl/bigfile --metadata md5=p6P8yDJS6V3LZVwjKH6j8g==

The MD5 metadata value is the output of the command in step 1.

  3. Split the file into several chunks (in this case the file is split into 200 parts)
split -n 200 bigfile
  4. Upload the parts. Move the original bigfile out of the folder first, so that a one-liner can loop over all the parts:
count=0; for file in *; do (( count++ )); MD5=$(openssl md5 -binary "$file" | base64); aws s3api upload-part --bucket mybucket --key xxl/bigfile --part-number $count --body "$file" --upload-id o0pkIgTPsF21LEUUJqm_AHGQX0Cr_ztcZDiqAH3lnmiGmHgx5iqDq55PtB6bfgXRh7tA9opdV6TEQEJ84yG.aA7b6k.piXGuf4zVMhXzCcdct6Kr_gF.Hp843I6huV84 --content-md5 "$MD5"; done

The value of the --upload-id parameter is the UploadId returned by the create-multipart-upload command in step 2.

  5. To complete the upload we need the ETag value of each uploaded part. We can retrieve them with the following command (adjust the values to your own):
aws s3api list-parts --bucket mybucket --key xxl/bigfile --upload-id o0pkIgTPsF21LEUUJqm_AHGQX0Cr_ztcZDiqAH3lnmiGmHgx5iqDq55PtB6bfgXRh7tA9opdV6TEQEJ84yG.aA7b6k.piXGuf4zVMhXzCcdct6Kr_gF.Hp843I6huV84 | jq '.Parts[]|{PartNumber: .PartNumber, ETag: .ETag}' > fileparts
  6. Edit the resulting fileparts file so that the JSON looks similar to this:
{
  "Parts": [
    {
      "PartNumber": 1,
      "ETag": "25911cbc9c7a078f2174d6222f91bcdc"
    },
    {
      "PartNumber": 2,
      "ETag": "955a8652bc3b8b2f33e2c4bd8344971d"
    },
    {
      "PartNumber": 3,
      "ETag": "c6a49ff8c9f7fbfb5644136c09cd80c7"
    },
    ...
  ]
}
  7. Finally, complete the multipart upload by executing:
aws s3api complete-multipart-upload --multipart-upload file:///tmp/s3multipart/fileparts --bucket mybucket --key xxl/bigfile --upload-id o0pkIgTPsF21LEUUJqm_AHGQX0Cr_ztcZDiqAH3lnmiGmHgx5iqDq55PtB6bfgXRh7tA9opdV6TEQEJ84yG.aA7b6k.piXGuf4zVMhXzCcdct6Kr_gF.Hp843I6huV84
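The hand edit in step 6 can be avoided: jq can wrap the selected fields in the required top-level `Parts` object directly. A sketch, with inline JSON standing in for real `aws s3api list-parts` output:

```shell
# Wrap the PartNumber/ETag pairs in the top-level "Parts" object that
# complete-multipart-upload expects. The inline JSON is a stand-in for
# real `aws s3api list-parts` output.
echo '{"Parts":[{"PartNumber":1,"ETag":"25911cbc9c7a078f2174d6222f91bcdc","Size":52428800},{"PartNumber":2,"ETag":"955a8652bc3b8b2f33e2c4bd8344971d","Size":52428800}]}' \
  | jq '{Parts: [.Parts[] | {PartNumber, ETag}]}'
```

With the real upload id, `aws s3api list-parts ... | jq '{Parts: [.Parts[] | {PartNumber, ETag}]}' > fileparts` produces the file in one step.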

Get a Windows instance password and copy it to the clipboard

aws ec2 get-password-data --instance-id i-0b32d4bb2702a6e58 --region us-east-1 --priv-launch-key ~/.ssh/windows/us-east-1.pem | jq -r '.PasswordData' | pbcopy

Use xclip -selection clipboard instead of pbcopy if you're running this command on Linux.
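To make the one-liner portable, the clipboard command can be detected first (a sketch; the `cat` fallback is just for machines with neither tool):

```shell
# Pick whichever clipboard command this machine has:
# pbcopy on macOS, xclip on Linux.
if command -v pbcopy >/dev/null 2>&1; then
  CLIP="pbcopy"
elif command -v xclip >/dev/null 2>&1; then
  CLIP="xclip -selection clipboard"
else
  CLIP="cat"   # no clipboard tool found: fall back to printing
fi
echo "clipboard command: $CLIP"
```

Then pipe the password into `$CLIP` instead of hard-coding pbcopy.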

List S3 objects and filter the results using the --query parameter

aws s3api list-objects --bucket my-bucket --prefix my-folder --query "Contents[?contains(Key, 'pattern-to-find')]"
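The same contains() filter can be reproduced client-side with jq, which is handy for sanity-checking a JMESPath expression; a sketch with a made-up response inlined:

```shell
# jq equivalent of the JMESPath filter above; the inline JSON is a
# made-up stand-in for a real list-objects response.
echo '{"Contents":[{"Key":"my-folder/pattern-to-find-1.log"},{"Key":"my-folder/other.log"}]}' \
  | jq '[.Contents[] | select(.Key | contains("pattern-to-find"))]'
```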