
Tony Fendall TonyFNZ

@TonyFNZ
TonyFNZ / dnsupdate.sh
Created October 9, 2016 20:36
Script to update Route53 with the current public IP of an instance
#!/bin/bash
hosted_zone_id="<your Route53 hosted zone id>"
domain_name="<your domain name>"

# Abort script on any errors
set -e

# Get the new public IP address from the EC2 instance metadata service
ip_address=$(curl -s http://169.254.169.254/latest/meta-data/public-ipv4)
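The gist preview cuts off here. A hedged sketch of how the Route53 update likely continues: the change-batch JSON shape is the standard Route53 API format, but `build_change_batch` and the TTL of 300 are illustrative assumptions, not from the original gist.

```shell
# Build a Route53 UPSERT change batch for an A record (helper name is
# illustrative; TTL of 300 is an assumption, not from the original gist)
build_change_batch () {
  local name="$1" ip="$2"
  cat <<EOF
{"Changes":[{"Action":"UPSERT","ResourceRecordSet":{"Name":"$name","Type":"A","TTL":300,"ResourceRecords":[{"Value":"$ip"}]}}]}
EOF
}

# Apply it with the AWS CLI (requires credentials with Route53 permissions):
# aws route53 change-resource-record-sets \
#   --hosted-zone-id "$hosted_zone_id" \
#   --change-batch "$(build_change_batch "$domain_name" "$ip_address")"
```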
@TonyFNZ
TonyFNZ / aws-all-regions.sh
Created August 4, 2019 23:19
The following code can be added to your `.bashrc` file to make it easy to run an AWS CLI command across all regions in your account. This is useful when searching for stray resources that may need to be cleaned up.
################################################################################################
# Add the following code to your .bashrc file to add the all-regions command to your terminal
# This code will run the specified command once for each AWS region you have access to
################################################################################################
all-regions () {
  for region in $(aws ec2 describe-regions --query "Regions[].RegionName" --output text)
  do
    echo "Region: $region..."
    eval "$@ --region $region"
  done
}
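The function above needs AWS credentials to run. The same eval-per-item pattern can be illustrated offline with a stubbed region list (the function name and the two hard-coded regions here are invented for the demo):

```shell
# Offline demo of the same pattern: run the given command once per item in a
# fixed list (a stand-in for the real region list, so no AWS access is needed)
for-each-region () {
  for region in us-east-1 ap-southeast-2
  do
    echo "Region: $region..."
    eval "$@ --region $region"
  done
}
```

For example, `for-each-region echo aws s3 ls` prints the command line that would be run for each region instead of executing it.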
@TonyFNZ
TonyFNZ / s3-to-s3-cross-accounts.py
Created October 18, 2018 01:05
This script downloads a file from an S3 bucket in one AWS account and streams it directly into an S3 bucket in another account. The file is streamed as it arrives, so memory usage stays low (typically <100 MB)
#!/usr/bin/env python
# This script will download a file from an S3 bucket in one AWS account
# and stream it directly into an S3 bucket in another account.
# The file is streamed as it arrives, so memory usage is low (typically <100MB)
import boto3
# Create two S3 clients, one in each AWS account
# If wanting to use an instance role for one client, remove the access key parameters
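The preview above is truncated. A minimal sketch of the streaming technique the description names, with the client objects passed in as parameters so the copy logic itself is credential-agnostic (the function, bucket, and key names are illustrative, not from the original gist):

```python
def stream_copy(source_client, dest_client, src_bucket, dest_bucket, key):
    """Stream one object between buckets without buffering it fully in memory."""
    # get_object returns a streaming, file-like body; upload_fileobj reads it
    # in chunks (multipart upload under the hood), keeping memory usage low.
    body = source_client.get_object(Bucket=src_bucket, Key=key)["Body"]
    dest_client.upload_fileobj(body, dest_bucket, key)

# With real boto3 clients, one per account (credentials are placeholders):
# import boto3
# source = boto3.client("s3", aws_access_key_id="...", aws_secret_access_key="...")
# dest = boto3.client("s3")  # e.g. instance-role credentials in the other account
# stream_copy(source, dest, "source-bucket", "dest-bucket", "big-file.bin")
```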
@TonyFNZ
TonyFNZ / url-to-s3.py
Created October 17, 2018 22:41
This script downloads a file from a URL and streams it directly into AWS S3 without persisting the file to local disk. The file is streamed as it arrives, so memory usage stays low (typically <100 MB)
#!/usr/bin/env python
# This script will download a file from a URL and stream it directly
# into AWS S3 without persisting the file to local disk.
# The file is streamed as it arrives, so memory usage is low (typically <100MB)
import boto3, urllib2
source_url = 'http://<Your URL Here>'
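The gist preview above is Python 2 (`urllib2`) and truncated. A hedged Python 3 sketch of the same streaming idea (the function name and placeholder arguments are illustrative, not from the original gist):

```python
import urllib.request

def url_to_s3(s3_client, url, bucket, key):
    # urlopen returns a file-like response; upload_fileobj streams it to S3
    # in chunks, so the file never lands on disk or fully in memory.
    with urllib.request.urlopen(url) as response:
        s3_client.upload_fileobj(response, bucket, key)

# Usage with boto3 (bucket name and URL are placeholders):
# import boto3
# url_to_s3(boto3.client("s3"), "http://<Your URL Here>", "my-bucket", "my-key")
```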
#!/bin/bash
tempdir="blog-export/"
editdomain="editor.example.com"
publishdomain="blog.example.com"
# fail on any errors
set -e
# clean up previous run
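The fragment ends mid-script. A hedged guess at the cleanup step the last comment announces; recreating the export directory with `mkdir -p` is an assumption, not from the original (the `tempdir` value is copied from the fragment above so the block is self-contained):

```shell
# Remove any output from a previous run and start with an empty export directory
# (mkdir -p is an assumption; the original may recreate the directory later)
tempdir="blog-export/"
rm -rf "$tempdir"
mkdir -p "$tempdir"
```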