


Anurag Mathur anuragmathur1

  • Sydney, AU
@anuragmathur1
anuragmathur1 / increase_volume.sh
Created October 23, 2020 00:44
Increase an EBS volume's size and reboot the instance. You can also do it without rebooting, with a few additional steps.
pip install --user --upgrade boto3
export instance_id=$(curl -s http://169.254.169.254/latest/meta-data/instance-id)
python -c "import boto3
import os
from botocore.exceptions import ClientError
ec2 = boto3.client('ec2')
volume_info = ec2.describe_volumes(
    Filters=[
        {
            'Name': 'attachment.instance-id',
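The snippet above is cut off inside the describe_volumes call. For reference, here is a minimal standalone boto3 sketch of the same flow, not the gist's exact code; the target size NEW_SIZE_GB is an illustrative assumption, and instance_id is expected to be exported as above.
#!/usr/bin/env python
# Hedged sketch: grow every EBS volume attached to this instance with
# modify_volume, then reboot so the OS picks up the new size.
import os
import boto3

NEW_SIZE_GB = 100  # assumption: desired size in GiB, must be larger than the current size

ec2 = boto3.client('ec2')
instance_id = os.environ['instance_id']  # set by the export line above

volumes = ec2.describe_volumes(
    Filters=[{'Name': 'attachment.instance-id', 'Values': [instance_id]}]
)['Volumes']

for vol in volumes:
    ec2.modify_volume(VolumeId=vol['VolumeId'], Size=NEW_SIZE_GB)

# Rebooting is the simple way to pick up the new size; skipping the reboot
# means growing the partition and filesystem on the instance instead.
ec2.reboot_instances(InstanceIds=[instance_id])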
@anuragmathur1
anuragmathur1 / check-volume-encryption.sh
Created February 5, 2020 23:19
Check whether the EBS volumes attached to instances with a given tag are encrypted
for volume in `aws ec2 describe-instances --filters Name=tag:tag-key,Values=tag-value --query 'Reservations[*].Instances[*].BlockDeviceMappings[*].Ebs.VolumeId' --output=text`
do
aws ec2 describe-volumes --volume-ids $volume --query 'Volumes[*].Encrypted' --output=text
done
## This is useful when you want your local Docker CLI to point to the minikube VM.
## After running the commands below, your docker commands will talk to the minikube VM:
## `docker images` will no longer report the images on your local Mac workstation but the images on the minikube VM.
## You can also just run the eval command in the last line to set all of this at once.
export DOCKER_TLS_VERIFY="1"
export DOCKER_HOST="tcp://192.168.99.106:2376"
export DOCKER_CERT_PATH="/Users/anurag.mathur/.minikube/certs"
# Run this command to configure your shell:
# eval $(minikube docker-env)
@anuragmathur1
anuragmathur1 / pull_all_repos.sh
Created October 4, 2019 03:57
Clone all GitLab repos from a group
GITLAB_GROUP="<group-name>"
GITLAB_PRIVATE_TOKEN="<token>"
GITLAB_ADDRESS="gitlab.<name>"
for i in `curl -s "https://$GITLAB_ADDRESS/api/v4/groups/$GITLAB_GROUP/projects?private_token=$GITLAB_PRIVATE_TOKEN&per_page=999" | \
grep -o "\"ssh_url_to_repo\":[^ ,]\+" | \
awk -F ':' '{gsub(/"/, "", $2); gsub(/"/, "", $3); print $2":"$3}'`; do git clone $i; done
@anuragmathur1
anuragmathur1 / delete-snapshots.py
Created October 1, 2019 05:24
Delete EBS snapshots that are not attached to any AMI and are at least 90 days old
#!/usr/bin/python
## The script deletes snapshots that are:
## - not attached to any AMI
## - at least 90 days old
## Output:
## - some CSV files, if a report is required
## - the return values (return_code and request_id from the delete command)
## Usage
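The preview stops at the usage notes. A minimal sketch of the described logic (delete snapshots not referenced by any of the account's AMIs and older than 90 days), assuming standard boto3 calls and leaving out the optional CSV report and pagination:
#!/usr/bin/env python
# Hedged sketch of the described behaviour, not the gist's exact code.
import datetime
import boto3

ec2 = boto3.client('ec2')
account = boto3.client('sts').get_caller_identity()['Account']

# Every snapshot ID referenced by an AMI owned by this account.
ami_snapshots = set()
for image in ec2.describe_images(Owners=[account])['Images']:
    for mapping in image.get('BlockDeviceMappings', []):
        if 'Ebs' in mapping and 'SnapshotId' in mapping['Ebs']:
            ami_snapshots.add(mapping['Ebs']['SnapshotId'])

cutoff = datetime.datetime.now(datetime.timezone.utc) - datetime.timedelta(days=90)

for snap in ec2.describe_snapshots(OwnerIds=[account])['Snapshots']:
    if snap['SnapshotId'] not in ami_snapshots and snap['StartTime'] < cutoff:
        resp = ec2.delete_snapshot(SnapshotId=snap['SnapshotId'])
        meta = resp['ResponseMetadata']
        print(snap['SnapshotId'], meta['HTTPStatusCode'], meta['RequestId'])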
@anuragmathur1
anuragmathur1 / install-aws-cdk.sh
Created July 17, 2019 23:48
Install the AWS CDK on Ubuntu 18.04
sudo apt-get install python3-pip -y
sudo update-alternatives --install /usr/bin/python python /usr/bin/python3.6 1
sudo update-alternatives --config python
sudo update-alternatives --install /usr/bin/pip pip /usr/bin/pip3 1
sudo update-alternatives --config pip
pip install aws-cdk.cdk
@anuragmathur1
anuragmathur1 / delete_all_s3_Bucket_versions.py
Created May 7, 2019 05:02
You cannot use the CLI to delete all versions of the objects in a bucket, which also prevents you from removing the bucket itself. The four lines below delete all versions of all objects in the given S3 bucket.
#!/usr/bin/env python
import boto3
s3 = boto3.resource('s3')
bucket = s3.Bucket('<<Bucket-name-here>>')
bucket.object_versions.all().delete()
@anuragmathur1
anuragmathur1 / run_remote_jenkins_job.rb
Created April 13, 2018 05:34
Example: Ruby script to invoke a Jenkins job remotely
require 'jenkins_api_client'
SERVER_IP = 'jenkins.dev.example.com'
$build = JenkinsApi::Client.new(:server_ip => SERVER_IP, :username => 'admin', :password => 'admin')
$opts = {'build_start_timeout' => 60, 'cancel_on_build_start_timeout' => true}
$build.job.build("test-project",
                 {:parameter1 => "value-1",
                  :parameter2 => "value-2",
                  :parameter3 => "value-3",
                  :parameter4 => "value-4"},
                 $opts)
@anuragmathur1
anuragmathur1 / cleanup_ami_by_date.py
Last active December 21, 2017 03:58
Delete AMIs and associated snapshots created on a given date
#!/usr/bin/python
import boto3
client = boto3.client('ec2')
account_number = boto3.client('sts').get_caller_identity()['Account']
response = client.describe_images(Owners=[account_number])
for image in response['Images']:
    if "2017-08-21" in image['CreationDate']:
        image_id = image['ImageId']
        snapshot_id = image['BlockDeviceMappings'][0]['Ebs']['SnapshotId']
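        # The preview ends here; a hedged completion of the cleanup step,
        # not the gist's exact code, keeping its assumption of a single
        # EBS block-device mapping per image:
        client.deregister_image(ImageId=image_id)
        client.delete_snapshot(SnapshotId=snapshot_id)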
@anuragmathur1
anuragmathur1 / find_all_ec2.sh
Created September 20, 2017 06:16
Find all instances and list their Private IP, instance-id and Name
# Find all instances and list their Private IP, instance-id and Name
# You may add more filters, e.g. restrict to a VPC:
# --filters Name=vpc-id,Values=vpc-xxxxxxxx
aws ec2 describe-instances --query 'Reservations[].Instances[].[PrivateIpAddress,InstanceId,Tags[?Key==`Name`].Value[]]' --output text --profile=elmodev | sed '$!N;s/\n/ /'