xxxVxxx / ec2-metadata
Created March 4, 2022 02:36 — forked from yaronf/ec2-metadata
ec2-metadata: a simple tool for exploring Amazon EC2 metadata, extended with network and VPC info
#!/bin/bash
#
#########################################################################
#This software code is made available "AS IS" without warranties of any #
#kind. You may copy, display, modify and redistribute the software #
#code either by itself or as incorporated into your code; provided that #
#you do not remove any proprietary notices. Your use of this software #
#code is at your own risk and you waive any claim against Amazon #
#Digital Services, Inc. or its affiliates with respect to your use of #
#this software code. (c) 2006-2007 Amazon Digital Services, Inc. or its #
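The preview above cuts off inside the license header, before any of the tool's logic. As a rough, hypothetical sketch of what such a tool does under the hood, it queries the EC2 instance metadata service (IMDS) at 169.254.169.254; the paths shown here are standard IMDS paths, not necessarily the exact ones the gist uses:

# Hedged sketch, not the gist's actual code: query the instance metadata service
METADATA_URL="http://169.254.169.254/latest/meta-data"
curl -s "$METADATA_URL/instance-id"; echo          # the instance ID
curl -s "$METADATA_URL/local-ipv4"; echo           # the instance's private IP
# Network/VPC details hang off the MAC address of each interface
for mac in $(curl -s "$METADATA_URL/network/interfaces/macs/"); do
  curl -s "$METADATA_URL/network/interfaces/macs/${mac}vpc-id"; echo
done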
xxxVxxx / aws_delete-default-vpc.sh
Created February 14, 2022 03:08 — forked from jokeru/aws_delete-default-vpc.sh
Script to delete all AWS default VPCs from all regions using AWS CLI
#!/usr/bin/env bash
if [ "$AWS_PROFILE" = "" ]; then
  echo "No AWS_PROFILE set"
  exit 1
fi
for region in $(aws ec2 describe-regions --region eu-west-1 | jq -r .Regions[].RegionName); do
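  # The listing truncates here. A hedged sketch (not the gist's exact code) of how
  # the loop body might continue: find the default VPC, strip its IGW and subnets, delete it.
  vpc=$(aws ec2 describe-vpcs --region "$region" --filters Name=isDefault,Values=true --query 'Vpcs[0].VpcId' --output text)
  [ "$vpc" = "None" ] && continue
  igw=$(aws ec2 describe-internet-gateways --region "$region" --filters Name=attachment.vpc-id,Values="$vpc" --query 'InternetGateways[0].InternetGatewayId' --output text)
  if [ "$igw" != "None" ]; then
    aws ec2 detach-internet-gateway --region "$region" --internet-gateway-id "$igw" --vpc-id "$vpc"
    aws ec2 delete-internet-gateway --region "$region" --internet-gateway-id "$igw"
  fi
  for subnet in $(aws ec2 describe-subnets --region "$region" --filters Name=vpc-id,Values="$vpc" --query 'Subnets[].SubnetId' --output text); do
    aws ec2 delete-subnet --region "$region" --subnet-id "$subnet"
  done
  aws ec2 delete-vpc --region "$region" --vpc-id "$vpc"
done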
xxxVxxx / gist:1bb9657f33ef4fa403f393c7eb30250c
Last active August 20, 2019 05:59
snippets of useful information from Accelerate - Gene Kim, Jez Humble, Nicole Forsgren
In our search for measures of delivery performance that meet these criteria, we settled on four:
delivery lead time, deployment frequency, time to restore service, and change fail rate.
In order to analyze delivery performance across the cohort we surveyed, we used a technique called cluster analysis. Cluster analysis is a foundational technique in statistical data analysis that attempts to group responses so that responses in the same group are more similar to each other than to responses in other groups. Each measurement is put on a separate dimension, and the clustering algorithm attempts to minimize the distance between all cluster members and maximize differences between clusters. This technique has no understanding of the semantics of responses—in other words, it doesn’t know what counts as a “good” or “bad” response for any of the measures.
Verifying my Blockstack ID is secured with the address 19k5tPgzwijsZHNtzh8FvNp7epZZCSDcz9 https://explorer.blockstack.org/address/19k5tPgzwijsZHNtzh8FvNp7epZZCSDcz9
# Install dependencies
#
# * checkinstall: package the .deb
# * libpcre3, libpcre3-dev: required for HTTP rewrite module
# * zlib1g zlib1g-dbg zlib1g-dev: required for HTTP gzip module
apt-get install checkinstall libpcre3 libpcre3-dev zlib1g zlib1g-dbg zlib1g-dev && \
mkdir -p ~/sources/
# Compile against OpenSSL to enable NPN
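The preview cuts off at the OpenSSL comment. A hedged sketch of how that step is typically done when compiling nginx from source against a local OpenSSL tree (the version numbers are placeholders, not necessarily what the gist uses):

cd ~/sources/
# Fetch and unpack the sources (versions are placeholders)
wget http://nginx.org/download/nginx-1.4.1.tar.gz
wget http://www.openssl.org/source/openssl-1.0.1e.tar.gz
tar -xzf nginx-1.4.1.tar.gz && tar -xzf openssl-1.0.1e.tar.gz
cd nginx-1.4.1
# Build nginx against the unpacked OpenSSL so SSL (and NPN) support is compiled in
./configure --with-http_ssl_module --with-openssl=../openssl-1.0.1e
make
sudo checkinstall   # packages the build as a .deb instead of a plain 'make install'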
xxxVxxx / fix_github_https_repo.sh
Created June 17, 2016 08:06 — forked from m14t/fix_github_https_repo.sh
Convert HTTPS github clones to use SSH
#!/bin/bash
#-- Script to automate https://help.github.com/articles/why-is-git-always-asking-for-my-password
REPO_URL=`git remote -v | grep -m1 '^origin' | sed -Ene's#.*(https://[^[:space:]]*).*#\1#p'`
if [ -z "$REPO_URL" ]; then
  echo "-- ERROR: Could not identify repo URL."
  echo "   It is possible this repo is already using SSH instead of HTTPS."
  exit 1
fi
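The preview ends after the error check. A hedged sketch of the remaining step, rewriting the HTTPS remote into its SSH form and pointing origin at it (the sed expression is illustrative, not the gist's exact one):

# e.g. https://github.com/user/repo.git -> git@github.com:user/repo.git
NEW_URL=$(echo "$REPO_URL" | sed -E 's#^https://github\.com/#git@github.com:#')
echo "-- Changing origin to: $NEW_URL"
git remote set-url origin "$NEW_URL"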
# The latest release of veewee requires Ruby > 2.1 and RubyGems > 1.9
sudo apt-add-repository ppa:brightbox/ruby-ng
sudo apt-get update
sudo apt-get install ruby2.3 ruby2.3-dev
# Let's set up veewee now:
git clone https://github.com/jedi4ever/veewee.git
cd veewee
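The snippet stops after the clone. Veewee is normally driven through Bundler from inside the checkout; a hedged sketch of the usual next steps (the box and template names are placeholders):

sudo gem install bundler
bundle install
# List the available templates, then define and build a box from one of them
bundle exec veewee vbox templates
bundle exec veewee vbox define 'mybox' 'ubuntu-14.04-server-amd64'   # placeholder box/template names
bundle exec veewee vbox build 'mybox'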
xxxVxxx / bootstrap_emr.tf
Created May 3, 2016 15:29 — forked from tsailiming/bootstrap_emr.tf
A terraform script to bootstrap EMR.
// EMR is not supported by terraform yet
// https://github.com/hashicorp/terraform/issues/2098
// This script will bootstrap the necessary VPC and related configs first.
provider "aws" {
  # access_key = "ACCESS_KEY_HERE"
  # secret_key = "SECRET_KEY_HERE"
  region = "ap-southeast-1"
}
# get old
s3cmd get s3://elasticmapreduce/samples/spark/0.8.1/spark-0.8.1-emr.tgz
tar -xvf spark-0.8.1-emr.tgz
# get newer
curl "http://d3kbcqa49mib13.cloudfront.net/spark-1.0.0-bin-hadoop1.tgz" -o "spark-1.0.0-bin-hadoop1.tgz" # Grab the latest version of the prebuilt Spark jars. We grabbed Spark 1.0.0 with Hadoop 1, spark-1.0.0-bin-hadoop1.tgz, from the Spark Downloads page
tar -xvf spark-1.0.0-bin-hadoop1.tgz
# old metrics properties files
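The preview stops at the metrics comment. A hypothetical continuation, assuming the intent is to carry the EMR-tuned configuration over to the newer Spark build (the directory and file names are assumptions):

# Hypothetical: reuse the old EMR build's metrics configuration with the newer Spark
cp spark-0.8.1-emr/conf/metrics.properties spark-1.0.0-bin-hadoop1/conf/metrics.properties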
#!/usr/bin/python
import sys
try:
    import pexpect
except ImportError:
    sys.stderr.write("\nPexpect is not installed. You can do so with 'pip install pexpect' :)\n\n")
    sys.exit(1)