
@jibs
jibs / gcloud-port-forward.md
Created April 25, 2015 15:57
Port forwarding with a Google Cloud instance

Google Cloud's ssh command lets you pass standard ssh flags. For example, to forward local port 8088 to port 8088 on a VM instance, all you need to do is:

gcloud compute ssh --ssh-flag="-L 8088:localhost:8088" --zone "us-central1-b" "example_instance_name"

Now browsing to localhost:8088 works as it would with standard ssh.
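
Since the flag is plain ssh under the hood, the tunnel is also easy to script. A minimal Python sketch, assuming gcloud is on the PATH (the zone and instance name are the placeholders from above):

import subprocess

# Open the same tunnel from a script and hold it until interrupted.
tunnel = subprocess.Popen([
    "gcloud", "compute", "ssh",
    "--ssh-flag=-L 8088:localhost:8088",
    "--zone", "us-central1-b",
    "example_instance_name",
])
try:
    tunnel.wait()
except KeyboardInterrupt:
    tunnel.terminate()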

public class DisruptorTest2 {
    public static class PiJob {
        public double result;
        public int sliceNr;
        public int numIter;
        public int partionId;
        public void calculatePi() {
            double acc = 0.0;
            // Assumed completion: Leibniz series over this job's slice of terms.
            for (int i = sliceNr * numIter; i < (sliceNr + 1) * numIter; i++) {
                acc += 4.0 * (1 - (i % 2) * 2) / (2 * i + 1);
            }
            result = acc;
        }
    }
}

import com.twitter.finagle.http.path._
import com.twitter.finagle.http.service.RoutingService
import com.twitter.finagle.http.{Request, Response, RichHttp, Http}
import com.twitter.finagle.{Service, SimpleFilter}
import org.jboss.netty.handler.codec.http._
import org.jboss.netty.handler.codec.http.HttpResponseStatus._
import org.jboss.netty.handler.codec.http.HttpVersion.HTTP_1_1
import org.jboss.netty.buffer.ChannelBuffers.copiedBuffer
import org.jboss.netty.util.CharsetUtil.UTF_8
import com.twitter.util.Future
@jibs
jibs / remove_south_migration_history.md
Created March 29, 2013 12:45
Selectively delete South migration history in Django

To reset the migrations of just a single app, say polls, first delete its South history:

from south.models import MigrationHistory
MigrationHistory.objects.filter(app_name='polls').delete()

Make sure you also delete the migrations folder (at APPNAME/migrations), then recreate the initial migration:

./manage.py schemamigration --initial polls
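
The three steps can be wrapped into one script. A hedged sketch, assuming Django settings are already configured and the app lives at polls/ (the APP constant is a placeholder):

import os
import shutil

from django.core.management import call_command
from south.models import MigrationHistory

APP = 'polls'  # placeholder: the app being reset

# 1. Drop South's record of the app's applied migrations.
MigrationHistory.objects.filter(app_name=APP).delete()

# 2. Remove the migrations package on disk.
shutil.rmtree(os.path.join(APP, 'migrations'), ignore_errors=True)

# 3. Recreate the initial migration, as ./manage.py schemamigration --initial polls does.
call_command('schemamigration', APP, initial=True)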

@jibs
jibs / kafka_zookeeper.py
Last active December 12, 2015 05:28
Kafka zookeeper cleanup script
"""
Simple Python script to clean up zookeeper consumer offsets after an errant kafka consumer. Only tested on Kafka 0.7.
For details on Kazoo: http://kazoo.readthedocs.org/en/latest/index.html
MIT License.
"""
import sys
from kazoo.client import KazooClient
@jibs
jibs / backup.sh
Created January 23, 2013 09:41 — forked from karussell/backup.sh
# TO_FOLDER=/something
# FROM=/your-es-installation
DATE=`date +%Y-%m-%d_%H-%M`
TO=$TO_FOLDER/$DATE/
echo "rsync from $FROM to $TO"
# the first time rsync can take a bit long - do not disable flushing
rsync -a $FROM $TO
# now disable flushing and do one manual flush
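
The flush toggling those comments refer to is a couple of REST calls. A hedged sketch in Python with requests, assuming ES on localhost:9200 and the old 0.x-era translog setting:

import requests

ES = 'http://localhost:9200'

# Stop background translog flushes while the final rsync runs.
requests.put(ES + '/_settings', data='{"index": {"translog.disable_flush": true}}')

# One manual flush so the on-disk files are consistent first.
requests.post(ES + '/_flush')

# ... run the final rsync here, then re-enable flushing ...
requests.put(ES + '/_settings', data='{"index": {"translog.disable_flush": false}}')
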
@jibs
jibs / backup.sh
Created January 23, 2013 09:41 — forked from nherment/backup.sh
#!/bin/bash
# herein we back up our indexes! this script should run at like 6pm or something, after logstash
# rotates to a new ES index and there's no new data coming in to the old one. we grab metadata,
# compress the data files, create a restore script, and push it all up to S3.
TODAY=`date +"%Y.%m.%d"`
INDEXNAME="logstash-$TODAY" # this had better match the index name in ES
INDEXDIR="/usr/local/elasticsearch/data/logstash/nodes/0/indices/"
BACKUPCMD="/usr/local/backupTools/s3cmd --config=/usr/local/backupTools/s3cfg put"
BACKUPDIR="/mnt/es-backups/"
YEARMONTH=`date +"%Y-%m"`
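
The S3 push at the end can also be done with boto instead of s3cmd. A minimal sketch (bucket name and archive path are placeholders, and credentials are assumed to be in the environment):

import boto

BUCKET = 'my-es-backups'  # placeholder bucket
ARCHIVE = '/mnt/es-backups/logstash-2013.01.23.tar.gz'  # placeholder tarball

conn = boto.connect_s3()
bucket = conn.get_bucket(BUCKET)
key = bucket.new_key('elasticsearch/' + ARCHIVE.split('/')[-1])
key.set_contents_from_filename(ARCHIVE)
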
@jibs
jibs / elastic_search_copy.py
Created December 17, 2012 15:54
Copy all data in Elasticsearch to a local file using pyes
import pyes
import simplejson as json
SOURCE = ['SERVER:9200']
sconn = pyes.ES(SOURCE)
def scroll_gen(index):
q = '{"query":{"match_all":{}}, "size": 15000}'
s = sconn.search_raw(json.loads(q), scroll="5m", indices=index)
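
The preview cuts off before the generator yields anything. A sketch of the same scroll-and-dump idea against the raw REST API with requests (the endpoint shapes match old 0.x/0.90-era ES and are an assumption here, not taken from the gist):

import json
import requests

ES = 'http://SERVER:9200'  # same source server as above

def scroll_dump(index, out_path):
    # Start the scroll with a match_all query.
    r = requests.get('%s/%s/_search?scroll=5m' % (ES, index),
                     data=json.dumps({'query': {'match_all': {}}, 'size': 1000}))
    with open(out_path, 'w') as out:
        while True:
            body = r.json()
            hits = body['hits']['hits']
            if not hits:
                break
            for hit in hits:
                out.write(json.dumps(hit) + '\n')
            # Old ES accepted the bare scroll id as the request body.
            r = requests.get('%s/_search/scroll?scroll=5m' % ES, data=body['_scroll_id'])
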
@jibs
jibs / route53_migrate.py
Created September 10, 2012 23:13
Migrate a domain from GoDaddy to AWS Route 53 via boto (mostly)
import boto
from boto.route53.record import ResourceRecordSets
from collections import defaultdict
zone_name = 'yourdomain.com.'
zone_id = "ZONEID-HERE"
conn = boto.connect_route53()
zone = conn.get_hosted_zone(zone_id)
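
The preview stops before any records are written. A minimal sketch of pushing one record with boto's ResourceRecordSets, reusing conn, zone_id and zone_name from above (the record itself is a placeholder):

changes = ResourceRecordSets(conn, zone_id)

# Placeholder: an A record carried over from the old registrar.
change = changes.add_change('CREATE', 'www.' + zone_name, 'A', ttl=300)
change.add_value('192.0.2.10')

changes.commit()
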
@jibs
jibs / bash
Created July 10, 2012 09:35 — forked from dillera/bash
Installing Graphite on Ubuntu 11.10
####################################
# BASIC REQUIREMENTS
# http://graphite.wikidot.com/installation
# http://geek.michaelgrace.org/2011/09/how-to-install-graphite-on-ubuntu/
# Last tested & updated 10/13/2011
####################################
sudo apt-get update
sudo apt-get upgrade
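
Once carbon is up, it accepts plaintext metrics on TCP port 2003, which makes for a quick smoke test. A small Python sketch (the metric name is a placeholder):

import socket
import time

# Carbon's plaintext protocol: "metric value timestamp\n" on port 2003.
sock = socket.create_connection(('localhost', 2003))
sock.sendall(('test.install.heartbeat 1 %d\n' % int(time.time())).encode())
sock.close()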