def kv_diff(A, B, whitelist=()):
    '''
    Function can be used on elasticsearch mappings to identify variations.
    Returns a diff as a dict of the keys whose values differ.
    Eg:
    template = requests.get(es_url + '/_template/mytemplate', auth=(es_user, es_pass)).json()['projects']['mappings']
    mapping = requests.get(es_url + '/foo/_mapping', auth=(es_user, es_pass)).json()['foo']['mappings']
    kv_diff(template, mapping)
    '''
    # recursively collect differing keys, skipping whitelisted ones (a minimal completion of the truncated gist)
    diff = {}
    for key in set(A) | set(B):
        if key in whitelist:
            continue
        a, b = A.get(key), B.get(key)
        if isinstance(a, dict) and isinstance(b, dict):
            sub = kv_diff(a, b, whitelist)
            if sub:
                diff[key] = sub
        elif a != b:
            diff[key] = (a, b)
    return diff
Keybase proof

I hereby claim:

  • I am charlesmims on github.
  • I am charlesmims (https://keybase.io/charlesmims) on keybase.
  • I have a public key whose fingerprint is 525B E363 BD9A A014 1E84 0CF2 864B 6A7A 5698 7D7F

To claim this, I am signing this object:

$ curl -XPUT 10.0.3.160:9200/_snapshot/backup -d '{"type":"s3","settings":{"secret_key":"<super secret key>","access_key":"<access key>","max_restore_bytes_per_sec":"5120mb","bucket":"<my-backup-bucket>","max_snapshot_bytes_per_sec":"5120mb"}}'
{"error":{"root_cause":[{"type":"process_cluster_event_timeout_exception","reason":"failed to process cluster event (put_repository [backup]) within 30s"}],"type":"process_cluster_event_timeout_exception","reason":"failed to process cluster event (put_repository [backup]) within 30s"},"status":503}
from the logs:
[2016-02-11 20:07:15,683][INFO ][rest.suppressed ] /_snapshot/backup Params: {repository=backup}
ProcessClusterEventTimeoutException[failed to process cluster event (put_repository [backup]) within 30s]
at org.elasticsearch.cluster.service.InternalClusterService$2$1.run(InternalClusterService.java:343)
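A hedged workaround (not from the original gist): the put-repository call accepts a master_timeout query parameter to wait longer than the default 30s, and verify=false skips the repository verification step that can stall on a busy master:
$ curl -XPUT '10.0.3.160:9200/_snapshot/backup?master_timeout=90s&verify=false' -d '{"type":"s3","settings":{ ... same settings as above ... }}'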
#!/usr/bin/env python
# Make this file executable and save it in /usr/local/bin/.
# Pipe gpg ascii-armored text to it to strip it into a simple one-line string;
# pipe that string back in to get the original formatting back, which you can
# pipe into gpg.
import fileinput

message = ''.join(fileinput.input())
# the gist is truncated here; a minimal completion ('|' as the newline stand-in is an assumption)
if '\n' in message.strip():
    print(message.strip().replace('\n', '|'))   # armored -> one-line string
else:
    print(message.strip().replace('|', '\n'))   # one-line string -> armored
#!/usr/bin/env python
"""
Bytes-to-human / human-to-bytes converter.
Based on: http://goo.gl/kTQMs
Works with Python 2.x and 3.x.
Author: Giampaolo Rodola' <g.rodola [AT] gmail [DOT] com>
License: MIT
"""
Add the following to your ~/.bashrc and your bash history will be saved to a coherent, time-stamped, persistent history file.
shopt -s histappend                 # append to the history file instead of overwriting it
export HISTCONTROL=ignoreboth       # skip duplicate and space-prefixed commands
export HISTFILESIZE=1000000
export HISTSIZE=1000000
export HISTTIMEFORMAT='%F %T '      # timestamp every entry
export PROMPT_COMMAND='history -a'  # append to the history file after each command
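With HISTTIMEFORMAT set, history prints each entry with its timestamp, e.g. (illustrative output):

$ history 1
 1012  2017-03-27 14:02:11 ls -la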
#!/usr/bin/env python
# @charlesmims 2015
# Maintains a number of CoreOS hosts running in EC2. Can be used to scale up by
# increasing NUM_NODES, but in its present state will not scale down cleanly.
import boto.ec2
import time
from os import environ
from collections import defaultdict
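The script body is truncated in this listing; a minimal sketch of the reconcile loop the description implies (the region, tag, AMI ID, and NUM_NODES value are my assumptions):

NUM_NODES = 8           # assumption: desired fleet size
AMI = 'ami-xxxxxxxx'    # assumption: a CoreOS AMI for your region

conn = boto.ec2.connect_to_region(environ.get('AWS_REGION', 'us-east-1'))
while True:
    # count running instances tagged as members of the fleet
    running = conn.get_only_instances(
        filters={'tag:role': 'coreos', 'instance-state-name': 'running'})
    deficit = NUM_NODES - len(running)
    if deficit > 0:
        # scale up only; the gist notes scale-down is not handled cleanly
        reservation = conn.run_instances(AMI, min_count=deficit, max_count=deficit)
        for instance in reservation.instances:
            instance.add_tag('role', 'coreos')
    time.sleep(60)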
Elastic.co's past-releases section of their website is unusable.
I wrote this script to parse all the pages of that section and spit out all the links
so you can just search for what you want.
Isn't it ironic not being able to search for something on elasticsearch's website?
#!/usr/bin/env python
import requests
import re
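The script body is truncated here; a minimal sketch of the approach described above (the past-releases URL and its pagination scheme are my assumptions):

BASE = 'https://www.elastic.co/downloads/past-releases'  # assumption
page = 1
while True:
    html = requests.get(BASE, params={'page': page}).text
    links = re.findall(r'href="(/downloads/past-releases/[^"]+)"', html)
    if not links:
        break  # no release links on this page: assume we've paged past the end
    for link in links:
        print(link)
    page += 1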
charlesmims / mesos_cadvisor_prometheus_grafana
Created December 6, 2016 22:27
monitoring docker containers in mesos with prometheus and cadvisor and grafana
# This marathon.json deploys cadvisor to each agent node and exposes it on port 3002.
# Set "instances" equal to the number of agent nodes you have.
cadvisor/marathon.json
{
  "id": "cadvisor",
  "cpus": 0.1,
  "mem": 100,
  "disk": 0,
  "instances": 8,
  "constraints": [["hostname", "UNIQUE"]],
#!/bin/bash
# 2017-03-27 charles@mims.io
if [ $# -lt 1 ]
then
cat << HELP
dockertags -- list all tags for a Docker image
usage: dockertags <image>   (e.g. dockertags library/ubuntu)
HELP
exit 1
fi
# truncated in the gist; a minimal completion using the Docker Hub v2 API (an assumption)
curl -s "https://registry.hub.docker.com/v2/repositories/$1/tags/?page_size=100" | grep -o '"name":"[^"]*"' | cut -d'"' -f4