@jlazic
jlazic / Get-HAProxyStats.ps1
Last active May 16, 2024 08:51
Monitor HAProxy with PRTG
# Monitoring HAProxy via CSV stats
# For detailed instructions visit http://lazic.info/josip/post/monitor-haproxy-via-prtg/
# Josip Lazic
param(
    [string]$url,
    [string]$monitor
)
$templates = @{
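The preview cuts off inside the $templates hashtable, which presumably maps monitor names to the CSV fields the script reports back to PRTG. For reference, the CSV endpoint the script's -url parameter points at can be inspected from a shell; this is a hedged bash sketch, and the URL, credentials, and column positions are placeholders that depend on your haproxy.cfg and HAProxy version (check the "# pxname" header row for the actual layout):

# fetch HAProxy's stats page in CSV form; requires "stats uri" and
# "stats auth" to be configured in haproxy.cfg (values here are examples)
curl -s -u admin:password "http://haproxy.example.com:8080/haproxy?stats;csv" |
  grep -v '^#' |     # drop the "# pxname,svname,..." header row
  cut -d, -f1,2,18   # proxy name, server name, status (status is usually column 18)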
@mikesmullin
mikesmullin / watch.sh
Last active April 26, 2023 05:20
watch is a Linux bash script that monitors file modifications recursively and executes bash commands when changes occur
#!/usr/bin/env bash
# script: watch
# author: Mike Smullin <mike@smullindesign.com>
# license: GPLv3
# description:
# watches the given path for changes
# and executes a given command when changes occur
# usage:
# watch <path> <cmd...>
#
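The gist's own implementation isn't shown beyond this header. As a hedged sketch of the core idea, here is a minimal loop that does the same job using inotifywait from inotify-tools (the real script may poll or use a different mechanism):

#!/usr/bin/env bash
# block until something under the watched path changes, then run the
# remaining arguments as a command, and repeat forever
path="$1"; shift
while inotifywait -r -e modify,create,delete,move "$path" >/dev/null 2>&1; do
  "$@"
done

Usage mirrors the header above, e.g. ./watch src/ make test.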
@joshrobb
joshrobb / iis-parse.config
Last active January 8, 2017 12:36
logstash config for parsing IIS maximal logging
#tested using http://grokdebug.herokuapp.com/
input {
  tcp {
    type => "iis"
    port => 3333
  }
}
filter {
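The preview cuts off at the opening filter { block, which is where a grok pattern for the IIS fields would normally go. As a quick way to exercise the tcp input above once logstash is running this config, you can push a sample line into port 3333 from bash (host, port, and the log line below are illustrative):

# send one fake IIS log line to the listening tcp input using bash's
# built-in /dev/tcp redirection (no netcat needed)
echo '2017-01-08 12:36:00 W3SVC1 GET /default.htm - 80 - 10.0.0.5 Mozilla/5.0 200 0 0 15' \
  > /dev/tcp/localhost/3333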
@jordansissel
jordansissel / RESULTS.md
Created September 21, 2012 07:41
screenshot + code showing how to query logstash/elasticsearch with a graphite function.

logstash queries graphed with graphite.

Operation: Decouple whisper from graphite.

Method: Create a graphite function that runs a date histogram facet query against elasticsearch for a given query string, over the time period viewed in the current graph.

Reason: graphite has some awesome math functions. Wouldn't it be cool if we could use those on logstash results?

The screenshot below uses logstash to watch the twitter stream for the keywords "iphone", "apple" and "samsung" - then I graph each of them, so we get an idea of popularity. As a bonus, I also do a movingAverage() on the iphone curve to show you why this is awesome.
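The underlying elasticsearch query such a graphite function would issue can be reproduced from the shell. This is a hedged sketch against the facets API of that era (hostname, index pattern, and the @timestamp field name are assumptions; later Elasticsearch versions replaced facets with aggregations):

# count matching events per minute for the query string "iphone"
curl -s 'http://localhost:9200/logstash-*/_search?pretty' -d '{
  "size": 0,
  "query":  { "query_string": { "query": "iphone" } },
  "facets": {
    "per_minute": {
      "date_histogram": { "field": "@timestamp", "interval": "minute" }
    }
  }
}'

The per-minute counts in the facet response are what would be handed back to graphite as a data series.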

@nherment
nherment / backup.sh
Created February 29, 2012 10:42
Backup and restore an Elasticsearch index (shamelessly copied from http://tech.superhappykittymeow.com/?p=296)
#!/bin/bash
# herein we back up our indexes! this script should run at like 6pm or something, after logstash
# rotates to a new ES index and there's no new data coming in to the old one. we grab metadata,
# compress the data files, create a restore script, and push it all up to S3.
TODAY=`date +"%Y.%m.%d"`
INDEXNAME="logstash-$TODAY" # this had better match the index name in ES
INDEXDIR="/usr/local/elasticsearch/data/logstash/nodes/0/indices/"
BACKUPCMD="/usr/local/backupTools/s3cmd --config=/usr/local/backupTools/s3cfg put"
BACKUPDIR="/mnt/es-backups/"
YEARMONTH=`date +"%Y-%m"`
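The preview ends after the variable definitions. As a hedged sketch of the steps the header comments describe (compress the data files and push them to S3), not the gist's actual continuation, the rest might look roughly like this; the bucket name is a placeholder, and the metadata grab and restore-script generation are omitted:

mkdir -p "$BACKUPDIR"
# compress the rotated index's on-disk data files
tar czf "$BACKUPDIR/$INDEXNAME.tar.gz" -C "$INDEXDIR" "$INDEXNAME"
# push the archive to S3, grouped by year-month (bucket name is hypothetical)
$BACKUPCMD "$BACKUPDIR/$INDEXNAME.tar.gz" "s3://my-es-backups/$YEARMONTH/"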