
sns.py
#!/opt/boxen/homebrew/bin/python
import boto.sns
import json
REGION = 'us-west-2'
TOPIC = '<ARN>'
URL = '<Body of Message in this example I used a url>'
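The gist preview cuts the script off above. Here is a minimal sketch of how those constants might be wired up with boto 2; the `build_message` helper, the `'default'` JSON envelope, and `message_structure='json'` are my own additions, and the publish call needs real AWS credentials and a real topic ARN:

```python
import json

REGION = 'us-west-2'
TOPIC = '<ARN>'
URL = '<Body of Message in this example I used a url>'

def build_message(url):
    # With message_structure='json', SNS expects a JSON object whose
    # 'default' key is the fallback body for all subscription protocols.
    return json.dumps({'default': url})

def publish_url(region=REGION, topic=TOPIC, url=URL):
    # Requires boto 2 and AWS credentials, so the import is kept local.
    import boto.sns
    conn = boto.sns.connect_to_region(region)
    return conn.publish(topic=topic, message=build_message(url),
                        message_structure='json')
```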
sqs.py
#!/opt/boxen/homebrew/bin/python
import boto.sqs
from boto.sqs.message import RawMessage
import json
import time
import requests
REGION = 'us-west-2'
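This one is truncated too. A hedged sketch of the likely consumer side of the SNS/SQS pair: the queue name (not shown in the gist) and the `extract_url` helper are my assumptions; `RawMessage` keeps the SNS JSON envelope intact so it can be parsed by hand:

```python
import json
import time

REGION = 'us-west-2'
QUEUE = '<queue name>'  # assumed; the truncated gist does not show it

def extract_url(raw_body):
    # SNS notifications delivered to SQS arrive as a JSON envelope; the
    # body passed to sns.publish sits under the 'Message' key.
    return json.loads(raw_body)['Message']

def poll_queue(region=REGION, queue_name=QUEUE):
    # Requires boto 2, requests, and AWS credentials; imported locally.
    import boto.sqs
    from boto.sqs.message import RawMessage
    import requests
    conn = boto.sqs.connect_to_region(region)
    queue = conn.get_queue(queue_name)
    queue.set_message_class(RawMessage)  # keep the raw SNS JSON intact
    while True:
        for msg in queue.get_messages(wait_time_seconds=20):
            requests.get(extract_url(msg.get_body()))
            queue.delete_message(msg)
        time.sleep(1)
```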
hhvm_rpm.md
hubot-scripts.json
["redis-brain.coffee", "shipit.coffee", "deadline.coffee", "xkcd.coffee", "working-on.coffee", "zen.coffee", "hello.coffee", "jenkins.coffee"]
nepho_nosetests.md
from cement.core import handler, hook, foundation
from cement.utils import test
from nepho import cli
from nepho.cli.base import Nepho

class MyTestApp(Nepho):
    class Meta:
        # Load the base Nepho cement controller
        app = cli.base.Nepho()
gist:7137760

What's new in Havana

Networking Component - Neutron (Cisco) | January -> Zero to current knowledge

Cisco Nexus driver; Firewall as a Service … waa?? FWaaS demo

splunktalk_nagios_reports.md
index=nagios (nagiosevent="SERVICE NOTIFICATION") OR (nagiosevent="HOST NOTIFICATION") (user_id=$userid$) | lookup local=t nagios-hostgroupmembers host_name AS src_host | convert ctime(_time) as time | eval Name=coalesce(name,hostnotification) | transaction delim= src_host, nagiosevent | table time,eventcount,src_host,hostgroup,user_id,Name,reason,nagiosevent
splunktalk_trackamac.md

I love this search: it's got a subsearch from an input file, a lookup, an eval, and a field extraction. It's got it all!

index=dhcp eventtype="dhcpd_server" NOT DHCPEXPIRE [| inputlookup mac_tracking.csv | fields mac ] | rex field=_raw "DHCP(ACK on|REQUEST for) (?<clientip>(?<!\d)(\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3})(?!\d)) (to|for)" | rename clientip as host | lookup huis host | eval Link="http://map.harvard.edu/?bld=".bld_root | rename  huid as HUID, mac as "MAC Address", mac_status as "Laptop Status", match_string as "Location", ip as "IP Address", src_translated_ip as "External IP Address" | transaction _time,mac| table _time, HUID, "Location", "MAC Address", "IP Address", "External IP Address","Laptop Status", Link
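The `rex` extraction in that search can be mirrored in plain Python, which is handy for testing the pattern outside Splunk. Rewritten with Python's `(?P<...>` named-group syntax; the sample log line in the usage note is made up:

```python
import re

# Mirrors the rex in the SPL above: pull the client IP out of a dhcpd
# log line for DHCPACK / DHCPREQUEST events.
DHCP_RE = re.compile(
    r"DHCP(?:ACK on|REQUEST for) "
    r"(?P<clientip>(?<!\d)\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3}(?!\d)) "
    r"(?:to|for)"
)

def extract_client_ip(raw):
    m = DHCP_RE.search(raw)
    return m.group('clientip') if m else None
```

For example, `extract_client_ip("dhcpd: DHCPACK on 10.1.2.3 to 00:de:ad:be:ef:00 via eth0")` pulls out `10.1.2.3`, just as the `rex` does before the `rename clientip as host` step.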
splunktalk_sasl_logins.md

Search string used to identify a user who has logged in an excessive number of times, beyond the standard deviation for all users

index=os ( sourcetype=syslog OR sourcetype=postfix_syslog) sasl_method="LOGIN" | stats count(sasl_username) as usercount by sasl_username, _time | sort - usercount | eventstats avg(usercount) as avg_usercount stdev(usercount) as std_usercount |convert ctime(_time) | stats sum(usercount) as usercount by sasl_username, avg_usercount, std_usercount | where usercount>(900*avg_usercount + std_usercount)| rename avg_usercount as "Avg Count of Logins for all Users", std_usercount as "Standard Deviation of Logins for all Users", usercount as "Count of Logins"
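The threshold at the end of that search (`usercount > 900*avg_usercount + std_usercount`) is easier to sanity-check outside Splunk. A small Python mirror of the `where` clause, using the stdlib statistics module; the function name, the multiplier parameter, and the sample data are my own:

```python
from statistics import mean, stdev

def excessive_logins(counts, multiplier=900):
    # counts: {username: login count}. Flags users whose count exceeds
    # multiplier * average + one standard deviation, mirroring the SPL
    # `where usercount > (900*avg_usercount + std_usercount)` clause.
    avg = mean(counts.values())
    std = stdev(counts.values())
    return [user for user, c in counts.items() if c > multiplier * avg + std]
```

With `multiplier=1`, `excessive_logins({'alice': 1, 'bob': 1, 'eve': 10}, multiplier=1)` flags only `eve`, since 10 exceeds the mean (4) plus one standard deviation (about 5.2).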
splunktalk_predict.md

This is the search we use to try to predict future Splunk throughput

`index_usage_macro` | bucket _time span=1d | stats sum(kb) as kb by series, _time | timechart span=1d per_day(eval(kb/1024/1024)) as GB | predict upper95=High lower95=low future_timespan=180 algorithm=LLT GB as "Predicted"
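The bucketing and unit conversion that feed the `predict` command can be sketched in plain Python (the function and the `(day, kb)` pair format are my own); Splunk's `predict`/LLT forecasting step itself has no stdlib equivalent, so it is omitted:

```python
from collections import defaultdict

def daily_gb(events):
    # Mirrors `bucket _time span=1d | stats sum(kb) by _time` followed by
    # the eval(kb/1024/1024): events are (day, kb) pairs; returns GB per day.
    totals = defaultdict(float)
    for day, kb in events:
        totals[day] += kb
    return {day: kb / 1024 / 1024 for day, kb in totals.items()}
```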