Zuhaib Siddique (zsiddique)

  • Helix
  • San Francisco, CA

Keybase proof

I hereby claim:

  • I am zsiddique on github.
  • I am zuhaib (https://keybase.io/zuhaib) on keybase.
  • I have a public key whose fingerprint is 3EEF 8B15 2AAB F954 C26D 1678 A9F5 BA1D 4DD9 3696

To claim this, I am signing this object:

#!/usr/bin/env python
# -*- coding: utf-8 -*-
"""pycache -- cache a python package from PyPI on S3.
A simple script to collect a cache of packages locally and sync them up to an S3 bucket, using directories as namespaces so that different projects can have different dependencies.
This is just about the simplest thing that could possibly work.
"""
import warnings
warnings.filterwarnings('ignore')
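The rest of the script is not shown in this gist. A minimal sketch of the sync step the docstring describes, assuming boto3 and hypothetical bucket and prefix names:

import os
import boto3  # assumption: the original script's S3 client is not shown

def sync_cache_to_s3(cache_dir, bucket, prefix):
    """Upload every file under cache_dir to s3://bucket/prefix/...,
    keeping the directory layout so each project keeps its own namespace."""
    s3 = boto3.client('s3')
    for root, _dirs, files in os.walk(cache_dir):
        for name in files:
            path = os.path.join(root, name)
            key = '/'.join([prefix, os.path.relpath(path, cache_dir)])
            s3.upload_file(path, bucket, key)

# e.g. sync_cache_to_s3('cache/myproject', 'my-pip-cache', 'myproject')  # hypothetical names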
filter {
  grok {
    type => "syslog"
    pattern => [ "<%{POSINT:syslog_pri}>%{TIMESTAMP_ISO8601:syslog_timestamp} (?:%{SYSLOGFACILITY} )?%{SYSLOGHOST:syslog_hostname} %{SYSLOGPROG:syslog_program}: %{GREEDYDATA:syslog_message}" ]
    add_field => [ "received_at", "%{@timestamp}",
                   "received_from", "%{@source_host}" ]
  }
  syslog_pri { }
}
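The grok pattern above dissects a raw syslog line into named fields. A rough Python equivalent, with simplified stand-ins for the SYSLOGHOST and SYSLOGPROG grok patterns (the sample line is hypothetical):

import re

SYSLOG_RE = re.compile(
    r"<(?P<syslog_pri>\d+)>"                                                     # %{POSINT:syslog_pri}
    r"(?P<syslog_timestamp>\d{4}-\d{2}-\d{2}T[\d:.]+(?:Z|[+-]\d{2}:?\d{2})?) "   # %{TIMESTAMP_ISO8601}
    r"(?P<syslog_hostname>\S+) "                                                 # %{SYSLOGHOST}
    r"(?P<syslog_program>[\w./-]+)(?:\[(?P<pid>\d+)\])?: "                       # %{SYSLOGPROG}
    r"(?P<syslog_message>.*)"                                                    # %{GREEDYDATA}
)

m = SYSLOG_RE.match('<13>2012-12-14T01:33:01+00:00 web-01 myapp[1234]: job finished')
if m:
    print(m.groupdict())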
{
  "template" : "logstash-*",
  "settings" : {
    "number_of_shards" : 1,
    "number_of_replicas" : 0,
    "index" : {
      "query" : { "default_field" : "@message" },
      "store" : { "compress" : { "stored" : true, "tv" : true } }
    }
  }
}
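The template only takes effect once it is registered with Elasticsearch. A sketch of doing that from Python against a local node, assuming the requests library and the legacy _template endpoint these gists date from:

import json
import requests  # assumption: any HTTP client would do

with open('logstash-template.json') as f:  # hypothetical file holding the JSON above
    template = json.load(f)

resp = requests.put('http://localhost:9200/_template/logstash', json=template)
resp.raise_for_status()
print(resp.json())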
require 'json'
require 'time'
require 'tire'
# File to read
file = 'test.txt'
# Field in which time value can be found
timefield = '@timestamp'
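The rest of the Ruby script is not in the gist; the requires and settings suggest it reads records from test.txt and re-indexes them by @timestamp into per-day logstash-YYYY.MM.DD indices. A Python sketch of that bucketing step, every name an assumption:

import json
from datetime import datetime
from collections import defaultdict

def bucket_by_day(path, timefield='@timestamp'):
    # Group JSON lines by the day of their timestamp, following the
    # logstash-YYYY.MM.DD index naming convention.
    buckets = defaultdict(list)
    with open(path) as f:
        for line in f:
            doc = json.loads(line)
            day = datetime.strptime(doc[timefield][:19], '%Y-%m-%dT%H:%M:%S')
            buckets['logstash-' + day.strftime('%Y.%m.%d')].append(doc)
    return buckets

# for index, docs in bucket_by_day('test.txt').items(): bulk-index docs into index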
2012-12-14 01:33:01,218 DEBUG - QueuingThread - incomingJobStream yielding normal job 90b7c00c-e542-4436-a4c0-7024f2121214
2012-12-14 01:33:01,219 DEBUG - Thread-2 - getJson 90b7c00c-e542-4436-a4c0-7024f2121214
2012-12-14 01:33:01,225 INFO - Thread-2 - starting job: 90b7c00c-e542-4436-a4c0-7024f2121214
2012-12-14 01:33:01,242 INFO - QueuingThread - there is nothing to do. Sleeping for 7 seconds
2012-12-14 01:33:01,278 INFO - Thread-2 - finishing successful job: 90b7c00c-e542-4436-a4c0-7024f2121214
2012-12-14 01:33:01,297 DEBUG - Thread-2 - saved processed- 90b7c00c-e542-4436-a4c0-7024f2121214
2012-12-14 01:33:05,243 DEBUG - MainThread - updating processor registration
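The log shows the shape of the worker: a queuing thread feeds jobs to worker threads and sleeps for a few seconds when there is nothing to do. A minimal Python sketch of that loop; fetch_jobs and the job handling are hypothetical, since the program itself is not in the gist:

import time
import queue

jobs = queue.Queue()

def queuing_thread(fetch_jobs, idle_seconds=7):
    # Matches the "there is nothing to do. Sleeping for 7 seconds" lines above.
    while True:
        pending = fetch_jobs()
        if not pending:
            time.sleep(idle_seconds)
            continue
        for job_id in pending:
            jobs.put(job_id)

def worker():
    while True:
        job_id = jobs.get()   # getJson, run the job, save the processed result
        jobs.task_done()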
zsiddique / sensu_failing_2
Created November 29, 2012 00:47
[2012-11-29T00:46:48+00:00] INFO: Processing sensu_base_config[sensu] action create (sensu::default line 73)
================================================================================
Error executing action `create` on resource 'sensu_base_config[sensu]'
================================================================================
Chef::Exceptions::ValidationFailed
----------------------------------
Option content must be a kind of Hash! You passed [["redis", {"host"=>"localhost", "port"=>6379}], ["rabbitmq", {"host"=>"localhost", "vhost"=>"/sensu", "password"=>"password", "user"=>"sensu", "port"=>5672}], ["dashboard", {"password"=>"secret", "user"=>"admin", "port"=>8080}], ["api", {"host"=>"localhost", "port"=>4567}]].
zsiddique / sensu_fail
Created November 28, 2012 20:46
sensu failing
[2012-11-28T21:45:32+01:00] WARN: Setting Sensu RabbitMQ port to 5672 as you have disabled SSL.
================================================================================
Recipe Compile Error in /srv/chef/file_store/cookbooks/sensu-monitor/recipes/master.rb
================================================================================
SystemStackError
----------------
stack level too deep
# Account identity and credentials (placeholder values).
export AWS_ENV='NAME'
export AWS_ACCESS_KEY_ID='AWSKEY'
export AWS_SECRET_ACCESS_KEY='AWS_SECRET_KEY'

# EC2 API tools: X.509 key pair and install location.
export EC2_HOME=~/.ec2
export PATH=$PATH:$EC2_HOME/bin
export EC2_PRIVATE_KEY=`ls $EC2_HOME/NAME/pk-*.pem`
export EC2_CERT=`ls $EC2_HOME/NAME/cert-*.pem`
export JAVA_HOME=/usr

# Auto Scaling command-line tools.
export AWS_AUTO_SCALING_HOME=/Users/zuhaib/AutoScaling-1.0.39.0
export PATH=$PATH:$AWS_AUTO_SCALING_HOME/bin
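boto3 reads AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY straight from the environment, so the same exports can drive the account from Python with no extra configuration. A sketch, assuming boto3 rather than the era-appropriate Java command-line tools:

import boto3

session = boto3.session.Session(region_name='us-east-1')  # region is an assumption
ec2 = session.client('ec2')
for reservation in ec2.describe_instances()['Reservations']:
    for instance in reservation['Instances']:
        print(instance['InstanceId'], instance['State']['Name'])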
zsiddique / recipe.rb
Created August 19, 2011 03:48 — forked from peplin/recipe.rb
S3 File Resource for Chef
# Source accepts the protocol s3:// with the host as the bucket
# access_key_id and secret_access_key are just that
s3_file "/var/bulk/the_file.tar.gz" do
  source "s3://your.bucket/the_file.tar.gz"
  access_key_id your_key
  secret_access_key your_secret
  owner "root"
  group "root"
  mode 0644
end