@barmstrong
barmstrong / gist:1323865
Created October 28, 2011 23:50
processing large csv files in ruby
class ZendeskTicketsJob
  extend Resque::Plugins::ExponentialBackoff
  @queue = :low

  FIELDS = ['zendesk_id', 'requester_id', 'assignee_id', 'group', 'subject', 'tags', 'status', 'priority', 'via', 'ticket_type', 'created_at', 'assigned_at', 'solved_at', 'resolution_time', 'satisfaction', 'group_stations', 'assignee_stations', 'reopens', 'replies', 'first_reply_time_in_minutes', 'first_reply_time_in_minutes_within_business_hours', 'first_resolution_time_in_minutes', 'first_resolution_time_in_minutes_within_business_hours', 'full_resolution_time_in_minutes', 'full_resolution_time_in_minutes_within_business_hours', 'agent_wait_time_in_minutes', 'agent_wait_time_in_minutes_within_business_hours', 'requester_wait_time_in_minutes', 'requester_wait_time_in_minutes_within_business_hours', 'reservation_code', 'requires_manual_closing']

  def self.perform(url)
    # Download the exported ticket archive and unzip it to a working CSV file.
    `rm /tmp/zendesk_tickets*`
    `wget #{url} -O /tmp/zendesk_tickets.csv.zip`
    `unzip -p /tmp/zendesk_tickets.csv.zip > /tmp/zendesk_tickets.csv`
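The rest of perform isn't shown above; as a hedged sketch (the persistence step and model name are assumptions, not the gist's code), the downloaded file can be streamed row by row with Ruby's CSV library so a large export never has to fit in memory:

require 'csv'

# Stream one row at a time instead of reading the whole file into memory.
CSV.foreach('/tmp/zendesk_tickets.csv', headers: true) do |row|
  attributes = row.to_hash.slice(*FIELDS)
  # ZendeskTicket.create!(attributes)  # hypothetical model; the actual handling is not in the snippet
end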
# Another attempt using the built-in Rails.cache.fetch (better serialization/deserialization)
class ActiveSupport::Cache::DalliStore
  def fast_fetch(name, options = nil)
    options ||= {}
    name = expanded_key(name)
    dupe_name = name + '_dupe'
    if block_given?
      unless options[:force]
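The snippet is cut off here; as an illustrative call (the key, TTL option, and block are placeholders, and the truncated method may not honor every option), fast_fetch is intended as a drop-in replacement for Rails.cache.fetch:

# Hypothetical usage; the key and the expensive computation are placeholders.
summary = Rails.cache.fast_fetch('zendesk_tickets_summary', expires_in: 10.minutes) do
  expensive_summary_query
end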
vagrant@reddit:~$ sudo reddit-stop
vagrant@reddit:~$ sudo reddit-start
vagrant@reddit:~$ ps -ef | grep reddit
rabbitmq 1408 1400 1 02:11 ? 00:00:30 /usr/lib/erlang/erts-5.10.4/bin/beam.smp -W w -K true -A30 -P 1048576 -- -root /usr/lib/erlang -progname erl -- -home /var/lib/rabbitmq -- -pa /usr/lib/rabbitmq/lib/rabbitmq_server-3.2.4/sbin/../ebin -noshell -noinput -s rabbit boot -sname rabbit@reddit -boot start_sasl -kernel inet_default_connect_options [{nodelay,true}] -sasl errlog_type error -sasl sasl_error_logger false -rabbit error_logger {file,"/var/log/rabbitmq/rabbit@reddit.log"} -rabbit sasl_error_logger {file,"/var/log/rabbitmq/rabbit@reddit-sasl.log"} -rabbit enabled_plugins_file "/etc/rabbitmq/enabled_plugins" -rabbit plugins_dir "/usr/lib/rabbitmq/lib/rabbitmq_server-3.2.4/sbin/../plugins" -rabbit plugins_expand_dir "/var/lib/rabbitmq/mnesia/rabbit@reddit-plugins-expand" -os_mon start_cpu_sup false -os_mon start_disksup false -os_mon start_memsup false -mnesia dir "/var/lib/rabbitmq/mnesi
vagrant@reddit:~$ set -e
vagrant@reddit:~$
vagrant@reddit:~$ apt_get=/usr/bin/apt-get
vagrant@reddit:~$ ln=/bin/ln
vagrant@reddit:~$ wget=/usr/bin/wget
vagrant@reddit:~$ service=/usr/sbin/service
vagrant@reddit:~$ reddit_run=/usr/local/bin/reddit-run
vagrant@reddit:~$ init_ctl=/sbin/initctl
vagrant@reddit:~$
vagrant@reddit:~$ export REDDIT_USER=vagrant
$ vagrant up
Bringing machine 'default' up with 'virtualbox' provider...
==> default: Importing base box 'ubuntu/trusty64'...
==> default: Matching MAC address for NAT networking...
==> default: Checking if box 'ubuntu/trusty64' is up to date...
==> default: Setting the name of the VM: reddit-vagrant_default_1446339977960_21381
==> default: Clearing any previously set forwarded ports...
==> default: Clearing any previously set network interfaces...
==> default: Preparing network interfaces based on configuration...
default: Adapter 1: nat
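For reference, a minimal Vagrantfile sketch consistent with the log above; only the box name comes from the output, and the provisioning path is an assumption:

# Minimal Vagrantfile sketch; the reddit install script would be wired in via provisioning.
Vagrant.configure('2') do |config|
  config.vm.box = 'ubuntu/trusty64'
  # config.vm.provision 'shell', path: 'install-reddit.sh'   # path is an assumption
end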
<div class="coinbase-button" data-code="818bdfa54c1525d774440edbd267ee60"></div>
<script src="https://coinbase.com/assets/button.js" type="text/javascript"></script>
require 'sunspot'
require 'mongoid'
require 'sunspot/rails'

class Post
  include Mongoid::Document
  field :title
  include Sunspot::Mongoid

  searchable do
    text :title
  end
end
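A hedged usage sketch of the model above (the document title and query text are placeholders; Sunspot.search, fulltext, and Sunspot.commit are standard Sunspot API, and a running Solr instance is assumed):

# Index a document, commit to Solr, then run a full-text query.
Post.create!(title: 'Processing large CSV files')
Sunspot.commit

search = Sunspot.search(Post) do
  fulltext 'csv'
end
search.results.each { |post| puts post.title }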
@barmstrong
barmstrong / gist:1302842
Created October 21, 2011 01:03
Ruby script to connect to New Relic and turn on our flame lamp!
#!/usr/bin/env ruby
## as seen on http://nerds.airbnb.com/monitoring-your-serverswith-fire
THRESHOLD = 1000 # milliseconds
flame_on = false
def log ms, msg
puts "#{Time.now.to_s} \t #{ms}ms \t #{msg}"
end
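The rest of the script isn't shown; below is a minimal sketch of the polling loop such a script needs, assuming a hypothetical JSON metrics endpoint (the real gist talks to the New Relic API, whose calls are not reproduced here) and reusing THRESHOLD, flame_on, and log from above:

require 'net/http'
require 'json'
require 'uri'

# Hypothetical endpoint; the actual script queries New Relic for average response time.
METRICS_URL = URI('https://example.com/average_response_time.json')

loop do
  ms = JSON.parse(Net::HTTP.get(METRICS_URL))['average_response_time_ms'].to_f
  if ms > THRESHOLD && !flame_on
    log ms, 'over threshold -- flame on!'   # here the lamp would be switched on
    flame_on = true
  elsif ms <= THRESHOLD && flame_on
    log ms, 'back under threshold -- flame off'
    flame_on = false
  end
  sleep 60
end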
@barmstrong
barmstrong / 20110629202502_create_oauth2s.rb
Created June 30, 2011 01:38
sample code for setting up the Google Prediction API in a background process using OAuth2 and the google-api-ruby-client
class CreateOauth2s < ActiveRecord::Migration
  def self.up
    create_table :oauth2s do |t|
      t.string :api
      t.string :refresh_token
      t.string :access_token
      t.datetime :expires_at
      t.timestamps
    end
  end
end
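The gist goes on to use google-api-ruby-client with tokens stored in this table; as a hedged illustration (the model name just follows Rails conventions for the table, and the helper is an assumption, not the gist's code), a record can tell a background job whether its access token needs refreshing:

# Hypothetical model over the oauth2s table; expired? is an illustrative helper.
class Oauth2 < ActiveRecord::Base
  def expired?
    expires_at.nil? || expires_at <= Time.now
  end
end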
@barmstrong
barmstrong / gist:1030817
Created June 17, 2011 03:33
Ruby file to use the Google Prediction API, with a very hacked OAuth2
# Ruby file to use the Google Prediction API, with a very hacked OAuth2
# You'll want to replace all the custom variables, including:
# 1. Your Google Storage bucket name
# 2. Your Google Storage access credentials (note that gstore only works with "legacy" Google Storage access, so you'll need to enable this)
# 3. Your OAuth credentials, which you set up from here https://code.google.com/apis/console/ by selecting "API Access"
#    Note that I chose "Create client ID" and then "Installed Application".
#
# This script is intended to be run as a regular background process (like a cron job) to process data. It has no access to a browser and no web server to expose a callback URL, hence the hacking of OAuth2. This seems completely wrong to me, but I haven't gotten any other authentication with the API to work. If anyone knows a better way, please post a comment!
#
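One common way to make OAuth2 work from a cron job is to obtain a refresh token once interactively and then trade it for fresh access tokens non-interactively; below is a minimal sketch of that exchange against Google's standard token endpoint (client credentials are placeholders, and the gist's actual workaround may differ):

require 'net/http'
require 'json'
require 'uri'

# Exchange a stored refresh token for a new access token (no browser required).
response = Net::HTTP.post_form(
  URI('https://accounts.google.com/o/oauth2/token'),
  'client_id'     => 'YOUR_CLIENT_ID',        # placeholder
  'client_secret' => 'YOUR_CLIENT_SECRET',    # placeholder
  'refresh_token' => 'YOUR_REFRESH_TOKEN',    # placeholder, obtained once interactively
  'grant_type'    => 'refresh_token'
)
access_token = JSON.parse(response.body)['access_token']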