@mrflip
mrflip / quickie_json_to_tsv
Created March 11, 2011 01:22
Here's a brittle but mostly-working one-liner to turn a search result into an Excel-loadable CSV file
# run this in Terminal
curl "http://api.infochimps.com/social/network/tw/search/people_search?q=qualcomm&apikey=YOURAPIKEY" | ruby -rubygems -e 'require "json" ; $_ = $stdin.read.gsub(%r{^george_api_explorer\((.*)\)$}, "\\1") ; rows = JSON.load($_); ks= rows["results"].first.keys ; ks.sort! ; rows["results"].each{|row| puts row.values_at(*ks).map{|el| el.gsub(%r{[",]+},"") }.join(",") } ' > FILE_TO_SAVE_IN.csv
@jt
jt / phone_number_regex.rb
Created June 5, 2011 19:49
Phone number regular expression pattern
def match(number)
  # optional "+<digit>" country code, then 3-3-4 digits separated by "-", ".", or whitespace
  # (the original character class [-|\.|\s] also accepted a literal "|" as a separator)
  /\+?(\d)?[-.\s]?\(?(\d{3})\)?[-.\s]?(\d{3})[-.\s]?(\d{4})/.match number
end
# test match and captures against possibilities
[
  ['3335557777',   nil, '333', '555', '7777'],
  ['333-555-7777', nil, '333', '555', '7777'],
  ['333.555.7777', nil, '333', '555', '7777'],
  ['333 555 7777', nil, '333', '555', '7777'],
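A tiny runner for cases shaped like the rows above could look like this (a sketch; the case list is truncated here, so only a couple of the visible rows are reused):
cases = [
  ['3335557777',   nil, '333', '555', '7777'],
  ['333-555-7777', nil, '333', '555', '7777'],
]

cases.each do |number, *expected|
  m = match(number)
  status = (m && m.captures == expected) ? 'PASS' : 'FAIL'
  puts "#{status}  #{number.inspect} => #{m ? m.captures.inspect : 'no match'}"
end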
@pauldix
pauldix / gist:2830675
Created May 29, 2012 21:00
ideas for fetching
# set the thread pool size and the timeout in seconds.
fetcher = Feedzirra::Fetcher.new({:thread_pool_size => 100, :timeout => 5})
# some feed data objects. also include feed_entry data objects
feeds = [Feed.new({:entries => [], :etag => "..", :last_modified => "...", :url => "...", :feed_url => "...", :title => ""})]
# async style
fetcher.get(feeds, :on_success => lambda {|feed, updated_feed| ...}, :on_failure => lambda {|feed, failure_object| ...})
# that returns before finishing fetching the feeds. just adds them to a thread-safe queue to be processed by a worker pool.
# the failure condition could actually call fetcher.get on the failed feed again if you wanted to retry.
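A sketch of that retry, reusing the same hypothetical callback API (the once-only counter and the Feed#feed_url accessor are illustrative, and the counter is not thread-safe):
retries    = Hash.new(0)                      # feed_url => attempts so far (sketch only)
on_success = lambda { |feed, updated_feed| }  # merge updated_feed into feed here
on_failure = lambda do |feed, failure_object|
  if retries[feed.feed_url] < 1
    retries[feed.feed_url] += 1
    fetcher.get([feed], :on_success => on_success, :on_failure => on_failure)  # re-queue once
  end
end

fetcher.get(feeds, :on_success => on_success, :on_failure => on_failure)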
16:19 t432: What is the best way to approach the following problem.... A user logs in, application connects to database, retrieves latest 100 records and display the content. if user scrolls to bottom of page, retrieve next 100 records and display continue till end. In addition if a new record is created whilst logged in automatically display record on top... Might look familiar you use twitter
16:19 mattgordon has joined (~mattgordo@208.66.31.98)
16:20 will_: Great :)
16:20 Psi-Jack: I'm also using the primary_conninfo line in recovery.conf.
16:20 Psi-Jack: And trigger_File.
16:20 Psi-Jack: But, it's simply not streaming. :)
16:20 Psi-Jack: I'
16:20 luckyruby: t432, what web framework are u using?
16:21 orf_ has left IRC (Quit: Leaving)
16:21 bwlang_ has joined (~anonymous@70-91-134-14-ma-ne.hfc.comcastbusiness.net)
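As an aside, the pattern t432 describes above (load the newest 100, fetch older pages on scroll, prepend anything new) is usually handled with keyset pagination; a rough ActiveRecord-style sketch, where the Record model and the params keys are hypothetical:
PAGE_SIZE = 100

# first request: the newest PAGE_SIZE records
records = Record.order(id: :desc).limit(PAGE_SIZE)

# client scrolled to the bottom: it sends the smallest id it has rendered
older = Record.where("id < ?", params[:before_id]).order(id: :desc).limit(PAGE_SIZE)

# client polls for new records: it sends the largest id it has rendered
newer = Record.where("id > ?", params[:after_id]).order(id: :desc)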
@bsingr
bsingr / actor.rb
Created August 26, 2012 16:55
Different methods to spawn Celluloid Actors.
# instantiate a cat object named 'Garfield' within its own actor thread
cat = Cat.new 'Garfield'
# blocking (the main thread waits until the cat has finished..)
cat.spray
# non-blocking
cat.spray!
# cats do what cats do
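For context, the calls above assume a Cat actor class along these lines (a sketch; the #spray body is illustrative, and the bang-style async call matches Celluloid releases of that era, where newer versions use cat.async.spray instead):
require 'celluloid'

class Cat
  include Celluloid          # each Cat.new spawns the object inside its own actor thread

  def initialize(name)
    @name = name
  end

  def spray
    sleep 2                  # simulate slow work happening inside the actor
    puts "#{@name} sprays!"
  end
end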
@roryokane
roryokane / .rbenv-version
Created September 4, 2012 12:31
editing Wikipedia ISBN calculation code
1.9.3-p194
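The gist's Ruby files aren't reproduced here; for reference, the ISBN-13 check digit that Wikipedia-style ISBN code computes works like this (a sketch, not the gist's code, written against the 1.9.3 pinned above):
# ISBN-13 check digit: weight the first twelve digits 1,3,1,3,... and round up to a multiple of 10
def isbn13_check_digit(first_twelve)
  digits = first_twelve.to_s.delete("-").chars.map(&:to_i)
  sum = digits.each_with_index.inject(0) { |acc, (d, i)| acc + d * (i.even? ? 1 : 3) }
  (10 - sum % 10) % 10
end

isbn13_check_digit("978-0-306-40615")  # => 7, giving the full ISBN 978-0-306-40615-7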

Changing the Public API

When I started looking for ways to help on the "github.com/alecthomas/gozmq" package a few months ago, a recurring topic in discussions was that an earlier decision to provide a hidden struct type through a public interface type had become restrictive. We couldn't just drop the interface and make the struct public because that would be backwards-incompatible. Or could we?

No

This had to be the first option considered, and it was the standing decision at the time. If we could keep providing everything the package aims to provide without breaking backwards compatibility, then by all means we should not break it.

Unfortunately this meant we couldn't provide all that we aimed to provide. Or could we?

@karlseguin
karlseguin / dnscache.go
Created June 12, 2013 07:43
Cache DNS responses and refresh them from a single goroutine on a one-minute timer. This avoids a spike of cgo resolver threads being launched under load.
package dnscache

import (
    "math/rand"
    "net"
    "sync"
    "time"
)

var (
@creationix
creationix / run.js
Last active March 7, 2017 18:36
universal callback/continuable/thunk generator runner
function run(generator) {
  // Pass in resume for no-wrap function calls
  var iterator = generator(resume);
  var data = null, yielded = false;
  next();
  check();

  function next(item) {
    var cont = iterator.next(item).value;
@jstorimer
jstorimer / port_scanner.rb
Created August 30, 2012 03:40
Simple, parallel port scanner in Ruby built with connect_nonblock and IO.select.
require 'socket'
# Set up the parameters.
PORT_RANGE = 1..512
HOST = 'archive.org'
TIME_TO_WAIT = 5 # seconds
# Create a socket for each port and initiate the nonblocking
# connect.
sockets = PORT_RANGE.map do |port|
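  # --- Hedged sketch from here on: the gist is cut off above, so the rest of the
  # --- connect_nonblock / IO.select scan is reconstructed, not copied from it.
  socket      = Socket.new(:INET, :STREAM)
  remote_addr = Socket.sockaddr_in(port, HOST)

  begin
    socket.connect_nonblock(remote_addr)   # kick off the connect without blocking
  rescue Errno::EINPROGRESS
    # expected: the connect is now in flight
  end

  socket
end

# Remember which port each socket was aimed at (sockets and PORT_RANGE line up).
ports = Hash[sockets.zip(PORT_RANGE.to_a)]

expiration = Time.now + TIME_TO_WAIT

loop do
  # Wait, up to the deadline, for any socket to become writable.
  _, writable, _ = IO.select(nil, sockets, nil, [expiration - Time.now, 0].max)
  break unless writable

  writable.each do |socket|
    begin
      # Re-issuing the connect reveals whether the first attempt succeeded.
      socket.connect_nonblock(Socket.sockaddr_in(ports[socket], HOST))
    rescue Errno::EISCONN
      puts "#{HOST}:#{ports[socket]} accepts connections"
      sockets.delete(socket)
    rescue Errno::EINVAL, Errno::ECONNREFUSED, Errno::ETIMEDOUT
      sockets.delete(socket)               # closed or filtered port
    end
  end

  break if sockets.empty? || Time.now >= expiration
end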