This is a simple chat-like program using the pub-sub pattern, backed by PostgreSQL's LISTEN/NOTIFY command.

Publish messages to the #foo channel as user "nickname":

$ python pub.py foo nickname
PUBLISH to channel #foo
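The publish side can be sketched in a few lines. The original gist is Python; this sketch uses Ruby's pg gem instead (which another snippet in this collection already requires), and assumes a local PostgreSQL instance with default connection settings:

```ruby
#!/usr/bin/env ruby
# Minimal sketch of the publish side, assuming a local PostgreSQL instance.
# Usage: ruby pub.rb <channel> <nickname>
require 'pg'

channel, nickname = ARGV
conn = PG.connect(dbname: 'postgres')

# NOTIFY delivers the payload to every connection that ran LISTEN <channel>.
$stdin.each_line do |line|
  payload = "#{nickname}: #{line.chomp}"
  conn.exec_params('SELECT pg_notify($1, $2)', [channel, payload])
end
```

The subscriber side would run `LISTEN <channel>` on its own connection and block in `conn.wait_for_notify`.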
$ cat .config/systemd/user/swayidle.service
[Unit]
Description=Idle manager for Wayland
Documentation=man:swayidle(1)
PartOf=graphical-session.target

[Service]
Type=simple
ExecStart=/usr/bin/swayidle -w \
    timeout 900 'swaymsg "output * dpms off"' \
#!/usr/bin/env ruby
require 'pg'

=begin
when creating a postgres db replica in RDS, or restoring from a snapshot, the
underlying EBS volume of the new instance must be initialized by reading every
block, otherwise the blocks will be lazy-initialized by production queries,
which will be extremely latent. (i've seen normally 50ms queries take 30s in
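One common way to force that initialization is to read every relation end-to-end so each underlying EBS block gets touched once, up front. A hedged sketch of what such a warm-up script might look like (it leans on the pg_prewarm contrib extension, and assumes connection settings come from the usual PG* environment variables):

```ruby
#!/usr/bin/env ruby
# Sketch: force EBS block initialization by reading every table and index.
# Assumes PG* environment variables provide the connection settings.
require 'pg'

conn = PG.connect

# pg_prewarm (a contrib extension) reads a relation's blocks into cache,
# which as a side effect touches every underlying EBS block.
conn.exec('CREATE EXTENSION IF NOT EXISTS pg_prewarm')

rels = conn.exec(<<~SQL)
  SELECT oid::regclass AS rel
  FROM pg_class
  WHERE relkind IN ('r', 'i') AND relpages > 0
SQL

rels.each do |row|
  conn.exec_params('SELECT pg_prewarm($1)', [row['rel']])
  puts "warmed #{row['rel']}"
end
```

If pg_prewarm is unavailable (some managed environments), a full sequential scan per table (`SELECT count(*) FROM ...`) achieves much the same thing for heap blocks.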
#!/usr/bin/env bash
# run this script from your home folder
# sudo bash
curl -O http://downloads.mongodb.org/linux/mongodb-linux-x86_64-2.6.12.tgz
tar -zxvf mongodb-linux-x86_64-2.6.12.tgz
cp mongodb-linux-x86_64-2.6.12/bin/* /usr/local/bin
groupadd mongodb
useradd --system --no-create-home -g mongodb mongodb
# most people include something like this. don't.
# check your default nginx.conf, it's already covered in a much better way.
#gzip_disable "MSIE [1-6]\.(?!.*SV1)";

# compress proxied requests too.
# it doesn't actually matter if the request is proxied, we still want it compressed.
gzip_proxied any;

# a pretty comprehensive list of content mime types that we want to compress.
# there's a lot of repetition here because different applications might use different
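The list the truncated comment introduces typically looks something like the following; treat it as a representative sample rather than the gist's exact list (note that `text/html` is always compressed and must not be listed, or nginx warns about the duplicate):

```nginx
gzip_types
    text/plain
    text/css
    text/javascript
    application/javascript
    application/x-javascript
    application/json
    application/xml
    application/rss+xml
    image/svg+xml;
```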
# ActiveRecord's find_each and find_in_batches ported to Sequel.
# Sequel's paged_each is not practical when converting large data due to its use of transactions and OFFSET.
#
# Usage:
#
#   SequelEachInBatches.find_each(dataset, keys) { |record| ... }
#
# It can also monkey-patch Sequel::Dataset:
#
#   Sequel::Dataset.send(:include, SequelEachInBatches)
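The core idea behind such a port is keyset pagination: fetch one batch ordered by the key columns, then resume from the last key seen instead of paging with OFFSET (which re-scans skipped rows) or holding a long transaction. The logic can be sketched in plain Ruby against a hypothetical in-memory dataset, with no Sequel dependency:

```ruby
# Keyset-pagination sketch: iterate a dataset in fixed-size batches,
# resuming each batch from the last key seen (no OFFSET, no long transaction).
def find_each(rows, key, batch_size: 2)
  last_key = nil
  loop do
    # With Sequel this filter/order/limit would run as one SQL query per batch:
    #   dataset.where(Sequel[key] > last_key).order(key).limit(batch_size)
    batch = rows.select { |r| last_key.nil? || r[key] > last_key }
                .sort_by { |r| r[key] }
                .first(batch_size)
    break if batch.empty?
    batch.each { |r| yield r }
    last_key = batch.last[key]
  end
end

rows = [{ id: 3 }, { id: 1 }, { id: 2 }, { id: 5 }]
seen = []
find_each(rows, :id) { |r| seen << r[:id] }
p seen  #=> [1, 2, 3, 5]
```

Because each batch is an independent indexed query, the iteration stays fast no matter how deep into the table it gets, which is exactly where OFFSET-based paging degrades.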
while true; do date; sleep 5; done |
class Integer
  N_BYTES = [42].pack('i').size    # size of a native int: 4 bytes on most platforms
  N_BITS = N_BYTES * 16            # assumes a 64-bit machine word (4 * 16 = 64)
  MAX = 2 ** (N_BITS - 2) - 1      # Fixnum loses one bit to the sign, one to tagging
  MIN = -MAX - 1
end

p Integer::MAX        #=> 4611686018427387903
p Integer::MAX.class  #=> Fixnum (pre-2.4 Ruby; Fixnum and Bignum are now unified into Integer)
p (Integer::MAX + 1).class  #=> Bignum
require 'sinatra'
require 'json'
require 'csv'

# Serve data as JSON
get '/hi/:name' do
  name = params[:name]
  content_type :json
  { :message => name }.to_json
end
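The snippet requires 'csv' but the excerpt never uses it, so presumably a sibling route serves the same data as CSV. What such a response body might look like can be sketched with the stdlib alone (the route and helper are guesses, not the gist's code):

```ruby
require 'csv'

# Hypothetical: the body a route like `get '/hi/:name.csv'` might return,
# mirroring the JSON route's { :message => name } payload.
def csv_greeting(name)
  CSV.generate do |csv|
    csv << %w[message]  # header row
    csv << [name]
  end
end

body = csv_greeting('world')
puts body
#=> message
#   world
```

Inside Sinatra the route would also set `content_type :csv` before returning the string.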
require 'benchmark'

class DSL1
  def initialize
    @i = 1
  end

  def self.register(&block)
    @block = block
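The excerpt cuts off before the measurement itself. For reference, the stdlib Benchmark API this snippet builds toward looks like the following (the labels are placeholders, not the gist's actual cases):

```ruby
require 'benchmark'

# Benchmark.realtime returns wall-clock seconds for the block;
# Benchmark.bm prints a small comparison table for several cases.
elapsed = Benchmark.realtime { 100_000.times { [1, 2, 3].sum } }
puts format('took %.4fs', elapsed)

Benchmark.bm(15) do |x|
  x.report('yield:')         { 100_000.times { } }
  x.report('instance_eval:') { 100_000.times { } }
end
```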