
@liaoyw
liaoyw / robust_run.py
Created November 19, 2014 05:05
retry a function up to `retry` times
import functools

def robust(expected, retry=3):
    # retry the wrapped function up to `retry` times on `expected` exceptions
    def wrapper(func):
        @functools.wraps(func)
        def robust_run(*args, **kw):
            n = 0
            while n < retry:
                try:
                    return func(*args, **kw)
                except expected as e:
                    last_exc = e
                    n += 1
            raise last_exc
        return robust_run
    return wrapper
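The preview above is cut off mid-loop; one plausible way to complete and use the decorator (a self-contained sketch that restates a compact version of `robust` so the snippet runs on its own — the `flaky` function and its behavior are illustrative, not part of the gist):

```python
import functools

def robust(expected, retry=3):
    # retry the wrapped function up to `retry` times on `expected` exceptions
    def wrapper(func):
        @functools.wraps(func)
        def robust_run(*args, **kw):
            last_exc = None
            for _ in range(retry):
                try:
                    return func(*args, **kw)
                except expected as e:
                    last_exc = e
            raise last_exc
        return robust_run
    return wrapper

calls = []

@robust(ValueError, retry=3)
def flaky():
    # fails twice, then succeeds on the third attempt
    calls.append(1)
    if len(calls) < 3:
        raise ValueError("not yet")
    return "ok"

print(flaky())  # → ok
```

Exceptions other than `expected` propagate immediately, and the last caught exception is re-raised once the retry budget is exhausted.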
import os

"""
as bash command: du -sb dir
+4096: add 4096 bytes for the top dir itself
"""
def du_sb(dir):
    # count every file *and* directory entry under dir, like `du -sb`
    return sum(os.path.getsize(os.path.join(root, name))
               for root, dirs, files in os.walk(dir)
               for name in dirs + files) + 4096

def get_total_file_size(dir):
    # regular files only: directory entries are skipped by the isfile check
    return sum(os.path.getsize(os.path.join(root, name))
               for root, dirs, files in os.walk(dir)
               for name in files
               if os.path.isfile(os.path.join(root, name)))
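A quick self-contained check of the file-size walk (using a `os.walk`-based variant of `get_total_file_size` and a throwaway temp directory with files of known size):

```python
import os
import tempfile

def get_total_file_size(dir):
    # walk the tree and sum the sizes of regular files only
    return sum(os.path.getsize(os.path.join(root, name))
               for root, dirs, files in os.walk(dir)
               for name in files
               if os.path.isfile(os.path.join(root, name)))

# Build a tiny tree: 100 bytes at the top level, 50 bytes in a subdir.
with tempfile.TemporaryDirectory() as d:
    os.makedirs(os.path.join(d, "sub"))
    with open(os.path.join(d, "a.bin"), "wb") as f:
        f.write(b"\0" * 100)
    with open(os.path.join(d, "sub", "b.bin"), "wb") as f:
        f.write(b"\0" * 50)
    print(get_total_file_size(d))  # → 150
```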

tmux shortcuts & cheatsheet

start new:

tmux

start new with session name:

tmux new -s myname
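The cheatsheet preview is cut off here; a few more everyday session commands in the same vein (standard tmux CLI):

```shell
tmux a -t myname            # attach to a named session
tmux ls                     # list sessions
tmux kill-session -t myname # kill a named session
```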
# copy from http://www.tldp.org/LDP/abs/html/sample-bashrc.html
function extract() # Handy Extract Program
{
    if [ -f "$1" ] ; then
        case "$1" in
            *.tar.bz2) tar xvjf "$1" ;;
            *.tar.gz)  tar xvzf "$1" ;;
            *.bz2)     bunzip2 "$1" ;;
            *.rar)     unrar x "$1" ;;
            *.gz)      gunzip "$1" ;;
            *)         echo "'$1' cannot be extracted via extract()" ;;
        esac
    else
        echo "'$1' is not a valid file"
    fi
}
# encoding: utf-8
import zlib
"""
copy from:
https://github.com/afirel/consistent_hashr/blob/master/lib/consistent_hashr.rb
http://www.tom-e-white.com/2007/11/consistent-hashing.html
"""
class ConsistentHash(object):
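The class body is cut off in this preview; a minimal self-contained sketch of crc32-based consistent hashing in the spirit of the two linked references (the names `replicas`, `add_node`, and `get_node` are assumptions for illustration, not necessarily the gist's actual API):

```python
import bisect
import zlib

class ConsistentHash(object):
    def __init__(self, nodes=(), replicas=3):
        self.replicas = replicas    # virtual points per node, for balance
        self.ring = {}              # ring point -> node
        self.sorted_keys = []       # sorted ring points, for bisect lookup
        for node in nodes:
            self.add_node(node)

    def _hash(self, key):
        # crc32 masked to an unsigned 32-bit ring position
        return zlib.crc32(key.encode("utf-8")) & 0xffffffff

    def add_node(self, node):
        for i in range(self.replicas):
            point = self._hash("%s:%d" % (node, i))
            self.ring[point] = node
            bisect.insort(self.sorted_keys, point)

    def get_node(self, key):
        if not self.ring:
            return None
        # first ring point clockwise from the key's hash, wrapping around
        idx = bisect.bisect(self.sorted_keys, self._hash(key)) % len(self.sorted_keys)
        return self.ring[self.sorted_keys[idx]]
```

Because each key maps to the nearest clockwise virtual point, adding or removing one node only remaps the keys that fell on its points, rather than reshuffling everything as `hash(key) % n` would.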
# open macdown from terminal
# from https://github.com/MacDownApp/macdown/wiki/Advanced-Usage
macdown() {
    "$(mdfind kMDItemCFBundleIdentifier=com.uranusjr.macdown | head -n1)/Contents/SharedSupport/bin/macdown" "$@"
}
ping cn.bing.com | while read pong; do echo "$(date): $pong"; done
@liaoyw
liaoyw / latency.markdown
Created October 9, 2016 01:10 — forked from hellerbarde/latency.markdown
Latency numbers every programmer should know

Latency numbers every programmer should know

L1 cache reference ......................... 0.5 ns
Branch mispredict ............................ 5 ns
L2 cache reference ........................... 7 ns
Mutex lock/unlock ........................... 25 ns
Main memory reference ...................... 100 ns             
Compress 1K bytes with Zippy ............. 3,000 ns  =   3 µs
Send 2K bytes over 1 Gbps network ....... 20,000 ns  =  20 µs
SSD random read ........................ 150,000 ns  = 150 µs

Read 1 MB sequentially from memory ..... 250,000 ns = 250 µs
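The forked gist also presents these numbers "humanized": scale everything by a billion so one nanosecond becomes one second. A small sketch of that scaling (the table data is from above; the `humanize` helper is illustrative):

```python
# Latencies in nanoseconds, from the table above.
LATENCIES_NS = {
    "L1 cache reference": 0.5,
    "Branch mispredict": 5,
    "L2 cache reference": 7,
    "Mutex lock/unlock": 25,
    "Main memory reference": 100,
    "Compress 1K bytes with Zippy": 3000,
    "Send 2K bytes over 1 Gbps network": 20000,
    "SSD random read": 150000,
    "Read 1 MB sequentially from memory": 250000,
}

def humanize(ns):
    # scale 1 ns -> 1 s, then pick a readable unit
    secs = float(ns)
    for unit, size in (("days", 86400), ("hours", 3600), ("minutes", 60)):
        if secs >= size:
            return "%.1f %s" % (secs / size, unit)
    return "%.1f seconds" % secs

for name, ns in LATENCIES_NS.items():
    print("%-40s %s" % (name, humanize(ns)))
```

On this scale an L1 hit takes half a second while an SSD random read takes about 1.7 days, which is the intuition the table is trying to build.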

git_branch() {
gb=$(git branch 2>/dev/null | grep '^*' | tr -d '* ')
[[ ! -z $gb ]] && echo "($gb)"
}
export PS1="\\W\$(git_branch) $"