This gist is part of a blog post. Check it out at:
http://jasonrudolph.com/blog/2011/08/09/programming-achievements-how-to-level-up-as-a-developer
# usage: redfin-images "http://www.redfin.com/WA/Seattle/123-Home-Row-12345/home/1234567"
function redfin-images() {
  # Fetch the listing page, pull out the full-size image URLs, and download each one
  wget -O - "$1" | grep "full:" | awk -F \" '{print $4}' | xargs wget
}
#!/usr/local/bin/python

import os
import sys
import csv
from operator import itemgetter
from bs4 import BeautifulSoup
from anki import Collection as aopen

usmle_rx = open("/Users/drlulz/Desktop/test.html", 'r').read()
"""Basic HyperLogLog: | |
This implementation is from the website, | |
https://github.com/Parsely/python-pds, but has been modified to remove | |
the dependency on the "smhasher" module and so that it can be run | |
using the Anaconda Python distribution. | |
To find the modifications, look for inline comments that begin with '# !!! ' | |
""" |
Picking the right architecture = Picking the right battles + Managing trade-offs
FWIW: I (@rondy) am not the creator of the content shared here; it is an excerpt from Edmond Lau's book. I simply copied and pasted it from another location and saved it as a personal note before it gained popularity on news.ycombinator.com. Unfortunately, I cannot recall the exact origin of the copy, nor was I able to find the author's name, so I can't provide the appropriate credits.
package org.apache.spark.ml.feature

import org.apache.spark.ml.linalg.BLAS.axpy
import org.apache.spark.ml.linalg._
import org.apache.spark.rdd.RDD
import org.apache.spark.sql.SparkSession

import scala.util.Random

/**
from keras.layers.core import Permute
from keras.layers import Dense, Activation, RepeatVector, merge, Flatten, TimeDistributed, Input
from keras.layers import Embedding, LSTM
from keras.models import Model

hidden = 225
features = get_features()
outputs = get_outputs()
import torch as th

class NLL_OHEM(th.nn.NLLLoss):
    """Online hard example mining.
    Needs input from nn.LogSoftmax()."""

    def __init__(self, ratio):
        super(NLL_OHEM, self).__init__(None, True)
        self.ratio = ratio
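The class above is truncated before its forward pass. The core of online hard example mining is simply keeping the `ratio` fraction of samples with the highest per-sample loss and backpropagating only through those. A plain-Python sketch of that selection step (the function name and its return convention are hypothetical, purely for illustration; the real class would do this with tensor ops):

```python
def select_hard_examples(losses, ratio):
    """Hypothetical OHEM selection step: return the indices of the
    fraction `ratio` of samples with the highest loss, in index order."""
    k = max(1, int(len(losses) * ratio))  # always keep at least one sample
    # sort indices by loss, largest first, then take the top k
    order = sorted(range(len(losses)), key=lambda i: losses[i], reverse=True)
    return sorted(order[:k])
```

For example, `select_hard_examples([0.1, 0.9, 0.5, 0.7], 0.5)` keeps the two hardest samples, indices 1 and 3; the loss would then be averaged over only those samples.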