
@lifewinning
lifewinning / predictive_policing_bibliography.md
Last active March 17, 2023 21:44
Recommended Reading/Bibliography on Predictive Policing/Sorry and Thanks

OK. So I wrote an article about predictive policing. It had lots of links, and it turns out that sometimes editors see your links and are like ¯\\_(ツ)_/¯ about keeping them when they publish the article. This is probably partly my own fault, because I should have included those links as footnotes, and because working in print vs. working online is Different, etc. etc. The point is I feel really rotten that a lot of really good reporting wasn't given due credit. Putting this list together doesn't totally fix that, but it will hopefully help people interested in the topic.

Also: there is a part at the end of this gist that explains my mixed feelings about the piece in general. The citations are probably more important, but if you want to watch me tableflip a little scroll down.


@kosamari
kosamari / _.R.js
Last active December 7, 2018 11:12
Underscore mixin to enable core R functions
_.mixin({
  sum : function(data){
    return _.reduce(data, function(memo, num){ return memo + num; }, 0);
  },
  mean : function(data){
    return this.sum(data) / data.length;
  },
  median : function(data){
    return this.percentile(data, 50);
  },
  // NOTE: the gist preview is truncated here; this nearest-rank percentile
  // is a fill-in so the snippet runs, not the original implementation.
  percentile : function(data, p){
    var sorted = _.sortBy(data);
    var index = Math.max(Math.ceil(sorted.length * p / 100) - 1, 0);
    return sorted[index];
  }
});
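For comparison, the same R-style summary functions can be sketched in Python. This is a minimal illustration, not the gist's code, and the nearest-rank percentile method is an assumption since the gist's own percentile body isn't shown in the preview:

```python
import math

def r_sum(data):
    # R's sum(): total of a numeric vector
    return sum(data)

def r_mean(data):
    # R's mean(): arithmetic average
    return r_sum(data) / len(data)

def r_percentile(data, p):
    # Nearest-rank percentile on a sorted copy (assumed method)
    ordered = sorted(data)
    index = max(math.ceil(len(ordered) * p / 100) - 1, 0)
    return ordered[index]

def r_median(data):
    # Median as the 50th percentile, matching the mixin above
    return r_percentile(data, 50)
```

As in the mixin, median is defined in terms of percentile so the two can never disagree.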
@mhkeller
mhkeller / gist:3834498
Created October 4, 2012 15:45 — forked from abelsonlive/gist:3751902
google geocoding API, R
Forked from @BrianAbelson
library('rjson') # For parsing json in R
library('RCurl') # For sending an http request
library('plyr') # For ddply
# Takes a data frame with, at minimum, a unique id column and an address column
# it returns a data frame with the uid, lat, lng and specificity of geocoding
# use the uid to join these results back to your data if you want
# you can also accomplish roughly the same thing with a for loop instead of ddply
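The per-row work those comments describe can be sketched in Python: given one uid and one Google Geocoding API JSON response, pull out lat, lng, and specificity so the row can be joined back by uid. The field names follow Google's documented response shape (`results[0].geometry`); the function name and dict layout here are illustrative, not the gist's:

```python
def parse_geocode_response(uid, response):
    """Return a {uid, lat, lng, specificity} dict, or None if no match."""
    # Geocoding responses carry a status string and a list of candidate results
    if response.get("status") != "OK" or not response.get("results"):
        return None
    geometry = response["results"][0]["geometry"]
    return {
        "uid": uid,
        "lat": geometry["location"]["lat"],
        "lng": geometry["location"]["lng"],
        # location_type (e.g. "ROOFTOP") is the API's precision signal,
        # which the R gist calls "specificity"
        "specificity": geometry.get("location_type"),
    }
```

Keeping the uid in every output row is what makes the join-back step in the comments above trivial.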
@abelsonlive
abelsonlive / srapeshell.R
Created September 23, 2012 09:19
# best practices for web scraping in R // ldply
# best practices for web scraping in R #
# function should be used with ldply
# eg:
ldply(urls, scrape)
# add a try to ignore broken links/unresponsive pages
# eg:
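The ldply-plus-try pattern the comments describe (map a scrape function over URLs, skip failures) can be sketched in Python. `scrape` here is any caller-supplied function; the gist's own scrape body is cut off in the preview, so the usage example below uses a hypothetical stub:

```python
def scrape_all(urls, scrape):
    """Apply scrape() to each URL, skipping any that raise,
    mirroring R's ldply(urls, scrape) with each call wrapped in try()."""
    rows = []
    for url in urls:
        try:
            rows.append(scrape(url))
        except Exception:
            continue  # ignore broken links/unresponsive pages
    return rows
```

Usage with a stand-in scrape function (purely illustrative):

```python
def fake_scrape(url):
    if "broken" in url:
        raise ValueError("unresponsive page")
    return {"url": url, "status": "ok"}

rows = scrape_all(
    ["http://a.example", "http://broken.example", "http://b.example"],
    fake_scrape,
)
# the broken URL is silently dropped; two rows come back
```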