Alex Bresler (abresler)

## Warren Buffett's $1B Basketball Challenge ##
expected_value <- function(p, ngames = 63, prize = 1000000000){
  # Probability of calling all ngames games correctly, times the prize
  p^ngames * prize
}
## What is the expected value of an entry
## given a particular level of prediction accuracy
expected_value(p=0.80)
expected_value(p=0.85)
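As a quick extension (not in the original gist), the same formula can be inverted to ask how accurate a per-game prediction would need to be for an entry to be worth a given dollar amount:
## Per-game accuracy at which an entry is worth `value` dollars,
## from p^ngames * prize = value  =>  p = (value / prize)^(1 / ngames)
required_accuracy <- function(value, ngames = 63, prize = 1000000000){
  (value / prize)^(1 / ngames)
}
required_accuracy(value = 10)  # roughly 0.75 -- an entry is worth $10 only at ~75% per-game accuracy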
Tables Are Like Cockroaches
========================================================
As much as I would like to completely replace all tables with beautiful, intuitive, and interactive charts, tables, like cockroaches, cannot be eliminated. Based on this very [interesting discussion on the Perceptual Edge forum](http://sfew.websitetoolbox.com/post/Old-or-oldest-data-table-5702966) and its source, [Exploring the Origins of Tables for Information Visualization](http://csis.pace.edu/~marchese/Papers/IV11/Marchese_IV%2711.pdf)
@abresler
abresler / twitter_search.R
Created December 9, 2012 02:36 — forked from dsparks/twitter_search.R
Twitter search
doInstall <- TRUE
toInstall <- c("twitteR", "lubridate")
if(doInstall){install.packages(toInstall, repos = "http://cran.us.r-project.org")}
lapply(toInstall, library, character.only = TRUE)
searchTerms <- c("New York", "Los Angeles", "Chicago", "Houston", "Philadelphia",
"Phoenix", "San Antonio", "San Diego", "Dallas", "San Jose",
"Jacksonville", "Indianapolis", "Austin", "San Francisco",
"Columbus", "Fort Worth", "Charlotte", "Detroit", "El Paso",
"Memphis")
@abresler
abresler / Background_lines.R
Created October 27, 2012 20:23 — forked from dsparks/Background_lines.R
Faceted plots with reference geometries
# Better alternatives to 3-D
# Here are some examples of what we don't want to do:
# http://stackoverflow.com/questions/12765497/create-a-ribbon-plot
doInstall <- TRUE # Change to FALSE if you don't want packages installed.
toInstall <- c("ggplot2")
if(doInstall){install.packages(toInstall, repos = "http://cran.r-project.org")}
lapply(toInstall, library, character.only = TRUE)
# Air passenger data. ts converted to long matrix:
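The rest of the fork isn't shown; a rough sketch of one way it could continue, converting the built-in AirPassengers series to long format and drawing every year in grey behind each facet as a reference geometry (the variable names are mine, not from the original):
AP <- data.frame(passengers = as.numeric(AirPassengers),
                 year       = floor(time(AirPassengers)),
                 month      = factor(cycle(AirPassengers), labels = month.abb))
background <- AP
names(background)[names(background) == "year"] <- "year_bg"  # layers lacking the faceting variable repeat in every panel
ggplot(AP, aes(x = month, y = passengers, group = year)) +
  geom_line(data = background, aes(group = year_bg), colour = "grey80") +
  geom_line(colour = "darkblue") +
  facet_wrap(~ year)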
@abresler
abresler / price_and_volume.R
Created October 27, 2012 20:10 — forked from dsparks/price_and_volume.R
Scraping and plotting InTrade data
# Scraping and plotting InTrade data
doInstall <- TRUE # Change to FALSE if you don't want packages installed.
toInstall <- c("ggplot2", "lubridate")
if(doInstall){install.packages(toInstall, repos = "http://cran.r-project.org")}
lapply(toInstall, library, character.only = TRUE)
# Find the "contractId" of the contract in which you're interested
# Let's try Barack Obama to win the third Presidential debate:
# http://www.intrade.com/v4/markets/contract/?contractId=766623
contractID <- 766623 # second debate is 766621
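The download and plotting steps aren't shown, and InTrade itself is long gone, so the data URL below is only a placeholder; the rest is a minimal sketch assuming a CSV feed with Date and Close columns:
## Placeholder endpoint -- substitute the real per-contract price feed
priceURL <- paste0("http://data.intrade.com/example/closingPrices.csv?contractId=", contractID)
prices <- read.csv(priceURL, stringsAsFactors = FALSE)
prices$Date <- mdy(prices$Date)  # lubridate parser; adjust to the feed's actual date format
ggplot(prices, aes(x = Date, y = Close)) + geom_line()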
## Download websites
library(RCurl)  # provides getURL()
addresses <- read.csv("~/links.full.csv", stringsAsFactors = FALSE)
fed.text <- sapply(addresses[[1]], getURL)  # keep every page, not just the last one
fed.df <- data.frame(text = fed.text, stringsAsFactors = FALSE)
@abresler
abresler / gist:3249788
Created August 3, 2012 17:27 — forked from christophergandrud/gist:1287427
Google Motion Chart of World Bank External Debt Data
# Create Google Motion Chart from World Bank Finance Data
# Inspired by http://lamages.blogspot.com/2011/09/accessing-and-plotting-world-bank-data.html
# Written by Christopher Gandrud
# 15 October 2011
library(WDI)
library(gregmisc)
library(googleVis)
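The fork is cut off after the library calls; a minimal sketch of the remaining steps, using WDI() to download an external-debt indicator and gvisMotionChart() to draw it (the indicator code is my assumption, not taken from the original gist):
## External debt stocks, total (DOD, current US$), all countries
debt <- WDI(country = "all", indicator = "DT.DOD.DECT.CD", start = 1990, end = 2010)
## Motion charts need an id variable and a time variable
debtChart <- gvisMotionChart(debt, idvar = "country", timevar = "year")
plot(debtChart)  # opens the chart in a browser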
@abresler
abresler / gist:3249776
Created August 3, 2012 17:26 — forked from christophergandrud/gist:1284498
Simple Web Crawler for Text
library(foreign)  # not actually used below; RCurl does the downloading
library(RCurl)
addresses <- read.csv("~/links.csv", stringsAsFactors = FALSE)  # a .csv file with all of the links you want to crawl
full.text <- sapply(addresses[[1]], getURL)  # download every page, not just the last one
text.sub <- gsub("<.+?>", "", full.text)  # removes HTML tags
text <- data.frame(text = text.sub, stringsAsFactors = FALSE)