Sharon Machlis (smach)

@smach
smach / election_night_live_model.R
Created Oct 31, 2020 — forked from elliottmorris/election_night_live_model.R
A live election-night prediction model using The Economist's pre-election forecast
View election_night_live_model.R
#' Description
#' This file runs a live election-night forecast based on The Economist's pre-election forecasting model
#' available at projects.economist.com/us-2020-forecast/president.
#' It is a resampling model based on https://pkremp.github.io/update_prob.html.
#' This script does not input any real election results! You will have to enter your picks/constraints manually (scroll to the bottom of the script).
#'
#' Licence
#' This software is published by *[The Economist](https://www.economist.com)* under the [MIT licence](https://opensource.org/licenses/MIT). The data generated by *The Economist* are available under the [Creative Commons Attribution 4.0 International License](https://creativecommons.org/licenses/by/4.0/).
#' The licences include only the data and the software authored by *The Economist*, and do not cover any *Economist* content or third-party data or content made available using the software. More information about licensing, syndication and the copyright of *Economist* content can be fou
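The resampling idea behind the pkremp approach can be sketched in a few lines: keep only the pre-election simulations consistent with results seen so far, then recompute win probabilities from the survivors. This is a toy illustration, not the Economist model — the states, vote shares, and correlation structure below are all made up:

```r
# Toy resampling update. Each row of `sims` is one simulated election;
# columns are simulated Democratic vote shares, correlated via a shared
# national swing so that conditioning on one state shifts the others.
set.seed(2020)
national <- rnorm(5000, 0, 0.02)
sims <- cbind(PA = 0.51 + national + rnorm(5000, 0, 0.01),
              FL = 0.49 + national + rnorm(5000, 0, 0.01),
              WI = 0.52 + national + rnorm(5000, 0, 0.01))

# Suppose FL is called for the Republican: discard inconsistent simulations.
kept <- sims[sims[, "FL"] < 0.5, , drop = FALSE]

# Updated probability the Democrat carries PA, conditional on the FL call:
p_pa <- mean(kept[, "PA"] > 0.5)
p_pa
```

Because the shared `national` term correlates the states, the conditional PA probability is lower than the unconditional one — the same mechanism the live model uses to propagate actual calls.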
View dual-color-reactable.R
library(reactable)
library(dplyr)

red_pal <- function(x) rgb(colorRamp(c("#FFCDD2FF", "#C62828FF"))(x), maxColorValue = 255)
blue_pal <- function(x) rgb(colorRamp(c("#BBDEFBFF", "#1565C0FF"))(x), maxColorValue = 255)

mtcars %>%
  select(cyl, mpg) %>%
  reactable(
    pagination = FALSE,
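The excerpt cuts off at `pagination = FALSE`. A plausible continuation — a guess at the pattern, not the gist's actual code — maps each column's rescaled values through one of the two palettes via `colDef(style = ...)`:

```r
library(reactable)
library(dplyr)

red_pal  <- function(x) rgb(colorRamp(c("#FFCDD2FF", "#C62828FF"))(x), maxColorValue = 255)
blue_pal <- function(x) rgb(colorRamp(c("#BBDEFBFF", "#1565C0FF"))(x), maxColorValue = 255)

tbl <- mtcars %>%
  select(cyl, mpg) %>%
  reactable(
    pagination = FALSE,
    columns = list(
      # Rescale each cell to [0, 1] before feeding it to the palette function.
      mpg = colDef(style = function(value) {
        scaled <- (value - min(mtcars$mpg)) / (max(mtcars$mpg) - min(mtcars$mpg))
        list(background = blue_pal(scaled))
      }),
      cyl = colDef(style = function(value) {
        scaled <- (value - min(mtcars$cyl)) / (max(mtcars$cyl) - min(mtcars$cyl))
        list(background = red_pal(scaled))
      })
    )
  )
tbl
```

`reactable()` returns an htmlwidget, so the table renders when `tbl` is printed in RStudio or an R Markdown document.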
@smach
smach / geocode_courthouses.R
Last active Dec 24, 2019
Code to geocode courthouse list
View geocode_courthouses.R
# 1. Scrape tables. You can do that with the Table Capture Chrome extension, or you can do it with R.
# Download the entire html document so I don't need to keep hammering the Wikipedia server
library(htmltab)
library(rvest)
library(purrr)
library(dplyr)
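With the page saved locally, the table-scraping step itself is short. A self-contained sketch — the inline HTML below stands in for the downloaded Wikipedia page, and the real script would call `read_html()` on the saved file instead:

```r
library(rvest)
library(dplyr)

# Stand-in for the saved Wikipedia page (two real Boston courthouses as sample rows).
page <- minimal_html('
  <table class="wikitable">
    <tr><th>Courthouse</th><th>City</th><th>State</th></tr>
    <tr><td>John Adams Courthouse</td><td>Boston</td><td>MA</td></tr>
    <tr><td>John Joseph Moakley Courthouse</td><td>Boston</td><td>MA</td></tr>
  </table>')

# Grab the first wikitable and convert it to a data frame.
courthouses <- page %>%
  html_element("table.wikitable") %>%
  html_table()
courthouses
```

The resulting data frame is then ready to hand to a geocoding function city by city.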
@smach
smach / TidyTuesdayAltChallenge.R
Last active Dec 11, 2019
Reshaping some very messy data
View TidyTuesdayAltChallenge.R
library(dplyr)
library(tidyr)
library(janitor)
library(stringr)
starwars_garbage_data1 <- data.frame(
  stringsAsFactors = FALSE,
  v1 = c(
    "Character Name", "C-3PO", "Person-film ID", "2218529825", "7731900678",
    "123598423", "238952395", "6232048034", "3036308047",
@smach
smach / ImportZipCodes.R
Created Jun 20, 2019
See how readr imports Zip Codes compared with base R
View ImportZipCodes.R
price_median1 <- readr::read_csv("https://raw.githubusercontent.com/smach/BU-dataviz-workshop-2019/master/data/zillow_data_median_sf_price.csv")
str(price_median1)
price_median2 <- read.csv("https://raw.githubusercontent.com/smach/BU-dataviz-workshop-2019/master/data/zillow_data_median_sf_price.csv")
str(price_median2)
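The difference matters most for Zip Codes with leading zeros, which a numeric parse silently drops. A self-contained illustration — the temp file stands in for the Zillow CSV — showing the loss and the `col_types` fix:

```r
# Tiny stand-in for the Zillow file: one New England zip with a leading zero.
tf <- tempfile(fileext = ".csv")
writeLines(c("zip,price", "02134,500000", "90210,900000"), tf)

# Base R guesses a numeric column, so the leading zero is lost:
base_df <- read.csv(tf)
base_df$zip[1]      # 2134

# Forcing the column to character in readr preserves it:
readr_df <- readr::read_csv(tf, col_types = readr::cols(zip = readr::col_character()))
readr_df$zip[1]     # "02134"
```

The same fix works in base R via `colClasses = c(zip = "character")`.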
View app.R
if (!require(pacman)) {
  install.packages("pacman")
  library(pacman)  # require() above failed, so attach pacman after installing it
}
p_load(ggplot2, dplyr, janitor)

district <- c("A", "B", "C", "A", "B", "C")
money <- c(500, 324, 245, 654, 234, 232)
year <- c("2001", "2001", "2001", "2002", "2002", "2002")
df <- data.frame(district, money, year, stringsAsFactors = FALSE)

total_by_year <- df %>%
  group_by(year) %>%
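The pipeline is cut off after `group_by(year)`; a likely ending — an assumption, not the gist's code — sums spending per year:

```r
library(dplyr)

district <- c("A", "B", "C", "A", "B", "C")
money <- c(500, 324, 245, 654, 234, 232)
year <- c("2001", "2001", "2001", "2002", "2002", "2002")
df <- data.frame(district, money, year, stringsAsFactors = FALSE)

# Total money across districts for each year.
total_by_year <- df %>%
  group_by(year) %>%
  summarize(total = sum(money))
total_by_year
# 2001: 1069, 2002: 1120
```

That per-year summary is the natural input for a ggplot2 bar chart, which the `p_load(ggplot2, ...)` line suggests comes next.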
View import_fixedwidth_file.R
# Survey data file downloaded manually from the CDC https://www.cdc.gov/brfss/annual_data/annual_2017.html
surveydatafile <- "LLCP2017.ASC"
# I used datapasta package's df_paste() function to create the data frame below,
# after copying the metadata table into my clipboard. Metadata is here:
# https://www.cdc.gov/brfss/annual_data/2017/llcp_varlayout_17_onecolumn.html
column_meta_data <- data.frame(
  stringsAsFactors = FALSE,
  Starting.Column = c(1L, 17L, 19L, 19L, 21L, 23L, 32L, 36L, 36L, 63L, 64L,
                      65L, 66L, 67L, 68L, 69L, 71L, 73L, 75L, 76L, 77L, 78L,
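Once the starting columns are in a data frame, each field's width is just the gap to the next field's start, and readr's `read_fwf()` with `fwf_widths()` does the rest. A self-contained sketch — the field names, positions, and data below are made up, not the actual BRFSS layout:

```r
library(readr)

# Made-up metadata standing in for the CDC variable layout.
meta <- data.frame(
  start = c(1L, 3L, 6L),
  name  = c("state", "month", "year"),
  stringsAsFactors = FALSE
)
record_length <- 10L
# Width of each field = distance to the next field's starting column.
widths <- diff(c(meta$start, record_length + 1L))  # c(2, 3, 5)

# Tiny stand-in for LLCP2017.ASC.
tf <- tempfile(fileext = ".asc")
writeLines(c("25011 2017", "36022 2017"), tf)

survey <- read_fwf(tf, col_positions = fwf_widths(widths, col_names = meta$name))
survey
```

For the real file, the same `widths` computation runs over the full `column_meta_data$Starting.Column` vector plus the record length from the CDC layout page.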
View pelias_test.geojson
@smach
smach / GitHub-Forking.md
Created Dec 11, 2017 — forked from Chaser324/GitHub-Forking.md
GitHub Standard Fork & Pull Request Workflow
View GitHub-Forking.md

Whether you're trying to give back to the open source community or collaborating on your own projects, knowing how to properly fork and generate pull requests is essential. Unfortunately, it's quite easy to make mistakes or not know what you should do when you're initially learning the process. I know that I certainly had considerable initial trouble with it, and I found a lot of the information on GitHub and around the internet to be rather piecemeal and incomplete - part of the process described here, another there, common hangups in a different place, and so on.

In an attempt to collate this information for myself and others, this short tutorial covers what I've found to be fairly standard procedure for creating a fork, doing your work, issuing a pull request, and merging that pull request back into the original project.

Creating a Fork

Just head over to the GitHub page and click the "Fork" button. It's just that simple. Once you've done that, you can use your favorite git client to clone your repo or j

@smach
smach / google_calendar_demo.R
Last active Oct 2, 2017 — forked from MarkEdmondson1234/google_calendar_demo.R
A demo of calling Google Calendar API
View google_calendar_demo.R
library(googleAuthR)

## set scopes for calendar
options(googleAuthR.scopes.selected = "https://www.googleapis.com/auth/calendar.readonly",
        googleAuthR.client_id = "XXXX",      ## your Google project client ID - find it at https://console.developers.google.com/apis/credentials, then click the appropriate OAuth 2.0 client ID
        googleAuthR.client_secret = "XXXX")  ## your Google project client secret - same place as above

## make sure the Calendar API is activated for your Google project at:
## https://console.cloud.google.com/apis/api/calendar-json.googleapis.com/overview
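After the options are set, the usual next steps with googleAuthR are to authenticate and build a request function with `gar_api_generator()`. A sketch assuming the Calendar v3 events endpoint for the primary calendar — actually calling the generated function requires real OAuth credentials, so that part is left commented out:

```r
library(googleAuthR)

## Build (but don't yet call) a request for the primary calendar's events.
events <- gar_api_generator(
  "https://www.googleapis.com/calendar/v3/calendars/primary/events",
  "GET",
  data_parse_function = function(x) x$items
)

## With a valid client ID/secret configured, you would then run:
## gar_auth()
## my_events <- events()
```

`gar_api_generator()` only constructs the request function; no network call or authentication happens until it is invoked.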