@johnburnmurdoch
johnburnmurdoch / index.html
Created October 24, 2018 13:24
d3: translating text and rotating in-situ
<!DOCTYPE html>
<head>
<meta charset="utf-8">
<script src="https://unpkg.com/d3@5"></script>
<script src="https://unpkg.com/d3-selection-multi"></script>
<style>
* {
margin: 0;
padding: 0;
border: 0;
# Data is the UN's Medium-variant population projections, available at https://population.un.org/wpp/
data %>%
filter(Sex != "Both" & A3 %in% c("GBR", "RUS", "IND", "CHN", "RWA", "GRC") & Year %in% 2018:2060) %>%
as_tibble() %>%
mutate(
group = paste0(Year, Sex), AgeGrp = as.numeric(AgeGrp),
Location = Location %>% gsub("n Federation","",.) # turn "Russian Federation" into "Russia"
) %>%
ggplot(aes(AgeGrp, Value, col=Sex, group=group)) +
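# (The gist preview cuts off mid-pipeline here. Purely as an illustrative sketch, and not
# necessarily how the original chart was finished, a pipeline like this is commonly closed
# off with a line layer, one facet per country, and axis labels:)
geom_line() +
facet_wrap(~Location) +
labs(x = "Age group", y = "Projected population (thousands)", col = NULL)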
@johnburnmurdoch
johnburnmurdoch / index.html
Last active February 4, 2019 14:48
Animating 50,000 points with WebGL
<!doctype html>
<html class="no-js" lang="">
<head>
<meta charset="utf-8">
<title>Animating 50,000 points with WebGL</title>
<meta name="description" content="Animating 50,000 points with WebGL">
<meta name="viewport" content="width=device-width, initial-scale=1">
<script src="https://cdnjs.cloudflare.com/ajax/libs/pixi.js/4.0.0/pixi.js"></script>
</head>
<body>
#### All thanks to the wonderful Tyler Morgan-Wall for creating the fantastic Rayshader package ####
devtools::install_github("tylermorganwall/rayshader")
install.packages("needs")
library(needs) # load needs so that needs() is available in a fresh session
needs(tidyverse, rayshader, magrittr, raster)
# Cape Town DEM downloaded from the City of Cape Town Open Data Portal: http://web1.capetown.gov.za/web1/OpenDataPortal/DatasetDetail?DatasetName=Digital%20elevation%20model
unzip("~/Downloads/10m_Grid_GeoTiff.zip")
CT <- raster("10m_Grid_GeoTiff/10m_BA.tif")
# plot the raster to make sure everything looks okay, and to get an idea of what area it covers
plot(CT)
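# (The preview stops after the sanity-check plot. As a minimal sketch of where a rayshader
# workflow typically goes next, and assuming a recent version of the package
# (raster_to_matrix() may postdate this gist), the DEM can be turned into a plain matrix and
# rendered in 3D roughly like this; it is not necessarily what the original script did:)
elmat <- raster_to_matrix(CT)                        # elevation raster -> matrix of heights
elmat %>%
  sphere_shade(texture = "imhof1") %>%               # base colouring from surface orientation
  add_shadow(ray_shade(elmat, zscale = 10), 0.5) %>% # layer on raytraced shadows
  plot_3d(elmat, zscale = 10, theta = 135, phi = 45, zoom = 0.75, fov = 0)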
value name year
NA 3M 2000
NA 3M 2001
NA 3M 2002
NA 3M 2003
NA 3M 2004
NA 3M 2005
NA 3M 2006
NA 3M 2007
NA 3M 2008
@johnburnmurdoch
johnburnmurdoch / currency.csv
Last active March 28, 2019 10:18
currency.csv
value time
1.34944 2017-12-31 23:00:00
1.35889 2018-01-01 23:00:00
1.35142 2018-01-02 23:00:00
1.35504 2018-01-03 23:00:00
1.3564 2018-01-04 23:00:00
1.3564 2018-01-05 23:00:00
1.3564 2018-01-06 23:00:00
1.35662 2018-01-07 23:00:00
1.35395 2018-01-08 23:00:00
LANDSAT 8
1 39084U 13008A 19096.49276745 .00000042 00000-0 19423-4 0 9994
2 39084 98.1930 167.4492 0001375 87.8678 272.2685 14.57117477326927
name,value,year,lastValue
3M,NA,2000,NA
3M,NA,2000.1,NA
3M,NA,2000.2,NA
3M,NA,2000.3,NA
3M,NA,2000.4,NA
3M,NA,2000.5,NA
3M,NA,2000.6,NA
3M,NA,2000.7,NA
3M,NA,2000.8,NA
# Load the packages we’re going to be using:
# Alongside the usual stuff like tidyverse and magrittr, we’ll be using rvest for some web-scraping, jsonlite to parse some JSON, and extrafont to load some nice custom fonts
needs(tidyverse, magrittr, rvest, jsonlite, extrafont)
# Before we go on, two things to note:
# First, on web scraping:
# You should always check the terms of the site you are extracting data from, to make sure scraping (often referred to as `crawling`) is not prohibited. One way to do this is to visit the website’s `robots.txt` page and check that a) nothing explicitly states that crawlers are not permitted, and b) ideally, all user agents are allowed (a `User-agent: *` block with no blanket `Disallow` rules). Both of those are the case for our use case today (see https://www.ultimatetennisstatistics.com/robots.txt, and the short sketch at the end of this snippet).
# And second, about those custom fonts:
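# (A quick sketch, not part of the original gist, of the robots.txt check described above,
# using the URL already referenced in the comment:)
readLines("https://www.ultimatetennisstatistics.com/robots.txt")
# ...and once that looks fine, a page can be pulled down with rvest, e.g.
page <- read_html("https://www.ultimatetennisstatistics.com/")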